Having 835 parsing issue

Sep 12, 2012 at 11:08 PM

Hi, this is a great library.  I've used it to parse over 30,000 835 files, but then I started getting files like the one below.  The parser does not throw any exceptions, but the resulting XML contains only an Interchange node with a single complete ISA node.  The file consists of 80-character rows, each ending with a line feed.  Do you have any clue what is happening?  I would like to continue using this library because of its simplicity and incredible speed.

Thank you very much!

835 Input File:

ISA*00*          *00*          *ZZ*ANTHEM         *ZZ*FBKBLUECROSS   *111101*180
99999~PER*CX**TE*9999999999~N1*PE*XXXXXXX XX XXXXXX XXXXXX*XX*9999999999~N3*999
X 9XX XX~N4*XXXXXXXXX*KY*99999~REF*TJ*999999999~LX*1~CLP*999999999*1*9999.99*999

835 XML:

<?xml version="1.0"?>
<Interchange segment-terminator=":" element-separator="*" sub-element-separator="*">
    <!--Author Information Qualifier-->
    <ISA01>00<!--No Authorization Information Present--></ISA01>
    <!--Author Information-->
    <ISA02>          </ISA02>
    <!--Security Information Qualifer-->
    <ISA03>00<!--No Security Information Present--></ISA03>
    <!--Security Information-->
    <ISA04>          </ISA04>
    <!--Interchange ID Qualifier-->
    <ISA05>ZZ<!--Mutually Defined--></ISA05>
    <!--Interchange Sender ID-->
    <ISA06>ANTHEM         </ISA06>
    <!--Interchange ID Qualifier-->
    <ISA07>ZZ<!--Mutually Defined--></ISA07>
    <!--Interchange Receiver ID-->
    <!--Interchange Date-->
    <!--Interchange Time-->
    <!--Inter Control Standards Identifier-->
    <ISA11>U<!--U.S. EDI Community of ASC X12, TDCC, and UCS--></ISA11>
    <!--Inter Control Version Number-->
    <!--Inter Control Number-->
    <!--Acknowlegment Requested-->
    <ISA14>0<!--No Acknowledgment Requested--></ISA14>
    <!--Usage Indicator-->
    <ISA15>P<!--Production Data--></ISA15>
    <ISA16 />

Sep 12, 2012 at 11:52 PM

From the XML it looks like the parser is treating : as the segment terminator, while the file is actually using ~ as the segment terminator.  There appears to be one extra character in your ISA segment, which is pushing the : into the wrong position.

Copy and paste this ISA next to the ISA segment from one of your other files so you can see them on top of each other; you should be able to spot the offending element.  The ISA segment is special in that it is expected to be fixed width, so all the delimiter characters must be in absolute positions within that segment.
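The fixed-width rule above can be sketched in a few lines. This is a hypothetical illustration, not the library's actual code: the element separator sits right after the ISA tag, and the component separator and segment terminator sit at fixed offsets 104 and 105 of the 106-character ISA. One extra character (such as a stray line feed) shifts them, which is exactly how a : can end up being read as the terminator.

```python
# Hypothetical sketch, not OopFactory X12's actual implementation:
# the X12 delimiters are read from fixed positions in the ISA segment.
def detect_delimiters(x12: str) -> dict:
    """Infer delimiters from the 106-character fixed-width ISA segment."""
    if not x12.startswith("ISA"):
        raise ValueError("stream must begin with an ISA segment")
    return {
        "element-separator": x12[3],        # first char after the "ISA" tag
        "sub-element-separator": x12[104],  # ISA16, the component separator
        "segment-terminator": x12[105],     # character that closes the ISA
    }

# A well-formed ISA (values are made up): the ~ lands at index 105.
clean = ("ISA*00*          *00*          *ZZ*ANTHEM         "
         "*ZZ*FBKBLUECROSS   *111101*1809*U*00501*000099999*0*P*:~")

# One stray line feed after column 80 shifts everything by one, so the
# sub-element separator is read as '*' and the terminator as ':' --
# matching the attributes in the XML output quoted earlier in the thread.
corrupted = clean[:80] + "\n" + clean[80:]
```

With this sketch, `detect_delimiters(clean)` reports ~ as the terminator, while `detect_delimiters(corrupted)` reports :, reproducing the symptom.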

Sep 13, 2012 at 12:32 AM
Edited Sep 13, 2012 at 12:37 AM

Maybe it is the line feed character.  When I take the line feeds out, the ~ is in column 106.  I was already thinking about adding some code to detect this format and remove the line feeds before sending the data to the X12Parser code.  I hesitated because EDIFileEditor did not seem to complain about it at all.  I'll manually remove the line feeds and re-run the file to see what happens. EDIT: Removing the line feeds resulted in successful parsing!
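A minimal sketch of that preprocessing step (the function name is illustrative, not part of the library): strip CR/LF padding before handing the stream to the parser. This assumes the file really uses ~ as its segment terminator; a trading partner that legitimately terminates segments with a newline would be corrupted by this, so it should only run on files known to use this 80-column wrapped format.

```python
def strip_line_breaks(raw: str) -> str:
    """Remove CR/LF wrapping from an X12 stream whose real segment
    terminator is '~'.  Do NOT use on files that terminate segments
    with a newline instead of '~'."""
    return raw.replace("\r", "").replace("\n", "")
```

The cleaned string (or a stream over it) can then be fed to the parser as usual.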

Thanks for the quick reply.

P.S. I have another file that has TS3 segments without a preceding LX segment.  The code sees this as an error. I was wondering if you knew whether TS3/TS2 segments are really supposed to be preceded by an LX segment?  Your "Ansi-835-5010Specification.xml" doc seems to show this, but the "835 Healthcare Claim Payment - Advice 5010" guide I have is confusing to me, since I am pretty new to 835 and X12 in general. I have only seen examples in my guide where TS3 segments are shown preceded by an LX segment.

Sep 13, 2012 at 2:30 AM

The 5010 guide that I used to do the 835 spec expected TS2 and TS3 as optional segments in the LX loop (LoopId=2000).

When a segment appears within a loop definition, it is expected to appear after the starting segment of that loop.

Double-check your X12 version in the GS08 element to see if it is some other version.  Make sure that is the version in your specification document.
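Checking GS08 can be scripted; here is a small sketch (written in Python rather than the library's C#, and assuming '*' element separators and '~' segment terminators):

```python
def get_gs08(x12: str) -> str:
    """Return the version/release code (GS08) from the first GS segment.
    Assumes '*' element separators and '~' segment terminators."""
    for segment in x12.split("~"):
        elements = segment.strip().split("*")
        if elements and elements[0] == "GS":
            return elements[8]  # GS08 is the ninth element of the GS segment
    raise ValueError("no GS segment found")
```

For a 5010 835 you would expect this to return something like 005010X221A1.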

You may need to create your own SpecificationFinder if you need a special configuration that isn't embedded in the assembly.

Sometimes it is necessary to add some segments to a custom specification if your trading partner is not responsive in fixing invalid x12 or is choosing to use a specification that diverges from the standard.

Sep 13, 2012 at 2:52 AM

The version is 005010X221A1, so this is why I suspected the file might be incorrect.  In the embedded specification, the usage for both TS3 and TS2 is "Required".  Should these be set to "Situational"?

Current spec for the version of the code I have is:

    <StartingSegment SegmentId="LX" Usage="Situational" Repeat="1"/>
    <Segment SegmentId="TS3" Usage="Required" Repeat="1"/>
    <Segment SegmentId="TS2" Usage="Required" Repeat="1"/>

Thanks for being so responsive!  Your code rocks! I use this in a multi-threaded service and so far it can parse a couple of hundred files in way less than a minute.

Sep 13, 2012 at 1:20 PM

I have made the change to Situational for both the 4010 and 5010 specs.  I don't think this will make a difference for you, though, if you don't have an LX segment.

If you want to do a test, you can compile a local change where you add TS2 and TS3 at the TransactionSpecification level to get your files to parse.  I looked around on the internet, and I didn't see any companion guides suggesting that TS2 and TS3 could be used outside the LX loop.

Sep 13, 2012 at 4:52 PM

Neither changing the usage nor adding TS2 and TS3 at the TransactionSpecification level really did anything.  Adding them at the TransactionSpecification level just moved the issue from the TS3 segment to the CLP segment next in line, so the file is simply not constructed properly.  I created a version of the original file with some fabricated LX segments ahead of the TS3 segments, and it parsed perfectly.
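That workaround of fabricating LX segments can be sketched like this (a hypothetical helper, assuming '*' and '~' delimiters; the LX counter values are synthetic and exist only to satisfy the 2000 loop structure):

```python
def insert_missing_lx(x12: str) -> str:
    """Insert a synthetic LX segment before any TS3 that is not
    already preceded by an LX, so a malformed 835 matches the
    expected LX (2000) loop structure."""
    out, lx_counter, prev_id = [], 0, ""
    for seg in x12.split("~"):
        seg_id = seg.strip().split("*")[0]
        if seg_id == "TS3" and prev_id != "LX":
            lx_counter += 1
            out.append(f"LX*{lx_counter}")  # fabricated loop header
        out.append(seg)
        prev_id = seg_id
    return "~".join(out)
```

Files that already have LX segments pass through unchanged, so the same preprocessing can run on every file.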

I appreciate you taking the time to help me out, and keep up the great work!  I think this is an excellent library, and now I need to dig in to get an understanding of exactly how it works.  The XML output is very flexible, but at the moment I am using SSIS to just load it into 30+ database tables.  Querying the data is not too bad, unless you only have mediocre SQL skills :-)