# Copyright 2004-2005 Elemental Security, Inc. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.

# Pgen imports
from . import grammar, token, tokenize


class PgenGrammar(grammar.Grammar):
    pass


class ParserGenerator(object):

    def __init__(self, filename, stream=None):
        close_stream = None
        if stream is None:
            stream = open(filename)
            close_stream = stream.close
        self.filename = filename
        self.stream = stream
        self.generator = tokenize.generate_tokens(stream.readline)
        self.gettoken()  # Initialize lookahead
        self.dfas, self.startsymbol = self.parse()
        if close_stream is not None:
            close_stream()
        self.first = {}  # map from symbol name to set of tokens
        self.addfirstsets()

    def make_grammar(self):
        c = PgenGrammar()
        names = self.dfas.keys()
        names.sort()
        names.remove(self.startsymbol)
        names.insert(0, self.startsymbol)
        for name in names:
            i = 256 + len(c.symbol2number)
            c.symbol2number[name] = i
            c.number2symbol[i] = name
        for name in names:
            dfa = self.dfas[name]
            states = []
            for state in dfa:
                arcs = []
                for label, next in sorted(state.arcs.iteritems()):
                    arcs.append((self.make_label(c, label), dfa.index(next)))
                if state.isfinal:
                    arcs.append((0, dfa.index(state)))
                states.append(arcs)
            c.states.append(states)
            c.dfas[c.symbol2number[name]] = (states, self.make_first(c, name))
        c.start = c.symbol2number[self.startsymbol]
        return c

    def make_first(self, c, name):
        rawfirst = self.first[name]
        first = {}
        for label in sorted(rawfirst):
            ilabel = self.make_label(c, label)
            ##assert ilabel not in first # XXX failed on <> ... !=
            first[ilabel] = 1
        return first

    def make_label(self, c, label):
        # XXX Maybe this should be a method on a subclass of converter?
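        # Added summary (not in the original source): a grammar label is
        # turned into an ilabel (an index into c.labels) in one of four
        # ways, depending on its spelling:
        #   - bare name that is a nonterminal        -> c.symbol2label
        #   - bare name that is a token type (NAME)  -> c.tokens
        #   - quoted string starting with a letter   -> c.keywords (e.g. 'if')
        #   - quoted operator string                 -> c.tokens via grammar.opmap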
        ilabel = len(c.labels)
        if label[0].isalpha():
            # Either a symbol name or a named token
            if label in c.symbol2number:
                # A symbol name (a non-terminal)
                if label in c.symbol2label:
                    return c.symbol2label[label]
                else:
                    c.labels.append((c.symbol2number[label], None))
                    c.symbol2label[label] = ilabel
                    return ilabel
            else:
                # A named token (NAME, NUMBER, STRING)
                itoken = getattr(token, label, None)
                assert isinstance(itoken, (int, long)), label
                assert itoken in token.tok_name, label
                if itoken in c.tokens:
                    return c.tokens[itoken]
                else:
                    c.labels.append((itoken, None))
                    c.tokens[itoken] = ilabel
                    return ilabel
        else:
            # Either a keyword or an operator
            assert label[0] in ('"', "'"), label
            value = eval(label)
            if value[0].isalpha():
                # A keyword
                if value in c.keywords:
                    return c.keywords[value]
                else:
                    c.labels.append((token.NAME, value))
                    c.keywords[value] = ilabel
                    return ilabel
            else:
                # An operator (any non-numeric token)
                itoken = grammar.opmap[value]  # Fails if unknown token
                if itoken in c.tokens:
                    return c.tokens[itoken]
                else:
                    c.labels.append((itoken, None))
                    c.tokens[itoken] = ilabel
                    return ilabel

    def addfirstsets(self):
        names = self.dfas.keys()
        names.sort()
        for name in names:
            if name not in self.first:
                self.calcfirst(name)
            #print name, self.first[name].keys()

    def calcfirst(self, name):
        dfa = self.dfas[name]
        self.first[name] = None  # dummy to detect left recursion
        state = dfa[0]
        totalset = {}
        overlapcheck = {}
        for label, next in state.arcs.iteritems():
            if label in self.dfas:
                if label in self.first:
                    fset = self.first[label]
                    if fset is None:
                        raise ValueError("recursion for rule %r" % name)
                else:
                    self.calcfirst(label)
                    fset = self.first[label]
                totalset.update(fset)
                overlapcheck[label] = fset
            else:
                totalset[label] = 1
                overlapcheck[label] = {label: 1}
        inverse = {}
        for label, itsfirst in overlapcheck.iteritems():
            for symbol in itsfirst:
                if symbol in inverse:
                    raise ValueError("rule %s is ambiguous; %s is in the"
                                     " first sets of %s as well as %s" %
                                     (name, symbol, label, inverse[symbol]))
                inverse[symbol] = label
        self.first[name] = totalset

    def parse(self):
        dfas = {}
        startsymbol = None
        # MSTART: (NEWLINE | RULE)* ENDMARKER
        while self.type != token.ENDMARKER:
            while self.type == token.NEWLINE:
                self.gettoken()
            # RULE: NAME ':' RHS NEWLINE
            name = self.expect(token.NAME)
            self.expect(token.OP, ":")
            a, z = self.parse_rhs()
            self.expect(token.NEWLINE)
            #self.dump_nfa(name, a, z)
            dfa = self.make_dfa(a, z)
            #self.dump_dfa(name, dfa)
            oldlen = len(dfa)
            self.simplify_dfa(dfa)
            newlen = len(dfa)
            dfas[name] = dfa
            #print name, oldlen, newlen
            if startsymbol is None:
                startsymbol = name
        return dfas, startsymbol

    def make_dfa(self, start, finish):
        # To turn an NFA into a DFA, we define the states of the DFA
        # to correspond to *sets* of states of the NFA.  Then do some
        # state reduction.  Let's represent sets as dicts with 1 for
        # values.
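        # Added example (not in the original source): if the epsilon
        # closure of the start state contains NFA states A and B, and
        # both have an arc labeled 'NAME', the resulting DFA state gets
        # a single 'NAME' arc whose target set is the union of the
        # closures of both targets.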
        assert isinstance(start, NFAState)
        assert isinstance(finish, NFAState)

        def closure(state):
            base = {}
            addclosure(state, base)
            return base

        def addclosure(state, base):
            assert isinstance(state, NFAState)
            if state in base:
                return
            base[state] = 1
            for label, next in state.arcs:
                if label is None:
                    addclosure(next, base)

        states = [DFAState(closure(start), finish)]
        for state in states:  # NB states grows while we're iterating
            arcs = {}
            for nfastate in state.nfaset:
                for label, next in nfastate.arcs:
                    if label is not None:
                        addclosure(next, arcs.setdefault(label, {}))
            for label, nfaset in sorted(arcs.iteritems()):
                for st in states:
                    if st.nfaset == nfaset:
                        break
                else:
                    st = DFAState(nfaset, finish)
                    states.append(st)
                state.addarc(st, label)
        return states  # List of DFAState instances; first one is start

    def dump_nfa(self, name, start, finish):
        print "Dump of NFA for", name
        todo = [start]
        for i, state in enumerate(todo):
            print "  State", i, state is finish and "(final)" or ""
            for label, next in state.arcs:
                if next in todo:
                    j = todo.index(next)
                else:
                    j = len(todo)
                    todo.append(next)
                if label is None:
                    print "    -> %d" % j
                else:
                    print "    %s -> %d" % (label, j)

    def dump_dfa(self, name, dfa):
        print "Dump of DFA for", name
        for i, state in enumerate(dfa):
            print "  State", i, state.isfinal and "(final)" or ""
            for label, next in sorted(state.arcs.iteritems()):
                print "    %s -> %d" % (label, dfa.index(next))

    def simplify_dfa(self, dfa):
        # This is not theoretically optimal, but works well enough.
        # Algorithm: repeatedly look for two states that have the same
        # set of arcs (same labels pointing to the same nodes) and
        # unify them, until things stop changing.

        # dfa is a list of DFAState instances
        changes = True
        while changes:
            changes = False
            for i, state_i in enumerate(dfa):
                for j in range(i+1, len(dfa)):
                    state_j = dfa[j]
                    if state_i == state_j:
                        #print "  unify", i, j
                        del dfa[j]
                        for state in dfa:
                            state.unifystate(state_j, state_i)
                        changes = True
                        break

    def parse_rhs(self):
        # RHS: ALT ('|' ALT)*
        a, z = self.parse_alt()
        if self.value != "|":
            return a, z
        else:
            aa = NFAState()
            zz = NFAState()
            aa.addarc(a)
            z.addarc(zz)
            while self.value == "|":
                self.gettoken()
                a, z = self.parse_alt()
                aa.addarc(a)
                z.addarc(zz)
            return aa, zz

    def parse_alt(self):
        # ALT: ITEM+
        a, b = self.parse_item()
        while (self.value in ("(", "[") or
               self.type in (token.NAME, token.STRING)):
            c, d = self.parse_item()
            b.addarc(c)
            b = d
        return a, b

    def parse_item(self):
        # ITEM: '[' RHS ']' | ATOM ['+' | '*']
        if self.value == "[":
            self.gettoken()
            a, z = self.parse_rhs()
            self.expect(token.OP, "]")
            a.addarc(z)
            return a, z
        else:
            a, z = self.parse_atom()
            value = self.value
            if value not in ("+", "*"):
                return a, z
            self.gettoken()
            z.addarc(a)
            if value == "+":
                return a, z
            else:
                return a, a

    def parse_atom(self):
        # ATOM: '(' RHS ')' | NAME | STRING
        if self.value == "(":
            self.gettoken()
            a, z = self.parse_rhs()
            self.expect(token.OP, ")")
            return a, z
        elif self.type in (token.NAME, token.STRING):
            a = NFAState()
            z = NFAState()
            a.addarc(z, self.value)
            self.gettoken()
            return a, z
        else:
            self.raise_error("expected (...) or NAME or STRING, got %s/%s",
                             self.type, self.value)
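    # Added note (not in the original source): parse_rhs/parse_alt/
    # parse_item/parse_atom implement the meta-grammar of CPython's
    # Grammar file.  For example, a rule such as
    #
    #     while_stmt: 'while' test ':' suite ['else' ':' suite]
    #
    # is one RHS with a single ALT of five ITEMs, the last ITEM being an
    # optional '[' RHS ']' group; parse_atom handles the bare NAMEs and
    # the quoted STRING tokens inside it.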
    def expect(self, type, value=None):
        if self.type != type or (value is not None and self.value != value):
            self.raise_error("expected %s/%s, got %s/%s",
                             type, value, self.type, self.value)
        value = self.value
        self.gettoken()
        return value

    def gettoken(self):
        tup = self.generator.next()
        while tup[0] in (tokenize.COMMENT, tokenize.NL):
            tup = self.generator.next()
        self.type, self.value, self.begin, self.end, self.line = tup
        #print token.tok_name[self.type], repr(self.value)

    def raise_error(self, msg, *args):
        if args:
            try:
                msg = msg % args
            except:
                msg = " ".join([msg] + map(str, args))
        raise SyntaxError(msg, (self.filename, self.end[0],
                                self.end[1], self.line))


class NFAState(object):

    def __init__(self):
        self.arcs = []  # list of (label, NFAState) pairs

    def addarc(self, next, label=None):
        assert label is None or isinstance(label, str)
        assert isinstance(next, NFAState)
        self.arcs.append((label, next))


class DFAState(object):

    def __init__(self, nfaset, final):
        assert isinstance(nfaset, dict)
        assert isinstance(iter(nfaset).next(), NFAState)
        assert isinstance(final, NFAState)
        self.nfaset = nfaset
        self.isfinal = final in nfaset
        self.arcs = {}  # map from label to DFAState

    def addarc(self, next, label):
        assert isinstance(label, str)
        assert label not in self.arcs
        assert isinstance(next, DFAState)
        self.arcs[label] = next

    def unifystate(self, old, new):
        for label, next in self.arcs.iteritems():
            if next is old:
                self.arcs[label] = new

    def __eq__(self, other):
        # Equality test -- ignore the nfaset instance variable
        assert isinstance(other, DFAState)
        if self.isfinal != other.isfinal:
            return False
        # Can't just return self.arcs == other.arcs, because that
        # would invoke this method recursively, with cycles...
        if len(self.arcs) != len(other.arcs):
            return False
        for label, next in self.arcs.iteritems():
            if next is not other.arcs.get(label):
                return False
        return True

    __hash__ = None  # For Py3 compatibility.


def generate_grammar(filename="Grammar.txt"):
    p = ParserGenerator(filename)
    return p.make_grammar()
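

# ----------------------------------------------------------------------
# Added illustrative sketch (not part of the original module).  It shows
# how the entry point above is typically driven.  "Grammar.txt" is only
# an example path: any file written in the same meta-grammar format will
# do.  Because of the relative imports at the top, run this through its
# package (e.g. "python -m lib2to3.pgen2.pgen", package path assumed)
# rather than as a bare script.

if __name__ == "__main__":
    import sys
    source = sys.argv[1] if len(sys.argv) > 1 else "Grammar.txt"
    g = generate_grammar(source)
    print "start symbol:", g.number2symbol[g.start]
    print "nonterminals:", len(g.dfas)
    print "labels:      ", len(g.labels)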