Path: ~//proc/thread-self/root/proc/thread-self/root/usr/lib/python2.7/site-packages/pygments/lexers/
File Content:
special.pyo
# -*- coding: utf-8 -*-
"""
    pygments.lexers.special
    ~~~~~~~~~~~~~~~~~~~~~~~

    Special lexers.

    :copyright: Copyright 2006-2014 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""

import re

from pygments.lexer import Lexer
from pygments.token import Token, Error, Text
from pygments.util import get_choice_opt, text_type, BytesIO

__all__ = ['TextLexer', 'RawTokenLexer']


class TextLexer(Lexer):
    """
    "Null" lexer, doesn't highlight anything.
    """
    name = 'Text only'
    aliases = ['text']
    filenames = ['*.txt']
    mimetypes = ['text/plain']

    def get_tokens_unprocessed(self, text):
        yield 0, Text, text


_ttype_cache = {}

line_re = re.compile('.*?\n')


class RawTokenLexer(Lexer):
    """
    Recreate a token stream formatted with the `RawTokenFormatter`.  This
    lexer raises exceptions during parsing if the token stream in the
    file is malformed.

    Additional options accepted:

    `compress`
        If set to ``"gz"`` or ``"bz2"``, decompress the token stream with
        the given compression algorithm before lexing (default: ``""``).
    """
    name = 'Raw token data'
    aliases = ['raw']
    filenames = []
    mimetypes = ['application/x-pygments-tokens']

    def __init__(self, **options):
        self.compress = get_choice_opt(options, 'compress',
                                       ['', 'none', 'gz', 'bz2'], '')
        Lexer.__init__(self, **options)

    def get_tokens(self, text):
        if isinstance(text, text_type):
            # raw token streams never contain non-ASCII characters
            text = text.encode('ascii')
        if self.compress == 'gz':
            import gzip
            gzipfile = gzip.GzipFile('', 'rb', 9, BytesIO(text))
            text = gzipfile.read()
        elif self.compress == 'bz2':
            import bz2
            text = bz2.decompress(text)

        text = text.strip(b'\n') + b'\n'
        for i, t, v in self.get_tokens_unprocessed(text):
            yield t, v
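The compress handling visible in `get_tokens` above can be exercised with only the standard library. The sketch below (the helper name `decompress_stream` and the sample data are hypothetical, not part of the dump) mirrors the `gz` and `bz2` branches of that method:

```python
import bz2
import gzip
from io import BytesIO


def decompress_stream(data, compress=''):
    """Mirror the RawTokenLexer compress handling: optionally
    gunzip or bunzip2 a raw token stream before parsing."""
    if isinstance(data, str):
        # raw token streams never contain non-ASCII characters
        data = data.encode('ascii')
    if compress == 'gz':
        # same call shape as the lexer: read from an in-memory buffer
        data = gzip.GzipFile('', 'rb', 9, BytesIO(data)).read()
    elif compress == 'bz2':
        data = bz2.decompress(data)
    # normalize trailing newlines, as the lexer does before tokenizing
    return data.strip(b'\n') + b'\n'


raw = b'Token.Text\t"hello"\n'
packed = gzip.compress(raw)
assert decompress_stream(packed, 'gz') == raw
```

Note that the lexer encodes text input to ASCII before decompressing, since a raw token stream produced by `RawTokenFormatter` is byte-oriented.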