Transforms and DataBlocks

Utils

get_splits[source]

get_splits(dataset, train='train', valid='validation')
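A minimal usage sketch, assuming get_splits simply picks the named train/valid splits out of a HuggingFace datasets.DatasetDict (the dataset and split names here are illustrative):

from datasets import load_dataset

ds = load_dataset('glue', 'sst2')  # has 'train' and 'validation' splits
# assumption: returns the two named splits for downstream DataBlock use
splits = get_splits(ds, train='train', valid='validation')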

class TitledStrEx[source]

TitledStrEx(*args, **kwargs) :: UserString

A TitledStr with an option to set the label

class PreprocCategorize[source]

PreprocCategorize(vocab=None) :: DisplayedTransform

Transform for properly displaying preprocessed categorical labels

PreprocCategoryBlock[source]

PreprocCategoryBlock(vocab:Optional[List[T]]=None)

TransformBlock for preprocessed categorical labels, with an optional vocab of label names
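For example, when the labels are already integer-encoded, passing the label names makes them display properly (the vocab values here are illustrative):

blocks = [TransformersTextBlock(tokenizer=tokenizer),
          PreprocCategoryBlock(vocab=['negative', 'positive'])]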

class TextGetter[source]

TextGetter(s1:str='text', s2:str=None, prefix1:str='', prefix2:str='') :: ItemTransform

Retrieves text fields s1 and (optionally) s2, adding the corresponding prefixes
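A hedged example for paired-text tasks; the field names and prefixes are illustrative:

get_x = TextGetter('question', 'context', prefix1='question: ', prefix2='context: ')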

class KeyGetter[source]

KeyGetter(keys:Iterable[T_co]) :: ItemTransform

Returns a dict with the given keys retrieved from the input sample
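A quick sketch, assuming it filters each sample dict down to the listed keys (the keys are illustrative):

get_x = KeyGetter(['input_ids', 'attention_mask'])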

class TransTensorText[source]

TransTensorText(x, **kwargs) :: TensorText

Semantic type for a tensor representing text

find_first[source]

find_first(t, e)

split_by_sep[source]

split_by_sep(t, sep_tok_id)
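A hedged illustration of the assumed behaviour of these helpers on a tensor of token ids (the concrete ids are made up):

t = tensor([101, 2023, 102, 2003, 102])
i = find_first(t, 102)                    # assumed: index of the first occurrence of 102
parts = split_by_sep(t, sep_tok_id=102)   # assumed: the pieces of t split at the separator id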

Transforms

class TokTransform[source]

TokTransform(pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, is_lm=False, padding=False, truncation=False, max_length=None, preprocessed=False, skip_special_tokens=False, **kwargs) :: Transform

Tokenizes a single piece of text using a pretrained tokenizer

class TokBatchTransform[source]

TokBatchTransform(pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, is_lm=False, with_labels=False, padding=True, truncation=True, max_length=None, do_targets=False, target_pad_id=-100, **kwargs) :: Transform

Tokenizes texts in batches using a pretrained HuggingFace tokenizer. The first element of a batch can be a single string or a 2-tuple of strings. If with_labels=True, the labels are added to the output dictionary.
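Because the transform accepts either single strings or 2-tuples, pair-input tasks can reuse the same text block; a hedged sketch with illustrative column names (tokenizer as defined in the examples below):

dblock = DataBlock(blocks=[TransformersTextBlock(tokenizer=tokenizer), CategoryBlock()],
                   get_x=TextGetter('premise', 'hypothesis'),
                   get_y=ItemGetter('label'),
                   splitter=RandomSplitter())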

class PadBatchTransform[source]

PadBatchTransform(pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, is_lm=False, with_labels=False, padding=True, truncation=True, max_length=None, do_targets=False, target_pad_id=-100, **kwargs) :: Transform

Delegates (__call__,decode,setup) to (encodes,decodes,setups) if split_idx matches

untuple[source]

untuple(l)

to_tuple[source]

to_tuple(x)

TODOs:

  • verify that CLM works as well, and maybe rename masking_func since it will not be used only for masking
  • add permutation LM

class LMBatchTfm[source]

LMBatchTfm(pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, mlm=True, masking_func=None, whole_word_masking=False, mlm_probability=0.15) :: Transform

Collates pretokenized and chunked inputs into a batch and creates labels as defined by masking_func
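For instance, whole-word masking can be switched on at construction time (a sketch; tokenizer as defined in the examples below):

lm_tfm = LMBatchTfm(tokenizer=tokenizer, mlm=True, whole_word_masking=True, mlm_probability=0.15)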

class Undict[source]

Undict(enc=None, dec=None, split_idx=None, order=None) :: ItemTransform

A transform that always takes tuples as items

class UndictS2S[source]

UndictS2S(enc=None, dec=None, split_idx=None, order=None) :: ItemTransform

A transform that always takes tuples as items

DataBlocks

class TransformersTextBlock[source]

TransformersTextBlock(pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, preprocessed=False, do_targets=False, group_by_len=True, is_lm=False, with_labels=False, padding=True, truncation=True, max_length=None, target_pad_id=-100) :: TransformBlock

A TransformBlock for texts using pretrained tokenizers from HuggingFace

DataLoaders for classification

path = untar_data(URLs.IMDB_SAMPLE)
texts = pd.read_csv(path/'texts.csv')

model_name = 'distilbert-base-uncased'
max_len = 128
bs = 8
val_bs = 16
tokenizer = AutoTokenizer.from_pretrained(model_name)
dblock = DataBlock(blocks = [TransformersTextBlock(tokenizer=tokenizer),
                             CategoryBlock()],
                   get_x=ItemGetter('text'),
                   get_y=ItemGetter('label'),
                   splitter=ColSplitter())
dls = dblock.dataloaders(texts, bs=bs, val_bs=val_bs)
dls.show_batch(max_n=4)
text category
0 raising victor vargas : a review < br / > < br / > you know, raising victor vargas is like sticking your hands into a big, steaming bowl of oatmeal. it's warm and gooey, but you're not sure if it feels right. try as i might, no matter how warm and gooey raising victor vargas became i was always aware that something didn't quite feel right. victor vargas suffers from a certain overconfidence on the director's part. apparently, the director thought that the ethnic backdrop of a latino family on the lower east side, and an idyllic storyline would make the film critic proof. he was right, but it didn't fool me. raising victor vargas is the story about a seventeen - year old boy called, you guessed it, victor vargas ( victor rasuk ) who lives his teenage years chasing more skirt than the rolling stones could do negative
1 the shop around the corner is one of the sweetest and most feel - good romantic comedies ever made. there's just no getting around that, and it's hard to actually put one's feeling for this film into words. it's not one of those films that tries too hard, nor does it come up with the oddest possible scenarios to get the two protagonists together in the end. in fact, all its charm is innate, contained within the characters and the setting and the plot... which is highly believable to boot. it's easy to think that such a love story, as beautiful as any other ever told, * could * happen to you... a feeling you don't often get from other romantic comedies, however sweet and heart - warming they may be. < br / > < br / > alfred kralik ( james stewart ) and clara novak ( margaret positive
2 now that che ( 2008 ) has finished its relatively short australian cinema run ( extremely limited release : 1 screen in sydney, after 6wks ), i can guiltlessly join both hosts of " at the movies " in taking steven soderbergh to task. < br / > < br / > it's usually satisfying to watch a film director change his style / subject, but soderbergh's most recent stinker, the girlfriend experience ( 2009 ), was also missing a story, so narrative ( and editing? ) seem to suddenly be soderbergh's main challenge. strange, after 20 - odd years in the business. he was probably never much good at narrative, just hid it well inside " edgy " projects. < br / > < br / > none of this excuses him this present, almost diabolical failure. as david stratton warns, " two parts of che don't ( even negative
3 this film sat on my tivo for weeks before i watched it. i dreaded a self - indulgent yuppie flick about relationships gone bad. i was wrong ; this was an engrossing excursion into the screwed - up libidos of new yorkers. < br / > < br / > the format is the same as max ophuls'" la ronde, " based on a play by arthur schnitzler, who is given an " inspired by " credit. it starts from one person, a prostitute, standing on a street corner in brooklyn. she is picked up by a home contractor, who has sex with her on the hood of a car, but can't come. he refuses to pay her. when he's off peeing, she answers his cell phone and takes a message. she runs away with his keys. < br / > < br / > then the story switches to positive

HuggingFace models can compute the loss themselves; to use the loss computed by the model, pass with_labels=True to the block constructor. The show_batch result doesn't change, but the labels are now moved into the input dict, which is the first element of a batch.

dblock = DataBlock(blocks = [TransformersTextBlock(tokenizer=tokenizer, with_labels=True), CategoryBlock()],
                   get_x=ItemGetter('text'),
                   get_y=ItemGetter('label'),
                   splitter=ColSplitter())
dls = dblock.dataloaders(texts, bs=8)
dls.show_batch(max_n=4)
text category
0 raising victor vargas : a review < br / > < br / > you know, raising victor vargas is like sticking your hands into a big, steaming bowl of oatmeal. it's warm and gooey, but you're not sure if it feels right. try as i might, no matter how warm and gooey raising victor vargas became i was always aware that something didn't quite feel right. victor vargas suffers from a certain overconfidence on the director's part. apparently, the director thought that the ethnic backdrop of a latino family on the lower east side, and an idyllic storyline would make the film critic proof. he was right, but it didn't fool me. raising victor vargas is the story about a seventeen - year old boy called, you guessed it, victor vargas ( victor rasuk ) who lives his teenage years chasing more skirt than the rolling stones could do negative
1 the shop around the corner is one of the sweetest and most feel - good romantic comedies ever made. there's just no getting around that, and it's hard to actually put one's feeling for this film into words. it's not one of those films that tries too hard, nor does it come up with the oddest possible scenarios to get the two protagonists together in the end. in fact, all its charm is innate, contained within the characters and the setting and the plot... which is highly believable to boot. it's easy to think that such a love story, as beautiful as any other ever told, * could * happen to you... a feeling you don't often get from other romantic comedies, however sweet and heart - warming they may be. < br / > < br / > alfred kralik ( james stewart ) and clara novak ( margaret positive
2 well, what can i say. < br / > < br / > " what the bleep do we know " has achieved the nearly impossible - leaving behind such masterpieces of the genre as " the postman ", " the dungeon master ", " merlin ", and so fourth, it will go down in history as the single worst movie i have ever seen in its entirety. and that, ladies and gentlemen, is impressive indeed, for i have seen many a bad movie. < br / > < br / > this masterpiece of modern cinema consists of two interwoven parts, alternating between a silly and contrived plot about an extremely annoying photographer, abandoned by her husband and forced to take anti - depressants to survive, and a bunch of talking heads going on about how quantum physics supposedly justifies their new - agy pseudo - philosophy. basically, if negative
3 the year 2005 saw no fewer than 3 filmed productions of h. g. wells'great novel, " war of the worlds ". this is perhaps the least well - known and very probably the best of them. no other version of wotw has ever attempted not only to present the story very much as wells wrote it, but also to create the atmosphere of the time in which it was supposed to take place : the last year of the 19th century, 1900 using wells'original setting, in and near woking, england. < br / > < br / > imdb seems unfriendly to what they regard as " spoilers ". that might apply with some films, where the ending might actually be a surprise, but with regard to one of the most famous novels in the world, it seems positively silly. i have no sympathy for people who have neglected to positive
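One can verify that the labels now live in the input dict by inspecting a raw batch (a quick sanity check, assuming the first batch element behaves like a dict):

xb, yb = dls.one_batch()
assert 'labels' in xb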

Language modeling

class TransformersLMBlock[source]

TransformersLMBlock(pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, mlm=True, masking_func=None, whole_word_masking=False, mlm_probability=0.15, preprocessed=True, group_by_len=False, is_lm=False, padding=False, truncation=False, max_length=None, skip_special_tokens=False) :: TransformBlock

A TransformBlock for language modeling using pretrained tokenizers from HuggingFace

DataLoaders for language modeling

tokenize[source]

tokenize(batch)

group_texts[source]

group_texts(examples)
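The exact definitions of these notebook helpers are not shown, but a plausible sketch following the standard HuggingFace LM-chunking recipe would be (tokenizer and block_size as defined in the example below; the 'text' column name is an assumption):

def tokenize(batch):
    # assumed: tokenize the raw 'text' column without padding or truncation
    return tokenizer(batch['text'])

def group_texts(examples):
    # assumed: concatenate all token sequences, then cut them into block_size chunks,
    # dropping the remainder that doesn't fill a full block
    concatenated = {k: sum(examples[k], []) for k in examples.keys()}
    total_length = (len(concatenated['input_ids']) // block_size) * block_size
    return {k: [v[i:i + block_size] for i in range(0, total_length, block_size)]
            for k, v in concatenated.items()}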

path = untar_data(URLs.IMDB_SAMPLE)
model_name = 'distilbert-base-uncased'
max_length = 128
bs = 8
val_bs = 16
tokenizer = AutoTokenizer.from_pretrained(model_name)

ds = datasets.Dataset.from_csv((path/'texts.csv').as_posix())
ds = ds.map(tokenize, remove_columns=ds.column_names)
block_size = max_length
lm_ds = ds.map(group_texts, batched=True, batch_size=1000)
dblock = DataBlock(blocks=[TransformersLMBlock(tokenizer=tokenizer)],
                   splitter=RandomSplitter())
dls = dblock.dataloaders(lm_ds, bs=bs, val_bs=val_bs)
dls.show_batch(max_n=4)
text
0 get any better because [MASK] plot is flawed [MASK] begin with [MASK] it never works [MASK] and like [MASK] predecessors, [MASK] acting is med [MASK]cre. < treat / [MASK] < br [MASK] > [MASK] plot has a [MASK] ending which will surprise any one who has never seen [MASK] movie before [MASK] the ending doesn't [MASK] the story. [MASK] this movie ended [MASK] minutes earlier, it would have worked and have been very satisfying and i [MASK] have thought it more worthwhile [MASK] but here is the spoiler and that [MASK] the end crime does pay because the criminal 42 not caught. i never like this message resulting from a movie. [SEP] [CLS] warning! [MASK] review
1 girls and i think i was really kind giving it a 4 out of 10 kn what could [MASK] hart been a wonderful story [MASK] actually talked set of more or less decent actors became a total farce in my eyes. there are so [MASK] [MASK] [MASK]s in that flick, the women'[MASK] [MASK] is just awful [MASK] most of the scenes are more than unrealistic or seem fake andhra there's no real passion in this movie but a bunch [MASK] actors over [MASK] acting over any limits that it hurts. it's boise funny enough to be a [MASK], it's too [MASK] - sad to really touch, so in my eyes it
2 useless and the direction [MASK] quite unoriginal when it comes to [MASK]ogs scenes. but all that [MASK]'t really matter, for [MASK] the bourne ultimatum [MASK] is an action [MASK]. and the action scenes are rather impressive. [MASK] br / > < br / > everyone here is talking [MASK] the " waterloo scene " and the " tanger pursuit " and everyone [MASK] s right. i [MASK] enjoyed the fight in tanger, that reminds my [MASK] its exaggeration and crazi [MASK] the works of tsui hark. visually inventive scenes, [MASK] of intelligent [MASK] parts and a good reflection on [MASK]'s contemporary [MASK]s
3 close [MASK] is released from prison after being " cured " of her obsession with fur by a psychologist named dr. pavlov ( ugh! ) [MASK] but the " cure " is broken when cruella hears the toll of [MASK] ben, and she once again goes on a mad [MASK] to make herself the perfect coat out of dalmation [MASK]. < br / > < br / > this movie [MASK] bad on so [MASK] levels, starting with the [MASK] that it'[MASK] a " thanksgiving [MASK] schlock " movie designed to suck every last available dime out of the disney [MASK] machine [MASK] glenn [MASK] over - over - over - over - acts

Multiple Choice

class MultiChoiceTransform[source]

MultiChoiceTransform(sentence_keys, ending_keys, pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, with_labels=False, padding=True, truncation=True, max_length=None, **kwargs) :: Transform

Processes inputs for multiple-choice tasks

class MultiChoiceBlock[source]

MultiChoiceBlock(sentence_keys, ending_keys, pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, preprocessed=False, group_by_len=True, with_labels=False, padding=True, truncation=True, max_length=None) :: TransformBlock

A TransformBlock for multiple choice using pretrained tokenizers from HuggingFace
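A hedged usage sketch for a SWAG-style dataset; all column names here are illustrative:

dblock = DataBlock(blocks=[MultiChoiceBlock(sentence_keys=['sent1', 'sent2'],
                                            ending_keys=[f'ending{i}' for i in range(4)],
                                            tokenizer=tokenizer),
                           CategoryBlock()],
                   get_y=ItemGetter('label'),
                   splitter=RandomSplitter())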

Token Classification

class PadTokBatchTransform[source]

PadTokBatchTransform(pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, with_labels=False, padding=True, truncation=True, max_length=None, label_vocab=None, target_pad_id=-100, **kwargs) :: Transform

Delegates (__call__,decode,setup) to (encodes,decodes,setups) if split_idx matches

class TokenClassificationBlock[source]

TokenClassificationBlock(pretrained_model_name=None, tokenizer_cls=AutoTokenizer, config=None, tokenizer=None, with_labels=True, label_vocab=None, group_by_len=True, padding=True, truncation=True, max_length=None, target_pad_id=-100) :: TransformBlock

A TransformBlock for token classification using pretrained tokenizers from HuggingFace
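A hedged sketch for a CoNLL-style NER dataset; the column names, label set, and getter wiring are all assumptions:

labels = ['O', 'B-PER', 'I-PER', 'B-ORG', 'I-ORG', 'B-LOC', 'I-LOC', 'B-MISC', 'I-MISC']
dblock = DataBlock(blocks=[TokenClassificationBlock(tokenizer=tokenizer, label_vocab=labels)],
                   get_x=KeyGetter(['tokens', 'ner_tags']),
                   splitter=RandomSplitter())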

Fin