What do we mean when we say a device? Well, usually a mobile device, basically our phones, since our phones are with us all the time.
We interact with them so many times during the day, and more than that, phones come with a large number of sensors on them, which give us really rich data about the physical world around us.
Another category of devices is what we call edge devices, and this industry has seen a huge explosion in the last few years.
So in the past, what we would have done was to write a lot of rules that were hard-coded, very specific about particular characteristics that we expected to see in parts of the image.
And lastly, we're in a position to take advantage of all the sensor data that's already available and accessible on the device.
So this is all great.
But there's a catch, like there always is, and the catch is that doing on-device ML is hard.
Many of these devices have some pretty tight constraints.
They have small batteries, tight memory, and very little compute power. TensorFlow was built for processing on the server, and it wasn't a great fit for these use cases.
And that is the reason that we built TensorFlow Lite.
It's a lightweight machine learning library for mobile and embedded platforms, so this is a high-level overview of the system.
It consists of a converter, where we convert models from TensorFlow format to TensorFlow Lite format; for efficiency reasons we use a format which is different. It consists of an interpreter, which runs on device. There is a library of ops and kernels, and then we have APIs which allow us to take advantage of hardware acceleration whenever it is available.
TensorFlow Lite is cross-platform, so it works on Android, iOS, and Linux, and a high-level developer workflow here would be to take a trained TensorFlow model, convert it to TensorFlow Lite format, and then update your apps to use the TensorFlow Lite interpreter through the appropriate API.
And what they would do here is to take their trained TensorFlow model, convert it to Core ML using the TensorFlow-to-Core ML converter, and then use the converted model with the Core ML runtime.
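To make that path concrete, here is a minimal sketch using the open-source tfcoreml converter; the parameter names follow that package's Python API, and the 'input:0' and 'softmax:0' tensor names and shapes are placeholder assumptions that will differ for your own graph.

```python
import tfcoreml

# Sketch: converting a frozen TensorFlow graph to a Core ML model with the
# tfcoreml package. File paths, tensor names, and the input shape are
# placeholders for illustration; parameter names may differ across versions.
tfcoreml.convert(
    tf_model_path='frozen_graph.pb',           # frozen TensorFlow GraphDef
    mlmodel_path='converted_model.mlmodel',    # Core ML output file
    output_feature_names=['softmax:0'],        # output tensor(s) in the graph
    input_name_shape_dict={'input:0': [1, 224, 224, 3]},  # input tensor shape
)
```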
So the last bit on performance that I want to talk about is quantization.
And this is a good example of an optimization which cuts across several components in our system.
First of all, what is quantization?
A simple way to think about it is that it refers to techniques to store numbers, and to perform calculations on numbers, in formats that are more compact than 32-bit floating point representations. And why is this important?
Well, for two reasons.
First, model size is a concern for small devices, so the smaller the model, the better it is.
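As a rough illustration of what that compaction looks like, here is a small sketch of the affine 8-bit quantization scheme commonly used; the weight values are made up for the example.

```python
import numpy as np

# Minimal sketch of affine (uniform) 8-bit quantization, for illustration only.
weights = np.array([-0.8, -0.1, 0.0, 0.4, 1.2], dtype=np.float32)

# Derive a scale and zero point that map the observed float range onto int8.
qmin, qmax = -128, 127
scale = (weights.max() - weights.min()) / (qmax - qmin)
zero_point = int(round(qmin - weights.min() / scale))

# Quantize: 8 bits per value instead of 32.
q = np.clip(np.round(weights / scale) + zero_point, qmin, qmax).astype(np.int8)

# Dequantize to see the approximation error introduced.
dequantized = (q.astype(np.float32) - zero_point) * scale
print(q, dequantized)
```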
A simple way of doing quantization is to shrink the weights and biases after training, and we are shortly going to be releasing a tool which developers can use to shrink the size of their models.
In addition to that, we have been actively working on doing quantization at training time. This is an active area of ongoing research, and what we find here is that we are able to get accuracies which are comparable to the floating point models for architectures like MobileNet as well as Inception.
And we recently released a tool which allows developers to use this, and we're working on adding support for more models.
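A minimal sketch of how the post-training path is exposed through the converter's Python API follows; the exact flag name varies across TensorFlow releases (older builds used a post_training_quantize attribute), and 'saved_model_dir' is a placeholder path.

```python
import tensorflow as tf

# Sketch: post-training quantization via the TensorFlow Lite converter.
# 'saved_model_dir' is a placeholder for your exported SavedModel directory.
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model_dir')
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # shrink weights to 8 bits
tflite_model = converter.convert()

with open('model_quantized.tflite', 'wb') as f:
    f.write(tflite_model)
```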
Okay, so I talked about a bunch of performance optimizations.
I will point out here that if you need an op which is not currently supported, you do have the option of using what we call a custom op, and later on in this talk, Andrew will show you how you can do that.
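As a hedged sketch of the converter side of that: a flag lets unsupported operations pass through as custom ops, which you then have to implement and register with the interpreter yourself (typically in C++, not shown here).

```python
import tensorflow as tf

# Sketch: letting the converter emit ops it does not support as custom ops.
# A kernel for each custom op must be registered with the interpreter later,
# otherwise inference will fail at runtime. 'saved_model_dir' is a placeholder.
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model_dir')
converter.allow_custom_ops = True
tflite_model = converter.convert()
```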
Once you're done with TensorFlow training, you typically have a SavedModel, or you might have a GraphDef.
What you need to do first is put this through the converter.
So here I'm showing how to do this within Python.
If you download the normal TensorFlow toolchain that's precompiled, like the pip package, you're able to run the converter. It just takes the SavedModel directory or frozen GraphDef, you specify a filename for the TFLite file you want, and it will output a FlatBuffer on disk that you can now ship to whatever device you want.
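A minimal sketch of that Python flow, assuming a SavedModel directory called 'saved_model_dir' and an output filename of 'converted_model.tflite':

```python
import tensorflow as tf

# Sketch: converting a trained SavedModel to a TensorFlow Lite FlatBuffer.
# Directory and file names are placeholders.
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model_dir')
tflite_model = converter.convert()          # returns the FlatBuffer as bytes

with open('converted_model.tflite', 'wb') as f:
    f.write(tflite_model)                   # ship this file to the device
```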
Now, how do you get it to the device?
You could put it into the app package.
You could also, say, distribute it through a cloud service, where you can update your model on the fly without updating your core application.
Whatever you want to do is possible.
So next, once you've converted... well, you might actually run into some issues doing conversion, because there are a couple of things that could go wrong.
The first one is you need to make sure that you have a frozen GraphDef or a SavedModel.
In this case, what I'm doing is I'm ignoring the input tensors and I'm picking an output tensor.
Now, if you had input tensors and you wanted to add a preprocessor, you could also read the input tensors and then say, oh, multiply by three, and now have a multiply-by-three operation.
This is going to be application dependent.
And of course, as I said before, you don't always need to do this.
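For the frozen-GraphDef path, here is a hedged sketch of naming the input and output tensors explicitly for the converter; the tensor names are placeholders, and in TensorFlow 2.x this constructor lives under tf.compat.v1.lite.

```python
import tensorflow as tf

# Sketch: converting a frozen GraphDef while naming the graph's input and
# output tensors explicitly. 'input' and 'MobileNet/Predictions/Softmax' are
# placeholder names; use the tensor names from your own graph.
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='frozen_graph.pb',
    input_arrays=['input'],
    output_arrays=['MobileNet/Predictions/Softmax'],
)
tflite_model = converter.convert()
```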
So say you have a model like a classification model, something that takes an image in.
Where are you going to get that image?
Well, the obvious place you might get it would be from your device's storage, if it's an image filename, but it could also commonly be a camera. Whatever it might be, you produce a buffer at this point.
It's going to be a float* or char* buffer, and you fill it into our input buffer.
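That description is of the native C++ flow, but the Python interpreter API expresses the same idea compactly; a hedged sketch, with the model path and the 1x224x224x3 float input as placeholder assumptions:

```python
import numpy as np
import tensorflow as tf

# Sketch: running a converted model with the Python TensorFlow Lite
# interpreter. Model path and input shape are placeholders.
interpreter = tf.lite.Interpreter(model_path='converted_model.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# In a real app this buffer would come from an image file or the camera.
image = np.random.rand(1, 224, 224, 3).astype(np.float32)

interpreter.set_tensor(input_details[0]['index'], image)
interpreter.invoke()
predictions = interpreter.get_tensor(output_details[0]['index'])
```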
You don't have to download our source code, even for the tooling parts where you do the conversion from TensorFlow to TensorFlow Lite; you can download the precompiled version of TensorFlow, as I alluded to before.
Great.
So what if you're doing iOS? Well, in that case, you can use the C++ API.
You can also use the Objective-C API, but again, we provide a precompiled binary in the form of a CocoaPod.
Okay, so now that you know how to use TensorFlow Lite, I want to tell you a little bit about what's going to be coming up in TensorFlow Lite. One thing that we've been asked for a lot is adding more operations.
And the third thing, which Sarah already mentioned but I'll mention again, is that we're excited about on-device training. On-device training is really exciting because it allows us to refine a model based on a user's experience.
It's a type of model, and it turns out that our friends in TensorFlow research released a package called Object Detection as part of the TensorFlow models repository, and that basically allows you to use their pretrained model that recognizes many classes.
So what I've done is I want to load that onto a small device.
In this case, we've shown you a lot of things about mobile devices.
I want to show you another small device.
This is a Raspberry Pi. The Raspberry Pi is a really cool example of a device because it's very cheap and easy to get.
So any high school student could have one of these.
You can have many of these and just use them for a dedicated project.