diff --git a/docs/contributing/INVOCATIONS.md b/docs/contributing/INVOCATIONS.md
index 212233f497..fb3d8df3eb 100644
--- a/docs/contributing/INVOCATIONS.md
+++ b/docs/contributing/INVOCATIONS.md
@@ -1,8 +1,521 @@
# Invocations
-Invocations represent a single operation, its inputs, and its outputs. These
-operations and their outputs can be chained together to generate and modify
-images.
+Features in InvokeAI are added in the form of modular node-like systems called
+**Invocations**.
+
+An Invocation is simply a single operation that takes in some inputs and gives
+out some outputs. We can then chain multiple Invocations together to create more
+complex functionality.
+
+## Invocations Directory
+
+InvokeAI Invocations can be found in the `invokeai/app/invocations` directory.
+
+You can add your new functionality to one of the existing Invocations in this
+directory or create a new file in this directory as per your needs.
+
+**Note:** _All Invocations must be inside this directory for InvokeAI to
+recognize them as valid Invocations._
+
+## Creating A New Invocation
+
+In order to understand the process of creating a new Invocation, let us actually
+create one.
+
+In our example, let us create an Invocation that will take in an image, resize
+it and output the resized image.
+
+The first things we need to do when creating a new Invocation are:
+
+- Create a new class that derives from a predefined parent class called
+ `BaseInvocation`.
+- The name of every Invocation must end with the word `Invocation` in order for
+ it to be recognized as an Invocation.
+- Every Invocation must have a `docstring` that describes what this Invocation
+ does.
+- Every Invocation must have a unique `type` field defined which becomes its
+  identifier.
+- Invocations are strictly typed. We make use of the native
+ [typing](https://docs.python.org/3/library/typing.html) library and the
+ installed [pydantic](https://pydantic-docs.helpmanual.io/) library for
+ validation.
+
+So let us do that.
+
+```python
+from typing import Literal
+from .baseinvocation import BaseInvocation
+
+class ResizeInvocation(BaseInvocation):
+ '''Resizes an image'''
+ type: Literal['resize'] = 'resize'
+```
+
+That's great.
+
+Now we have set up the base of our new Invocation. Let us think about what inputs
+our Invocation takes.
+
+- We need an `image` that we are going to resize.
+- We need new `width` and `height` values to which we want to resize the
+  image.
+
+### **Inputs**
+
+Every Invocation input is a pydantic `Field` and like everything else should be
+strictly typed and defined.
+
+So let us create these inputs for our Invocation. First up, the `image` input we
+need. Generally, we can use standard variable types in Python but InvokeAI
+already has a custom `ImageField` type that handles all the stuff that is needed
+for image inputs.
+
+But what is this `ImageField`? It is a special class type specifically
+written to handle how images are dealt with in InvokeAI. We will cover how to
+create your own custom field types later in this guide. For now, let's go ahead
+and use it.
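+
+If you are curious, here is a rough sketch of what such a field looks like. It
+is simplified and inferred from how `ImageField` is used later in this guide,
+not copied from the real definition, so treat it as orientation only.
+
+```python
+from pydantic import BaseModel, Field
+
+class ImageField(BaseModel):
+    '''A reference to an image (illustrative sketch, not the real class)'''
+    image_name: str = Field(description="The name of the image")
+    image_origin: str = Field(description="The origin of the image resource")
+```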
+
+```python
+from typing import Literal, Union
+from pydantic import Field
+
+from .baseinvocation import BaseInvocation
+from ..models.image import ImageField
+
+class ResizeInvocation(BaseInvocation):
+ '''Resizes an image'''
+ type: Literal['resize'] = 'resize'
+
+ # Inputs
+ image: Union[ImageField, None] = Field(description="The input image", default=None)
+```
+
+Let us break down our input code.
+
+```python
+image: Union[ImageField, None] = Field(description="The input image", default=None)
+```
+
+| Part | Value | Description |
+| --------- | ---------------------------------------------------- | -------------------------------------------------------------------------------------------------- |
+| Name | `image` | The variable that will hold our image |
+| Type Hint | `Union[ImageField, None]` | The types for our field. Indicates that the image can either be an `ImageField` type or `None` |
+| Field | `Field(description="The input image", default=None)` | The image variable is a field which needs a description and a default value that we set to `None`. |
+
+Great. Now let us create our other inputs for `width` and `height`.
+
+```python
+from typing import Literal, Union
+from pydantic import Field
+
+from .baseinvocation import BaseInvocation
+from ..models.image import ImageField
+
+class ResizeInvocation(BaseInvocation):
+ '''Resizes an image'''
+ type: Literal['resize'] = 'resize'
+
+ # Inputs
+ image: Union[ImageField, None] = Field(description="The input image", default=None)
+ width: int = Field(default=512, ge=64, le=2048, description="Width of the new image")
+ height: int = Field(default=512, ge=64, le=2048, description="Height of the new image")
+```
+
+As you might have noticed, we added two new parameters to the `width` and
+`height` fields called `ge` and `le`. These stand for _greater than or equal
+to_ and _less than or equal to_. There are various other constraint parameters
+for fields that you can find in the **pydantic** documentation.
+
+**Note:** _Any time it is possible to define constraints for our field, we
+should do it so the frontend has more information on how to parse this field._
+
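+For illustration, here is a hypothetical pair of fields using a couple of those
+other constraints. These fields are made up for this example and are not part
+of the Invocation we are building:
+
+```python
+from pydantic import BaseModel, Field
+
+class ExampleInputs(BaseModel):
+    '''Hypothetical fields showing a few more pydantic constraints'''
+    # gt/lt are strict (exclusive) bounds, unlike the inclusive ge/le.
+    steps: int = Field(default=30, gt=0, lt=500, description="Number of steps")
+    # multiple_of restricts a numeric value to a step size.
+    scale: float = Field(default=7.5, multiple_of=0.5, description="A scale value")
+```
+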
+Perfect. We now have our inputs. Let us do something with these.
+
+### **Invoke Function**
+
+The `invoke` function is where all the magic happens. This function receives a
+`context` parameter of type `InvocationContext`, which gives you access to the
+current context of the generation and all the other services that InvokeAI
+provides.
+
+Let us create this function first.
+
+```python
+from typing import Literal, Union
+from pydantic import Field
+
+from .baseinvocation import BaseInvocation, InvocationContext
+from ..models.image import ImageField
+
+class ResizeInvocation(BaseInvocation):
+ '''Resizes an image'''
+ type: Literal['resize'] = 'resize'
+
+ # Inputs
+ image: Union[ImageField, None] = Field(description="The input image", default=None)
+ width: int = Field(default=512, ge=64, le=2048, description="Width of the new image")
+ height: int = Field(default=512, ge=64, le=2048, description="Height of the new image")
+
+ def invoke(self, context: InvocationContext):
+ pass
+```
+
+### **Outputs**
+
+The output of our Invocation will be whatever is returned by this `invoke`
+function. Like with our inputs, we need to strongly type and define our outputs
+too.
+
+What is our output going to be? Another image. Normally you'd have to create a
+type for this but InvokeAI already offers you an `ImageOutput` type that handles
+all the necessary info related to image outputs. So let us use that.
+
+We will cover how to create your own output types later in this guide.
+
+```python
+from typing import Literal, Union
+from pydantic import Field
+
+from .baseinvocation import BaseInvocation, InvocationContext
+from ..models.image import ImageField
+from .image import ImageOutput
+
+class ResizeInvocation(BaseInvocation):
+ '''Resizes an image'''
+ type: Literal['resize'] = 'resize'
+
+ # Inputs
+ image: Union[ImageField, None] = Field(description="The input image", default=None)
+ width: int = Field(default=512, ge=64, le=2048, description="Width of the new image")
+ height: int = Field(default=512, ge=64, le=2048, description="Height of the new image")
+
+ def invoke(self, context: InvocationContext) -> ImageOutput:
+ pass
+```
+
+Perfect. Now that we have our Invocation set up, let us do what we want to do.
+
+- We will first load the image. Generally we do this using the `PIL` library but
+ we can use one of the services provided by InvokeAI to load the image.
+- We will resize the image using `PIL` to the `width` and `height` we defined
+  as inputs.
+- We will output this image in the format we set above.
+
+So let's do that.
+
+```python
+from typing import Literal, Union
+from pydantic import Field
+
+from .baseinvocation import BaseInvocation, InvocationContext
+from ..models.image import ImageField, ResourceOrigin, ImageCategory
+from .image import ImageOutput
+
+class ResizeInvocation(BaseInvocation):
+ '''Resizes an image'''
+ type: Literal['resize'] = 'resize'
+
+ # Inputs
+ image: Union[ImageField, None] = Field(description="The input image", default=None)
+ width: int = Field(default=512, ge=64, le=2048, description="Width of the new image")
+ height: int = Field(default=512, ge=64, le=2048, description="Height of the new image")
+
+ def invoke(self, context: InvocationContext) -> ImageOutput:
+ # Load the image using InvokeAI's predefined Image Service.
+ image = context.services.images.get_pil_image(self.image.image_origin, self.image.image_name)
+
+ # Resizing the image
+ # Because we used the above service, we already have a PIL image. So we can simply resize.
+ resized_image = image.resize((self.width, self.height))
+
+ # Preparing the image for output using InvokeAI's predefined Image Service.
+ output_image = context.services.images.create(
+ image=resized_image,
+ image_origin=ResourceOrigin.INTERNAL,
+ image_category=ImageCategory.GENERAL,
+ node_id=self.id,
+ session_id=context.graph_execution_state_id,
+ is_intermediate=self.is_intermediate,
+ )
+
+ # Returning the Image
+ return ImageOutput(
+ image=ImageField(
+ image_name=output_image.image_name,
+ image_origin=output_image.image_origin,
+ ),
+ width=output_image.width,
+ height=output_image.height,
+ )
+```
+
+**Note:** Do not be overwhelmed by the `ImageOutput` process. InvokeAI has a
+certain way that the images need to be dispatched in order to be stored and read
+correctly. In 99% of the cases when dealing with an image output, you can simply
+copy-paste the template above.
+
+That's it. You made your own **Resize Invocation**.
+
+## Result
+
+Once you make your Invocation correctly, the rest of the process is fully
+automated for you.
+
+When you launch InvokeAI, you can go to `http://localhost:9090/docs` and see
+your new Invocation show up there with all the relevant info.
+
+![resize invocation](../assets/contributing/resize_invocation.png)
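+
+If you prefer to check programmatically, here is a quick sketch. It assumes a
+locally running instance on the default port; the `/openapi.json` URL is the
+standard schema endpoint of FastAPI, which InvokeAI's web server is built on.
+
+```python
+import json
+from urllib.request import urlopen
+
+# Fetch the OpenAPI schema and confirm the new invocation appears in it.
+schema = json.load(urlopen("http://localhost:9090/openapi.json"))
+print("resize" in json.dumps(schema))
+```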
+
+When you launch the frontend UI, you can go to the Node Editor tab and find your
+new Invocation ready to be used.
+
+![resize node editor](../assets/contributing/resize_node_editor.png)
+
+# Advanced
+
+## Custom Input Fields
+
+Now that you know how to create your own Invocations, let us dive into slightly
+more advanced topics.
+
+While creating your own Invocations, you might run into a scenario where the
+existing input types in InvokeAI do not meet your requirements. In such cases,
+you can create your own input types.
+
+Let us create one as an example. Let us say we want to create a color input
+field that represents a color code. But before we start on that, here are some
+general good practices to keep in mind.
+
+**Good Practices**
+
+- There is no naming convention for input fields but we highly recommend that
+ you name it something appropriate like `ColorField`.
+- It is not mandatory but it is heavily recommended to add a relevant
+ `docstring` to describe your input field.
+- Keep your field in the same file as the Invocation that it is made for or in
+ another file where it is relevant.
+
+All input types are classes that derive from `pydantic`'s `BaseModel` type.
+So let's create one.
+
+```python
+from pydantic import BaseModel
+
+class ColorField(BaseModel):
+ '''A field that holds the rgba values of a color'''
+ pass
+```
+
+Perfect. Now let us create our custom inputs for our field. This works exactly
+like creating input fields for your Invocation, and all the same rules apply.
+Let us create four fields representing the _red (r)_, _green (g)_, _blue (b)_
+and _alpha (a)_ channels of the color.
+
+```python
+from pydantic import BaseModel, Field
+
+class ColorField(BaseModel):
+    '''A field that holds the rgba values of a color'''
+    r: int = Field(ge=0, le=255, description="The red channel")
+    g: int = Field(ge=0, le=255, description="The green channel")
+    b: int = Field(ge=0, le=255, description="The blue channel")
+    a: int = Field(ge=0, le=255, description="The alpha channel")
+```
+
+That's it. We now have a new input field type that we can use in our Invocations
+like this.
+
+```python
+color: ColorField = Field(default=ColorField(r=0, g=0, b=0, a=0), description='Background color of an image')
+```
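+
+Inside an Invocation's `invoke` method you could then unpack the channels into
+whatever your image library expects. A minimal sketch, assuming a `self.color`
+input declared as above and Pillow installed:
+
+```python
+from PIL import Image
+
+# Unpack the ColorField into the RGBA tuple that PIL understands.
+rgba = (self.color.r, self.color.g, self.color.b, self.color.a)
+blank_image = Image.new(mode="RGBA", size=(512, 512), color=rgba)
+```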
+
+**Extra Config**
+
+All input fields also take an additional `Config` class that you can use to do
+various advanced things, such as marking parameters as required.
+
+Let us do that for our _ColorField_ and mark all of the values as required,
+because we did not define any defaults for our fields.
+
+```python
+class ColorField(BaseModel):
+ '''A field that holds the rgba values of a color'''
+ r: int = Field(ge=0, le=255, description="The red channel")
+ g: int = Field(ge=0, le=255, description="The green channel")
+ b: int = Field(ge=0, le=255, description="The blue channel")
+ a: int = Field(ge=0, le=255, description="The alpha channel")
+
+ class Config:
+ schema_extra = {"required": ["r", "g", "b", "a"]}
+```
+
+Now it becomes mandatory for the user to supply all the values required by our
+input field.
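+
+You can verify this by inspecting the JSON schema pydantic generates. A quick
+check, assuming pydantic v1 (whose syntax the rest of this guide uses):
+
+```python
+import json
+
+# schema_extra is merged into the generated JSON schema, so the "required"
+# list now contains all four channels.
+print(json.dumps(ColorField.schema(), indent=2))
+```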
+
+We will discuss the `Config` class in more detail later in this guide and how
+you can use it to make your Invocations more robust.
+
+## Custom Output Types
+
+Like with custom inputs, sometimes you might find yourself needing custom
+outputs that InvokeAI does not provide. We can easily set one up.
+
+Now that you are familiar with Invocations and Inputs, let us use that knowledge
+to put together a custom output type for an Invocation that returns _width_,
+_height_ and _background_color_ that we need to create a blank image.
+
+- A custom output type is a class that derives from the parent class of
+ `BaseInvocationOutput`.
+- It is not mandatory but we recommend using names ending with `Output` for
+  output types. So we'll call our class `BlankImageOutput`.
+- It is not mandatory but we highly recommend adding a `docstring` to describe
+ what your output type is for.
+- Like Invocations, each output type should have a `type` variable that is
+  **unique**.
+
+Now that we know the basic rules for creating a new output type, let us go ahead
+and make it.
+
+```python
+from typing import Literal
+from pydantic import Field
+
+from .baseinvocation import BaseInvocationOutput
+
+class BlankImageOutput(BaseInvocationOutput):
+ '''Base output type for creating a blank image'''
+ type: Literal['blank_image_output'] = 'blank_image_output'
+
+    # Outputs (bg_color uses the ColorField we defined earlier)
+ width: int = Field(description='Width of blank image')
+ height: int = Field(description='Height of blank image')
+ bg_color: ColorField = Field(description='Background color of blank image')
+
+ class Config:
+ schema_extra = {"required": ["type", "width", "height", "bg_color"]}
+```
+
+All set. We now have an output type that requires everything we need to create
+a blank image. And if you noticed, we even used the `Config` class to ensure
+the fields are required.
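+
+As a usage sketch, here is how an Invocation could return this output. The
+`BlankImageMetaInvocation` class below is hypothetical, invented purely for
+illustration, and assumes the imports from the earlier snippets:
+
+```python
+class BlankImageMetaInvocation(BaseInvocation):
+    '''Outputs the parameters needed to create a blank image (illustrative)'''
+    type: Literal['blank_image_meta'] = 'blank_image_meta'
+
+    # Inputs
+    width: int = Field(default=512, ge=64, le=2048, description='Width of blank image')
+    height: int = Field(default=512, ge=64, le=2048, description='Height of blank image')
+
+    def invoke(self, context: InvocationContext) -> BlankImageOutput:
+        # Pass the inputs through, defaulting to an opaque black background.
+        return BlankImageOutput(
+            width=self.width,
+            height=self.height,
+            bg_color=ColorField(r=0, g=0, b=0, a=255),
+        )
+```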
+
+## Custom Configuration
+
+As you might have noticed when making inputs and outputs, we used a class called
+`Config` from _pydantic_ to further customize them. Because our inputs and
+outputs essentially inherit from _pydantic_'s `BaseModel` class, all
+[configuration options](https://docs.pydantic.dev/latest/usage/schema/#schema-customization)
+that are valid for _pydantic_ classes are also valid for our inputs and outputs.
+You can do the same for your Invocations too but InvokeAI makes our life a
+little bit easier on that end.
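+
+For instance, any standard pydantic option works here. As a small illustrative
+example (the `allow_mutation` option is plain pydantic v1, not
+InvokeAI-specific), we could freeze our `ColorField` so its values cannot be
+changed after creation:
+
+```python
+class ColorField(BaseModel):
+    '''A field that holds the rgba values of a color'''
+    r: int = Field(ge=0, le=255, description="The red channel")
+    g: int = Field(ge=0, le=255, description="The green channel")
+    b: int = Field(ge=0, le=255, description="The blue channel")
+    a: int = Field(ge=0, le=255, description="The alpha channel")
+
+    class Config:
+        # Standard pydantic option: instances become read-only after creation.
+        allow_mutation = False
+```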
+
+InvokeAI provides a custom configuration class called `InvocationConfig`
+particularly for configuring Invocations. This is exactly the same as the raw
+`Config` class from _pydantic_ with some extra stuff on top to help facilitate
+parsing of the schema in the frontend UI.
+
+At the moment, this `InvocationConfig` class extends the plain `Config` class
+with the following features related to the `ui`.
+
+| Config Option | Field Type | Example |
+| ------------- | ------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------- |
+| type_hints | `Dict[str, Literal["integer", "float", "boolean", "string", "enum", "image", "latents", "model", "control"]]` | `type_hint: "model"` provides type hints related to the model like displaying a list of available models |
+| tags | `List[str]` | `tags: ['resize', 'image']` will classify your invocation under the tags of resize and image. |
+| title         | `str`                                                                                                           | `title: 'Resize Image'` will rename your Invocation to this custom title rather than inferring it from the name of the Invocation class. |
+
+So let us update our `ResizeInvocation` with some extra configuration and see
+how that works.
+
+```python
+from typing import Literal, Union
+from pydantic import Field
+
+from .baseinvocation import BaseInvocation, InvocationContext, InvocationConfig
+from ..models.image import ImageField, ResourceOrigin, ImageCategory
+from .image import ImageOutput
+
+class ResizeInvocation(BaseInvocation):
+ '''Resizes an image'''
+ type: Literal['resize'] = 'resize'
+
+ # Inputs
+ image: Union[ImageField, None] = Field(description="The input image", default=None)
+ width: int = Field(default=512, ge=64, le=2048, description="Width of the new image")
+ height: int = Field(default=512, ge=64, le=2048, description="Height of the new image")
+
+    class Config(InvocationConfig):
+        schema_extra = {
+            "ui": {
+                "tags": ["resize", "image"],
+                "title": "My Custom Resize",
+            },
+        }
+
+ def invoke(self, context: InvocationContext) -> ImageOutput:
+ # Load the image using InvokeAI's predefined Image Service.
+ image = context.services.images.get_pil_image(self.image.image_origin, self.image.image_name)
+
+ # Resizing the image
+ # Because we used the above service, we already have a PIL image. So we can simply resize.
+ resized_image = image.resize((self.width, self.height))
+
+ # Preparing the image for output using InvokeAI's predefined Image Service.
+ output_image = context.services.images.create(
+ image=resized_image,
+ image_origin=ResourceOrigin.INTERNAL,
+ image_category=ImageCategory.GENERAL,
+ node_id=self.id,
+ session_id=context.graph_execution_state_id,
+ is_intermediate=self.is_intermediate,
+ )
+
+ # Returning the Image
+ return ImageOutput(
+ image=ImageField(
+ image_name=output_image.image_name,
+ image_origin=output_image.image_origin,
+ ),
+ width=output_image.width,
+ height=output_image.height,
+ )
+```
+
+We have now customized our code to let the frontend know that our Invocation
+under `resize` and `image` categories. So when the user searches for these
+particular words, our Invocation will show up too.
+
+We also set a custom title for our Invocation. So instead of being called
+`Resize`, it will be called `My Custom Resize`.
+
+As simple as that.
+
+As time goes by, InvokeAI will further improve and add more customizability for
+Invocation configuration. We will have more documentation regarding this at a
+later time.
+
+# **[TODO]**
+
+## Custom Components For Frontend
+
+Every backend input type should have a corresponding frontend component so the
+UI knows what to render when you use a particular field type.
+
+If you are using existing field types, we already have components for those. So
+you don't have to worry about creating anything new. But this might not always
+be the case. Sometimes you might want to create new field types and have the
+frontend UI deal with it in a different way.
+
+This is where we venture into the world of React and JavaScript and create our
+own new components for our Invocations. Do not fear the world of JS. It's
+actually pretty straightforward.
+
+Let us create a new component for our custom color field we created above. When
+we use a color field, let us say we want the UI to display a color picker for
+the user to pick from rather than entering values. That is what we will build
+now.
+
+---
+
+# OLD -- TO BE DELETED OR MOVED LATER
+
+---
## Creating a new invocation
From 82978d3ee5159a3eb8f4dc1cd85539916f351f56 Mon Sep 17 00:00:00 2001
From: Kent Keirsey <31807370+hipsterusername@users.noreply.github.com>
Date: Thu, 6 Jul 2023 11:28:21 -0400
Subject: [PATCH 6/8] Update Combinatorial Setting Information
---
docs/features/PROMPTS.md | 5 +++++
1 file changed, 5 insertions(+)
diff --git a/docs/features/PROMPTS.md b/docs/features/PROMPTS.md
index ae269250e7..1fd4550493 100644
--- a/docs/features/PROMPTS.md
+++ b/docs/features/PROMPTS.md
@@ -331,6 +331,11 @@ For example, the following prompts could be generated from the above Dynamic Pro
A cottage in winter designed in style2, style3
And many more!
+When the `Combinatorial` setting is on, Invoke will disable the "Images" selection and generate every combination until the Max Prompts setting is reached.
+When the `Combinatorial` setting is off, Invoke will generate random combinations until the Images setting is reached.
+For example, a template with two options in one slot and four in another yields eight unique combinations: with `Combinatorial` on and Max Prompts at eight or more, each combination is generated once; with it off and Images set to eight, eight randomly drawn combinations are generated instead.
+
+
+
### Tips and Tricks for Using Dynamic Prompts
Below are some useful strategies for creating Dynamic Prompts:
From 2eddd5db7d0864197d0515f4e8a0527586283998 Mon Sep 17 00:00:00 2001
From: Kent Keirsey <31807370+hipsterusername@users.noreply.github.com>
Date: Thu, 6 Jul 2023 11:52:49 -0400
Subject: [PATCH 7/8] Update and rename TEXTUAL_INVERSION.md to TRAINING.md
---
.../{TEXTUAL_INVERSION.md => TRAINING.md} | 15 +++------------
1 file changed, 3 insertions(+), 12 deletions(-)
rename docs/features/{TEXTUAL_INVERSION.md => TRAINING.md} (94%)
diff --git a/docs/features/TEXTUAL_INVERSION.md b/docs/features/TRAINING.md
similarity index 94%
rename from docs/features/TEXTUAL_INVERSION.md
rename to docs/features/TRAINING.md
index 8f4aa5b167..41197a334f 100644
--- a/docs/features/TEXTUAL_INVERSION.md
+++ b/docs/features/TRAINING.md
@@ -1,9 +1,10 @@
---
-title: Textual-Inversion
+title: Training
---
-# :material-file-document: Textual Inversion
+# :material-file-document: Training
+# Textual Inversion Training
## **Personalizing Text-to-Image Generation**
You may personalize the generated images to provide your own styles or objects
@@ -258,16 +259,6 @@ invokeai-ti \
--only_save_embeds
```
-## Using Embeddings
-
-After training completes, the resultant embeddings will be saved into your `$INVOKEAI_ROOT/embeddings//learned_embeds.bin`.
-
-These will be automatically loaded when you start InvokeAI.
-
-Add the trigger word, surrounded by angle brackets, to use that embedding. For example, if your trigger word was `terence`, use `<terence>` in prompts. This is the same syntax used by the HuggingFace concepts library.
-
-**Note:** `.pt` embeddings do not require the angle brackets.
-
## Troubleshooting
### `Cannot load embedding for <trigger>. It was trained on a model with token dimension 1024, but the current model has token dimension 768`
From 75b28eb79b0631d2e5a3de4743bc31efd53674f4 Mon Sep 17 00:00:00 2001
From: Kent Keirsey <31807370+hipsterusername@users.noreply.github.com>
Date: Thu, 6 Jul 2023 12:22:52 -0400
Subject: [PATCH 8/8] Update CONCEPTS.md
---
docs/features/CONCEPTS.md | 92 +++++++++++++++------------------------
1 file changed, 35 insertions(+), 57 deletions(-)
diff --git a/docs/features/CONCEPTS.md b/docs/features/CONCEPTS.md
index 2d09db3de4..d9988b60ba 100644
--- a/docs/features/CONCEPTS.md
+++ b/docs/features/CONCEPTS.md
@@ -1,9 +1,12 @@
---
-title: Concepts Library
+title: Concepts
---
# :material-library-shelves: The Hugging Face Concepts Library and Importing Textual Inversion files
+As research advances, many new capabilities have become available for customizing a model's knowledge and its understanding of novel concepts not originally contained in the base model.
+
+
## Using Textual Inversion Files
Textual inversion (TI) files are small models that customize the output of
@@ -12,18 +15,16 @@ and artistic styles. They are also known as "embeds" in the machine learning
world.
Each TI file introduces one or more vocabulary terms to the SD model. These are
-known in InvokeAI as "triggers." Triggers are often, but not always, denoted
-using angle brackets as in "<trigger-phrase>". The two most common type of
+known in InvokeAI as "triggers." Triggers are denoted using angle brackets
+as in "<trigger-phrase>". The two most common types of
TI files that you'll encounter are `.pt` and `.bin` files, which are produced by
different TI training packages. InvokeAI supports both formats, but its
-[built-in TI training system](TEXTUAL_INVERSION.md) produces `.pt`.
+[built-in TI training system](TRAINING.md) produces `.pt`.
The [Hugging Face company](https://huggingface.co/sd-concepts-library) has
amassed a large library of >800 community-contributed TI files covering a
-broad range of subjects and styles. InvokeAI has built-in support for this
-library which downloads and merges TI files automatically upon request. You can
-also install your own or others' TI files by placing them in a designated
-directory.
+broad range of subjects and styles. You can also install your own or others' TI files
+by placing them in the designated directory for the compatible model type.
### An Example
@@ -41,66 +42,43 @@ You can also combine styles and concepts:
| :--------------------------------------------------------: |
| ![](../assets/concepts/image5.png) |
-## Using a Hugging Face Concept
-!!! warning "Authenticating to HuggingFace"
-
- Some concepts require valid authentication to HuggingFace. Without it, they will not be downloaded
- and will be silently ignored.
-
- If you used an installer to install InvokeAI, you may have already set a HuggingFace token.
- If you skipped this step, you can:
-
- - run the InvokeAI configuration script again (if you used a manual installer): `invokeai-configure`
- - set one of the `HUGGINGFACE_TOKEN` or `HUGGING_FACE_HUB_TOKEN` environment variables to contain your token
-
- Finally, if you already used any HuggingFace library on your computer, you might already have a token
- in your local cache. Check for a hidden `.huggingface` directory in your home folder. If it
- contains a `token` file, then you are all set.
-
-
-Hugging Face TI concepts are downloaded and installed automatically as you
-require them. This requires your machine to be connected to the Internet. To
-find out what each concept is for, you can browse the
-[Hugging Face concepts library](https://huggingface.co/sd-concepts-library) and
-look at examples of what each concept produces.
-
-To load concepts, you will need to open the Web UI's configuration
-dialogue and activate "Show Textual Inversions from HF Concepts
-Library". This will then add a list of HF Concepts to the dropdown
-"Add Textual Inversion" menu. Select the concept(s) of your choice and
-they will be incorporated into the positive prompt. A few concepts are
-designed for the negative prompt, in which case you can add them to
-the negative prompt box by select the down arrow icon next to the
-textual inversion menu.
-
-There are nearly 1000 HF concepts, more than will fit into a menu. For
-this reason we only show the most popular concepts (those which have
-received 5 or more likes). If you wish to use a concept that is not on
-the list, you may simply type its name surrounded by brackets. For
-example, to load the concept named "xidiversity", add `<xidiversity>`
-to the positive or negative prompt text.
## Installing your Own TI Files
You may install any number of `.pt` and `.bin` files simply by copying them into
-the `embeddings` directory of the InvokeAI runtime directory (usually `invokeai`
-in your home directory). You may create subdirectories in order to organize the
-files in any way you wish. Be careful not to overwrite one file with another.
+the `embedding` directory of the corresponding InvokeAI models directory (usually `invokeai`
+in your home directory). For example, you can simply move a Stable Diffusion 1.5 embedding file to
+the `sd-1/embedding` folder. Be careful not to overwrite one file with another.
For example, TI files generated by the Hugging Face toolkit share the name
-`learned_embedding.bin`. You can use subdirectories to keep them distinct.
+`learned_embedding.bin`. You can rename these, or use subdirectories to keep them distinct.
-At startup time, InvokeAI will scan the `embeddings` directory and load any TI
-files it finds there. At startup you will see a message similar to this one:
+At startup time, InvokeAI will scan the various `embedding` directories and load any TI
+files it finds there for compatible models. At startup you will see a message similar to this one:
```bash
>> Current embedding manager terms: ,
```
+To use these when generating, simply type the `<` key in your prompt to open the Textual Inversion WebUI and
+select the embedding you'd like to use. This UI has type-ahead support, so you can easily find supported embeddings.
-The terms you can use will appear in the "Add Textual Inversion"
-dropdown menu above the HF Concepts.
+## Using LoRAs
-## Further Reading
+LoRA files are models that customize the output of Stable Diffusion image generation.
+Larger than embeddings, but much smaller than full models, they augment SD with improved
+understanding of subjects and artistic styles.
+
+Unlike TI files, LoRAs do not introduce novel vocabulary into the model's known tokens. Instead,
+LoRAs augment the model's weights that are applied to generate imagery. LoRAs may be supplied
+with a "trigger" word that they have been explicitly trained on, or may simply apply their
+effect without being triggered.
+
+LoRAs are typically stored in .safetensors files, which are the most secure way to store and transmit
+these types of weights. You may install any number of `.safetensors` LoRA files simply by copying them into
+the `lora` directory of the corresponding InvokeAI models directory (usually `invokeai`
+in your home directory). For example, you can simply move a Stable Diffusion 1.5 LoRA file to
+the `sd-1/lora` folder.
+
+To use these when generating, open the LoRA menu item in the options panel, select the LoRAs you want to apply
+and ensure that they have the appropriate weight recommended by the model provider. Typically, most LoRAs perform best at a weight of 0.75-1.
-Please see [the repository](https://github.com/rinongal/textual_inversion) and
-associated paper for details and limitations.