Damian at mba
54e6a68acb
wip bringing cross-attention to PLMS and DDIM
2022-10-19 21:08:03 +02:00
Damian at mba
09f62032ec
cleanup and clarify comments
2022-10-19 21:08:03 +02:00
Damian at mba
711ffd238f
cleanup
2022-10-19 21:08:03 +02:00
Damian at mba
056cb0d8a8
sliced cross-attention wrangler works
2022-10-19 21:08:03 +02:00
Damian at mba
37a204324b
go back to using InvokeAI attention
2022-10-19 21:08:03 +02:00
Damian at mba
1fc1f8bf05
cross-attention working with placeholder {} syntax
2022-10-19 21:06:42 +02:00
Damian at mba
8ff507b03b
runs but doesn't work properly - see below for test prompt
...
test prompt:
"a cat sitting on a car {a dog sitting on a car}" -W 384 -H 256 -s 10 -S 12346 -A k_euler
note that substitution of dog for cat is currently hard-coded (ksampler.py
lines 43-44)
2022-10-19 21:06:42 +02:00
Damian at mba
33d6603fef
cleanup initial experiments
2022-10-19 21:06:42 +02:00