The seed selection paper is quite interesting! I actually looked into optimizing the noise seed for diffusion model latent space exploration for a course project, and it's fascinating to see how the images change across seeds.

One low-hanging fruit after skimming the arXiv paper: you could probably speed up the optimization by not running the full diffusion process. Either (1) do one-step generation (set the number of inference steps to 1), which should give a reasonable if not pretty image that regresses toward the mean of the distribution, or (2) use the predicted x0 from the first few timesteps, which again isn't pretty but probably gives enough signal to optimize over. I also wonder whether a trick like this could speed up arbitrary test-time optimization techniques (e.g., textual inversion, null-text inversion).
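To make the predicted-x0 idea (option 2) concrete, here's a minimal numpy sketch, assuming the standard DDPM-style noise parameterization; the function and variable names (`predicted_x0`, `abar_t`) are my own for illustration, not from the paper:

```python
import numpy as np

# DDPM-style forward process: x_t = sqrt(abar_t) * x0 + sqrt(1 - abar_t) * eps.
# The predicted-x0 shortcut inverts this using the model's noise estimate,
# so a score/quality objective can be evaluated after only a few steps
# instead of running the full sampling loop.

def predicted_x0(x_t, eps_pred, abar_t):
    """Estimate the clean sample from a noisy latent x_t and the model's
    predicted noise eps_pred, at cumulative alpha-bar abar_t."""
    return (x_t - np.sqrt(1.0 - abar_t) * eps_pred) / np.sqrt(abar_t)

rng = np.random.default_rng(0)
x0 = rng.normal(size=(4, 4))    # stand-in for a clean latent
eps = rng.normal(size=(4, 4))   # noise used in the forward process
abar_t = 0.15                   # an early (high-noise) timestep

x_t = np.sqrt(abar_t) * x0 + np.sqrt(1.0 - abar_t) * eps

# With a perfect noise prediction the reconstruction is exact; a real
# denoiser's eps_pred instead gives a blurry but optimizable estimate.
x0_hat = predicted_x0(x_t, eps, abar_t)
print(np.allclose(x0_hat, x0))  # True
```

In practice you'd plug the denoiser's actual eps prediction in for `eps`, then backprop your objective through `x0_hat` to the seed.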

Cool idea, Grace! And thanks for being the first subscriber to comment on one of my posts!! :)