
What I did

  • Monday: Finished the script to generate thousands of realistically disordered structures.
  • Tuesday: Wrote a script that loops through the results of the simulations I submitted yesterday and produces particle-averaged spectra (a rough sketch follows this list). The script is finished but untested because the job is still running; I also submitted the files to the cluster to run.
  • Wednesday: Day off. Helped my grandmother move. Still waiting for cluster results.
  • Thursday: Created a nice Plotly figure comparing the particle-averaged FEFF simulations to the Gaussian-averaged ones.
  • Friday: Wrote a Python script that loops through all simulation structures and calculates the actual MSRD (sketched after this list); it writes its output to fixed-cipher.csv.
  • Saturday: Thought about how to go about making the \(g(r)\) figure. Worked on running the particle-averaging.py script.
  • Sunday: Got the neural network to compile and fit (minimal sketch after this list). I've only built it for linear regression so far, but the code works, so I can start building up a more complex network.
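
For reference, here is a rough sketch of what the particle-averaging loop could look like. The feff_runs/*/chi.dat glob pattern and the column layout are assumptions for illustration, not the actual layout of my FEFF output.

```python
# Hypothetical sketch: average chi(k) over many FEFF runs.
# Assumes each run directory holds a chi.dat whose first two columns
# are k and chi(k) on a common k-grid; adjust to the real output format.
import glob
import numpy as np

def particle_average(pattern="feff_runs/*/chi.dat"):
    spectra = []
    k = None
    for path in sorted(glob.glob(pattern)):
        data = np.loadtxt(path, comments="#")
        k = data[:, 0]              # k-grid (assumed identical across runs)
        spectra.append(data[:, 1])  # chi(k) for this particle/site
    return k, np.mean(spectra, axis=0)  # particle-averaged chi(k)

if __name__ == "__main__":
    k, chi_avg = particle_average()
    np.savetxt("chi_particle_avg.dat", np.column_stack([k, chi_avg]))
```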
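
The MSRD calculation itself is simple once the bond lengths are collected: the mean-square relative displacement is just the variance of the absorber-neighbor distance over the ensemble. A minimal sketch, with the list of bond lengths as a placeholder input:

```python
import numpy as np

def msrd(bond_lengths):
    """MSRD (sigma^2): variance of the absorber-neighbor distance
    over all disordered structures in the ensemble."""
    r = np.asarray(bond_lengths, dtype=float)
    return np.mean((r - r.mean()) ** 2)

# e.g. bond lengths (in Angstroms) gathered from every structure:
# sigma2 = msrd([2.51, 2.48, 2.53, 2.47])
```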
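
And a minimal sketch of the linear-regression network's compile-and-fit step. I'm assuming Keras here (which fits the TensorBoard notes below); the array shapes and hyperparameters are placeholders.

```python
import numpy as np
import tensorflow as tf

# Placeholder data: 100 spectra of 200 points each, mapped to one
# disorder parameter; stand-ins for the real training set.
X = np.random.rand(100, 200).astype("float32")
y = np.random.rand(100, 1).astype("float32")

# Linear regression = a single Dense unit with no activation.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(X.shape[1],)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Log the run so it can be inspected in TensorBoard later.
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="logs")
model.fit(X, y, epochs=50, batch_size=16, callbacks=[tensorboard_cb], verbose=0)
```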

What I learned

  • Monday: Turns out the MSD values I had measured were off by a factor of 10 because they were being square-rooted an extra time (square-rooting a value like 0.01 an extra time gives 0.1, ten times too large).
  • Tuesday: I learned about pivot tables and how useful they can be. Pandas DataFrames have a df.pivot_table method that might do exactly what you want when df.groupby doesn’t (example after this list).
  • Wednesday: -
  • Thursday: The plots suggest my Gaussian method might work after all. Since I have both datasets now, there’s no reason I can’t build two neural networks, or train a single one on one dataset or the other and see how it does. See “What I will do next”.
  • Friday: Some of the anomalies might be because the greater disorder causes a sort of buckling. It will be cool to create a \(g(r)\) plot of each disordered structure; I need to think about how to go about this, specifically the binning (a sketch follows this list). I’ll bring it up at next Tuesday’s meeting.
  • Sunday: TensorBoard integration in VS Code Insiders sometimes doesn’t work. I got it working by launching TensorBoard from Conda instead and opening the localhost page in a browser.
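
To illustrate the pivot-table point, here is a toy example; the column names and values are made up, not from my actual results tables.

```python
import pandas as pd

# Hypothetical long-format results table.
df = pd.DataFrame({
    "structure": ["A", "A", "B", "B", "A", "B"],
    "method":    ["particle", "gaussian", "particle", "gaussian", "particle", "gaussian"],
    "msrd":      [0.012, 0.011, 0.020, 0.018, 0.013, 0.021],
})

# groupby gives one aggregated value per (structure, method) pair...
grouped = df.groupby(["structure", "method"])["msrd"].mean()

# ...but pivot_table reshapes it into a structure-by-method grid in one call,
# which is often what you actually want for comparison tables or plots.
wide = df.pivot_table(index="structure", columns="method", values="msrd", aggfunc="mean")
print(wide)
```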
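
And a first stab at the \(g(r)\) binning, as a rough sketch only. It assumes an isolated (non-periodic) particle and a placeholder positions array; the shell-volume normalization is the part I still need to think through for the real structures.

```python
import numpy as np

def g_of_r(positions, r_max=8.0, dr=0.05):
    """Pair distribution g(r) from atomic positions (N x 3, Angstroms).
    Bins all pair distances and normalizes each shell by its volume and
    the average number density; a rough, non-periodic version."""
    positions = np.asarray(positions, dtype=float)
    n = len(positions)

    # All unique pair distances.
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    pair_d = dists[np.triu_indices(n, k=1)]

    edges = np.arange(0.0, r_max + dr, dr)
    counts, _ = np.histogram(pair_d, bins=edges)
    r = 0.5 * (edges[:-1] + edges[1:])

    shell_vol = 4.0 * np.pi * r**2 * dr
    rho = n / ((4.0 / 3.0) * np.pi * r_max**3)  # crude average density
    # Factor of 2/n: counts are unique pairs, g(r) is per reference atom.
    g = 2.0 * counts / (n * rho * shell_vol)
    return r, g
```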

What I will do next

  • I like the idea of training the network on the Gaussian-averaged data and then using it to predict on the particle-averaged data.
  • I want to figure out why the TensorBoard integration is being so finicky. I also want to dig deeper into the TensorBoard graph view to understand the compute graph better.