More Functions in LabVIEW! (Week 6)
Monday: Today I did more LabVIEW programming. In the Functions palette, I found a function called “Trigger and Gate.” After reading about what it does, I programmed it to find the area under a specified peak instead of the area under the curve for the entire iteration. Once you specify a voltage amplitude at which the program should trigger, it performs whatever operation you define on each triggered event. In my case, I told it to trigger above a certain threshold, so that I get the area under each peak that crosses it. This way, I can see whether the area under a peak corresponds to a property of the particle, such as shape or size. The settings for this function, however, can be very picky about what it detects. Initially it wasn’t working the way I wanted, but once I figured out the optimum settings, it behaved as expected. Now I have to decide whether it is useful enough to integrate into the main LabVIEW VI.
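The trigger-above-a-threshold idea can be sketched outside of LabVIEW as well. Below is a minimal Python illustration (not the actual VI, and the function name and sample data are made up for the example): it finds each region where the signal crosses the threshold and integrates the baseline-subtracted area under that peak with the trapezoidal rule.

```python
import numpy as np

def peak_areas(signal, threshold, dt=1.0):
    """Return the area under each region where `signal` exceeds `threshold`.

    Each rising edge acts like a trigger; each falling edge closes the gate.
    Areas are trapezoidal integrals of the threshold-subtracted samples."""
    above = signal > threshold
    areas = []
    start = None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i  # rising edge: trigger fires
        elif not flag and start is not None:
            # falling edge: integrate this peak, then re-arm the trigger
            areas.append(np.trapz(signal[start:i] - threshold, dx=dt))
            start = None
    if start is not None:  # peak runs off the end of the record
        areas.append(np.trapz(signal[start:] - threshold, dx=dt))
    return areas

# Two synthetic peaks in a flat baseline:
sig = np.array([0, 0, 1, 3, 5, 3, 1, 0, 0, 2, 4, 2, 0], dtype=float)
print(peak_areas(sig, threshold=0.5))  # two areas, one per triggered peak
```

The picky part, as in the real VI, is choosing the threshold: set it too low and noise fires the trigger; too high and small particles never register.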
Tuesday: Today I ran one of the versions of my LabVIEW program on all of the sets of data that have been recorded, saving the results so they can be analyzed further. I have about 23 different sets of raw data. I figured that although the program isn’t perfect, it is better to have some analyzed data saved and formatted nicely so that I can go back and retrieve it if something goes wrong later on. I also organized all of the files on my computer and flash drive and made sure that every file exists on each of the computers I am working between. One catch: when I transfer data from one computer to another, some of the filenames don’t carry over, so I have to hunt them down and change them manually or else no data comes through.
Wednesday: Today I found and added yet another function to the LabVIEW program. This function takes a subset of the total data (which is over a million data points) and analyzes that portion separately. This way, we won’t lose as much data when exporting to Excel, and it is much easier on the computer’s memory. The program now exports two files to Excel: one for the overall data and one for the subset. Adding the subset function and wiring it through all of the other operations (such as peak detection) slowed the program down a lot; it took about 30 minutes to analyze about two minutes’ worth of data. I also created histograms for all of the previous 23 runs. Lastly, I met with my principal investigator, Joe Needoba, as he will be away from CMOP for the next week.
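The subset-then-analyze step boils down to slicing a window out of the full record before doing the heavy work. Here is a rough Python sketch of that idea (the function name, window values, and bin count are my own inventions, not part of the VI); the histogram call stands in for the per-run histograms. One motivation for subsetting is that older Excel worksheets (.xls) cap out at 65,536 rows, so a million-point record cannot be exported whole.

```python
import numpy as np

def analyze_subset(record, start, length, n_bins=50):
    """Slice a window out of the full record and histogram it, so both the
    analysis and the Excel export work on a manageable amount of data."""
    subset = record[start:start + length]
    counts, bin_edges = np.histogram(subset, bins=n_bins)
    return subset, counts, bin_edges

# A stand-in for a million-point acquisition:
record = np.linspace(0.0, 1.0, 1_000_000)
subset, counts, edges = analyze_subset(record, start=1_000, length=500)
```

In LabVIEW terms this is just an Array Subset feeding the same downstream analysis chain, which also explains the slowdown: the subset path duplicates every operation the full record already goes through.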
Thursday: Today I looked through a set of notes on LabVIEW that Rachel Golda gave me to see if there was anything else I could add to my program to make it more useful and efficient at analyzing the analog input signal. I also ran another sample taken from the Beaver Army Terminal through the uFCM. It had to be passed through a 100-micron filter first so that it wouldn’t clog the flow cell when run through the uFCM.
Friday: Today all of the interns took a scheduled trip to the Bonneville Dam, where we received a tour of the fish ladders and one of the powerhouses. We also went to the Bonneville Fish Hatchery, the Vista House (a viewpoint overlooking the Columbia River), and Multnomah Falls. It was a nice way to end the work week.