Camille Cobb ’12 is in the November edition of Computing Research News. Check out her picture on the back page of her presenting her research poster with co-author Katie Baldwin from the University of Delaware at the Grace Hopper Celebration of Women in Computing.
Emily Gibson Hill from the University of Delaware will give a talk on applying natural-language analysis to understanding large software systems.
Developing Natural Language-based Software Analyses & Tools to Expedite Software Maintenance
Friday, December 4, 11:15 a.m.
Pizza lunch to follow
Abstract: Today’s software is large and complex, with systems consisting of millions of lines of code. Developers new to a software project face significant challenges in locating the code relevant to their maintenance tasks, such as fixing bugs or adding new features. A developer can simply be handed a bug and told to fix it, even with no idea where to begin.
We can significantly reduce the cost of software maintenance by reducing the time and effort to find and understand code. In this talk, I will outline the challenges in finding and understanding code in a large software project as well as present some software engineering tools that can help. Specifically, I will present techniques that leverage the textual information from comments and identifiers as well as program structure to create intuitive software engineering and program comprehension tools.
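The talk centers on mining natural-language clues from comments and identifiers. As a minimal illustration of that idea (not Hill's actual tooling, just a common first step in such analyses), a hypothetical helper might split a source-code identifier into its constituent words so they can be matched against a maintenance query:

```python
import re

def split_identifier(name):
    """Split a source-code identifier into lowercase words,
    handling camelCase, PascalCase, and snake_case."""
    parts = name.replace("_", " ")
    # Break between a lowercase letter/digit and a following capital
    parts = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", " ", parts)
    # Break inside acronym runs: "HTTPResponse" -> "HTTP Response"
    parts = re.sub(r"(?<=[A-Z])(?=[A-Z][a-z])", " ", parts)
    return parts.lower().split()

print(split_identifier("getHTTPResponseCode"))
# -> ['get', 'http', 'response', 'code']
```

Words recovered this way can then be fed to standard information retrieval machinery to rank code locations against a developer's search terms.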
Bio: Emily Hill is a PhD candidate in Computer and Information Sciences at the University of Delaware. While an undergraduate at a liberal arts college, Emily researched information retrieval systems, which influenced her thesis topic. Her interdisciplinary thesis focuses on developing natural language processing and information retrieval techniques to improve software engineering tools. Emily spends much of her research time analyzing the natural language clues developers leave behind in identifiers and comments. Outside of research, Emily enjoys singing opera, fantasy football, and reading.
The U.S. House of Representatives passed a resolution to raise the profile of computer science as a transforming industry that drives technology innovation and bolsters economic productivity. The resolution, H. RES. 558, designates the week of December 7 as “National Computer Science Education Week” in honor of Grace Murray Hopper, one of the outstanding pioneers in the field of computer science, who was born on December 9, 1906.
Camille Cobb ’12 presented a research poster at the Grace Hopper Celebration of Women in Computing. The poster entitled “Exploring Data Models for Automatically Generating Tests for Web Applications” is co-authored with Carrie Hopkins ’12 and Professor Sara Sprenkle as well as Katie Baldwin ’10 and Professor Lori Pollock from the University of Delaware.
Many people came to talk to Camille and Katie about their poster, including alumna Anne Van Devender ’09 and Amazon CTO Werner Vogels, who especially encouraged them to continue their research.
Career Services is hosting a panel discussion focusing on the sciences. Alumni speakers will represent a variety of industries on each panel. If you have considered majors and/or careers in these areas, come hear how the choice of major at W&L has played into these individuals’ career decision-making and planning. Take advantage of this unique opportunity to hear from and interact with multiple alumni in your field of interest!
When: Friday, October 7, 5 p.m.
Where: Science Center G-14
There will be an informal reception immediately following the panel in the Great Hall of the Science Center.
Chris Diebold ‘09
Chris graduated from W&L with a B.S. in chemistry in June 2009.
Dave Passavant ‘99
Dave graduated from W&L in ’99 with a double major in Computer Science and Business Administration. Dave has recently started a position as Director of Business Design at the University of Pittsburgh Medical Center.
Abby Perdue ‘04
Abby graduated from Washington and Lee University, where she majored in biology and English, and from the University of Virginia School of Law.
Virginia Behr ‘97
Ombudsman, Center for Drug Evaluation and Research, Food and Drug Administration U.S. Department of Health and Human Services
Elizabeth DeStefano ‘99
Elizabeth has a bachelor’s degree in psychology and biology from Washington & Lee University and a master’s degree in public health communications and marketing from George Washington University. Elizabeth has been with the Washington Regional Transplant Community (WRTC) since the beginning of 2002.
Three computer science students presented at the Summer Research Showcase held in Leyburn Library on Monday, October 5.
Will Richardson ’11 presented his work “NuclearEd: A Powerful Resource for Educators.” Will collaborated on the project with Chemistry professor Frank Settle and Computer Science professor Tom Whaley.
Camille Cobb ’12 and Carrie Hopkins ’12 presented their work on “Exploring Data Models for Automatically Generating Tests for Web Applications.” Camille and Carrie collaborated with Computer Science professor Sara Sprenkle as well as Katie Baldwin ’10 and Professor Lori Pollock from the University of Delaware.
Congratulations CS class of 2009! Way to go and good luck.
A Parallel Algorithm for Derivation of Regression Coefficients on the Graphics Processing Unit
Pasko Shterev Paskov
Regression analysis is one of the most common methods of statistical inference, with roots in scientific research across all fields for more than two centuries. It is widely used because it provides an intuitive way to establish a relationship between observations of different variables, and therefore empirical evidence for a hypothetical connection, or dependence, between them. Regression is an invaluable tool for research and commerce alike, and has understandably received much attention from software companies over the past two decades as they realized the immense potential of computers to improve and facilitate the use of the method. Although the contribution of such software should not be understated, the massive amounts of information that have become available with the rise of the digital age have made it increasingly time consuming, and at times nearly impossible, for machines to derive the estimated regression coefficients. The problem is computationally intensive, and improving the efficiency of the algorithm is crucial to time-sensitive applications of regression. The graphics cards introduced in the past two years have found wide recognition as an accessible alternative to parallel computer clusters for many applications, and the architecture and parallel capabilities of the GPU hold great potential for speeding up regression calculations. This thesis introduces a new parallel regression algorithm in CUDA for the GPU, and demonstrates that it is between four times faster on smaller datasets and six hundred times faster on larger ones, depending also on the GPU architecture.
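The coefficient derivation the thesis parallelizes is, in its textbook sequential form, the ordinary least-squares solution of the normal equations. A minimal NumPy sketch (a reference point only, not the thesis's CUDA implementation):

```python
import numpy as np

def ols_coefficients(X, y):
    """Estimate coefficients beta for the model y ~ X @ beta
    by solving the normal equations (X^T X) beta = X^T y."""
    XtX = X.T @ X
    Xty = X.T @ y
    return np.linalg.solve(XtX, Xty)

# Tiny example: data generated from y = 2 + 3x, with an intercept column
x = np.array([0.0, 1.0, 2.0, 3.0])
X = np.column_stack([np.ones_like(x), x])
y = 2.0 + 3.0 * x
beta = ols_coefficients(X, y)
print(beta)  # approximately [2.0, 3.0]
```

The matrix products X^T X and X^T y dominate the cost for large datasets, and those products decompose into many independent multiply-accumulate operations, which is what makes a GPU formulation attractive.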
A Parallel Algorithm for Fast Edge Detection on the Graphics Processing Unit
Often, it is a race against time to make a proper diagnosis of a disease. In areas of the world where qualified medical personnel are scarce, work is being done on the automated diagnosis of illnesses. Automated diagnosis involves several stages of image processing on lab samples in search of abnormalities that may indicate the presence of diseases such as tuberculosis. These image-processing tasks are good candidates for parallelization, which would significantly speed up the process. However, a traditional parallel computer is not an accessible piece of hardware for many. The graphics processing unit (GPU) has evolved into a highly parallel component that can now be used by developers for non-graphical computations.
This paper demonstrates the parallel computing power of the GPU in the area of medical image processing. We present a new algorithm for performing edge detection on images using NVIDIA’s CUDA programming model, which allows the GPU to be programmed in C. We evaluated our algorithm on a number of sample images and compared it to two other implementations: one sequential and one parallel. This new algorithm produces impressive speedup in the edge detection process.
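The paper's own implementation is in CUDA; as a point of reference for what edge detection computes, here is a sketch of a classic Sobel gradient-magnitude detector in sequential Python (an illustrative stand-in, not the paper's algorithm). Because every output pixel depends only on its own 3x3 neighborhood, each pixel can be computed independently, which is exactly the structure a GPU exploits:

```python
import numpy as np

def sobel_edges(img):
    """Gradient magnitude of a 2-D grayscale image using the
    3x3 Sobel operators (borders are left at zero)."""
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)  # horizontal gradient
    ky = kx.T                                 # vertical gradient
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(1, h - 1):        # each iteration is independent:
        for j in range(1, w - 1):    # one GPU thread per pixel
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx = np.sum(kx * patch)
            gy = np.sum(ky * patch)
            out[i, j] = np.hypot(gx, gy)
    return out

# A vertical step edge produces strong responses along the boundary
img = np.zeros((5, 5))
img[:, 3:] = 1.0
edges = sobel_edges(img)
```

In a CUDA version, the two nested loops disappear: each thread computes one `out[i, j]`, and speedup comes from running thousands of such threads concurrently.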