UNL News Blog

Archive for July, 2012

UNL political science experts on the 2012 election

Monday, July 30th, 2012

As we enter the final months before the 2012 election, several University of Nebraska-Lincoln political science professors can discuss the presidential, U.S. Senate and other campaigns.

John Hibbing, Foundation Regents University Professor of Political Science

American politics, U.S. Senate race, Congress

Hibbing is a nationally known expert in political psychology, biology and politics, political behavior, public opinion and legislative politics. For reporters, he can provide insight into this year’s national and statewide campaigns, including the races for U.S. Senate in Nebraska and the presidential campaign, and can provide reaction and analysis on campaign-trail developments.

Reach John Hibbing at 402-472-3220 or

* * *

Kevin Smith, professor of political science

American politics, U.S. Senate race, Presidential race, political messaging

Smith focuses on public policy, public administration, American politics, and biology and politics. He can discuss the dynamics of this year’s U.S. Senate race and other major races, including the presidential campaign. He can analyze broad aspects of these campaigns, including the effectiveness or lack thereof of political advertising. He also can discuss differences between liberals, conservatives and moderates in the context of the 2012 election, and how developments on the campaign trail may be interpreted by these different groups of voters.

Reach Kevin Smith at 402-472-0779 or

* * *

Dona-Gene Mitchell, assistant professor of political science

Public opinion and effects of campaign information or scandal over time

Mitchell’s expertise is in American political behavior, public opinion and political psychology. She researches and teaches on how opinions are formed through information and campaigns over time, and on the lifespan of information effects. She can discuss the effectiveness of campaign messaging over time, or how long unfavorable information may continue to affect politicians and elected officials.

Reach Dona-Gene Mitchell at 402-472-5994 or

* * *

Elizabeth Theiss-Morse, Willa Cather Professor and Chair of Political Science

Public opinion, political behavior, political psychology

Theiss-Morse’s research examines Americans’ attitudes about various aspects of the American political system and about their fellow Americans. She is currently working on a project on politicians’ use of heated rhetoric and how this affects democracy.

Reach Elizabeth Theiss-Morse at 402-472-3221 or

Expert alert: Sarbanes-Oxley and whistleblowing, 10 years later

Thursday, July 26th, 2012

Ten years ago this Sunday, the Sarbanes-Oxley Act of 2002 set new standards for all U.S. public company boards, management and public accounting firms in reaction to a number of major corporate and accounting scandals.

The act’s provisions range from additional corporate board responsibilities to criminal penalties, and it requires the Securities and Exchange Commission to issue rules defining how companies must comply with the law. It created a new agency, the Public Company Accounting Oversight Board, to oversee, regulate, inspect and discipline accounting firms in their roles as auditors of public companies. The act also covers issues such as auditor independence, corporate governance, internal control assessment and enhanced financial disclosure.

Debate continues over the perceived benefits and costs of the Act, including on the subject of whistleblowing.

Richard Moberly of the University of Nebraska-Lincoln College of Law, a national expert on whistleblower law, has closely examined and critiqued the Act’s triumphs and failings in that area in a recent paper for the South Carolina Law Review. In it, he argues that the Act didn’t sufficiently protect whistleblowers who suffered retaliation and that despite massive new protections, whistleblowers did not play a significant role in uncovering the 2008 financial crisis. This suggests, he says, that though whistleblowers enjoyed stronger protection than ever before, they had less reason to believe such protection works.

Moberly writes:

Sarbanes-Oxley initiated a decade of impressive growth in the development of formal whistleblower provisions, such as including whistleblower protection in significant federal legislation and mandating the widespread use of codes of ethics and whistleblower hotlines. Despite these successes, Sarbanes-Oxley’s failures may teach the Act’s most significant lesson: that the anti-retaliation and structural whistleblower models, while necessary, do not sufficiently protect and encourage whistleblowers.

The Act failed to protect victims of retaliation adequately and it did not prevent or remedy the underlying misconduct disclosed by whistleblowers. The experience with Sarbanes-Oxley over the last decade teaches that individual players in the system, such as organizational supervisors, government administrators, and adjudicatory decision makers, impact whistleblowers as much as, if not more than, any formal legal provisions, and can undermine the protections they appear to provide.

Moreover, the failure of whistleblowers to prevent the recent financial crisis exposed the limitations of the antiretaliation and structural models.

Perhaps we ought to spend as much effort determining who is involved in whistleblower protection as we do deciding what those protections should formally entail. If new leadership at OSHA and on the ARB can change the approach of those institutions to whistleblower protection, then it would confirm that choosing the right people to lead whistleblower protection efforts could be as important as having the right whistleblower provisions.

Sarbanes-Oxley made possible the evolutionary leap from the Act’s antiretaliation protection and structural encouragement to Dodd-Frank’s bounty payments. If Dodd-Frank permits more effective whistleblowing by addressing the underlying wrongdoing, then its bounty model may come to be seen as an essential part of a comprehensive legislative approach to supplement the conventional use of statutory antiretaliation protection and whistleblower hotlines.

If these changes make a difference in the future, then Sarbanes-Oxley’s failings could demonstrate that policy makers should think more broadly than simply protecting whistleblowers from retaliation and providing a structural disclosure channel and code of ethics. A decade from now, we may look back on Sarbanes-Oxley’s whistleblower provisions with more generous eyes. Rather than focus on their failings, we may view them as important first steps toward a more comprehensive whistleblower strategy.

To contact Prof. Moberly, call (402) 472-1256 or email him at

Latest drought map: Widespread intensification over central U.S.

Thursday, July 26th, 2012

From the National Drought Mitigation Center at UNL:

The July 24 U.S. Drought Monitor showed widespread intensification of drought through the middle of the country, according to the National Drought Mitigation Center at the University of Nebraska-Lincoln. For the fourth straight week, the map also set a record for the area in moderate drought or worse in the 12-year history of the U.S. Drought Monitor.

The July 24 map put 53.44 percent of the United States, including Alaska, Hawaii and Puerto Rico, in moderate drought or worse, up from 53.17 percent the week before; 38.11 percent in severe drought or worse, compared with 35.32 percent a week earlier; 17.20 percent in extreme drought or worse, compared with 11.32 percent the week before; and 1.99 percent in exceptional drought, up from 0.83 percent the preceding week.

For just the contiguous United States, the map put 63.86 percent in moderate drought or worse, up from 63.54 percent a week ago; 45.57 percent in severe drought or worse, up from 42.23 percent a week ago; 20.57 percent in extreme drought or worse, up from 13.53 percent a week ago; and 2.38 percent in exceptional drought, up from 0.99 percent a week ago.

“We’ve seen tremendous intensification of drought through Illinois, Iowa, Missouri, Indiana, Arkansas, Kansas and Nebraska, and into parts of Wyoming and South Dakota in the last week,” said Brian Fuchs, UNL climatologist and U.S. Drought Monitor author. “The amount of D3 (extreme drought) developing in the country has increased quite a bit for each of the last several weeks.”

Fuchs also noted that as of the July 24 U.S. Drought Monitor, every state in the country had at least a small area shown as abnormally dry or worse. “It’s such a broad footprint,” he said.

“This drought is two-pronged,” Fuchs said. “Not only the dryness but the heat is playing a big and important role. Even areas that have picked up rain are still suffering because of the heat.”

The forecast for most of the drought-affected area is for drought to continue to develop and intensify. “Conditions are likely to persist,” Fuchs said. “We’ll see further development and intensification into the fall.” Fuchs based his assessment on the Seasonal Drought Outlook released July 19.

The U.S. Drought Monitor map is jointly produced by the National Drought Mitigation Center at the University of Nebraska-Lincoln, the National Oceanic and Atmospheric Administration, the U.S. Department of Agriculture, and about 350 drought observers across the country. It is released each Thursday based on data through the previous Tuesday.

Drought Monitor authors synthesize many drought indicators into a single map that identifies areas of the country that are abnormally dry (D0), in moderate drought (D1), in severe drought (D2), extreme drought (D3) and exceptional drought (D4).

Statistics for the percent area in each category of drought are automatically added to the U.S. Drought Monitor website each week for the entire country and Puerto Rico, for the 48 contiguous states, for each climate region, and for individual states.

The National Climatic Data Center maintains drought data based on the Palmer Drought Severity Index, calculated to the beginning of the historic record.

Contact: Brian Fuchs, NDMC at UNL, (402) 472-6775,

How our brains see men as people and women as body parts

Wednesday, July 25th, 2012

When casting our eyes upon an object, our brains either perceive it in its entirety or as a collection of its parts. Consider, for instance, photo mosaics consisting of hundreds of tiny pictures that, when arranged a certain way, form a larger overall image. It takes two separate mental functions to see the mosaic from both perspectives.

A new study suggests that these two distinct cognitive processes also are in play with our basic physical perceptions of men and women – and, importantly, provides clues as to why women are often the targets of sexual objectification.

The research, published in the European Journal of Social Psychology, found in a series of experiments that participants processed images of men and women in very different ways. When presented with images of men, perceivers tended to rely more on “global” cognitive processing, the mental method in which a person is perceived as a whole. Meanwhile, images of women were more often the subject of “local” cognitive processing, or the objectifying perception of something as an assemblage of its various parts.

The study is the first to link such cognitive processes to objectification theory, said Sarah Gervais, assistant professor of psychology at the University of Nebraska-Lincoln and the study’s lead author.

“Local processing underlies the way we think about objects: houses, cars and so on. But global processing should prevent us from that when it comes to people,” Gervais said. “We don’t break people down to their parts – except when it comes to women, which is really striking. Women were perceived in the same ways that objects are viewed.”

In the study, participants were randomly presented with dozens of images of fully clothed, average-looking men and women. Each person was shown from head to knee, standing, with eyes focused on the camera.

After a brief pause, participants then saw two new images on their screens: One was the unmodified original image, while the other was a slightly modified version of the original comprising a sexual body part. Participants then quickly indicated which of the two images they had previously seen.

The results were consistent: Women’s sexual body parts were more easily recognized when presented in isolation than when they were presented in the context of their entire bodies. But men’s sexual body parts were recognized better when presented in the context of their entire bodies than they were in isolation.

“We always hear that women are reduced to their sexual body parts; you hear about examples in the media all the time. This research takes it a step further and finds that this perception spills over to everyday women, too,” Gervais said. “The subjects in the study’s images were everyday, ordinary men and women … the fact that people are looking at ordinary men and women and remembering women’s body parts better than their entire bodies was very interesting.”

Also notable is that the gender of participants doing the observing had no effect on the outcome. The participant pool was evenly divided between men and women, who processed each gender’s bodies similarly: Regardless of their gender, perceivers saw men more “globally” and women more “locally.”

“We can’t just pin this on the men. Women are perceiving women this way, too,” Gervais said. “It could be related to different motives. Men might be doing it because they’re interested in potential mates, while women may do it as more of a comparison with themselves. But what we do know is that they’re both doing it.”

Is there an antidote to the basic cognitive processes that lead women to be reduced and objectified? Researchers said some of the study’s results suggest so. When the experiment was adjusted to create a condition in which it was easier for participants to employ “global” processing, the sexual body part recognition bias appeared to be alleviated: Women became more easily recognizable in the context of their whole bodies rather than by their various sexual body parts.

Because the research presents the first direct evidence of the basic “global” vs. “local” framework, the authors said it could provide a theoretical path forward for more specific objectification work.

“Our findings suggest people fundamentally process women and men differently, but we are also showing that a very simple manipulation counteracts this effect, and perceivers can be prompted to see women globally, just as they do men,” Gervais said. “Based on these findings, there are several new avenues to explore.”

Contact: Sarah Gervais, assistant professor of psychology, (402) 472-3793 or

Coverage: LiveScience | Yahoo! News | NBC News | Huffington Post | MSN | Medical Daily | Scientific American | ZMEScience | The Atlantic | Forbes | Daily Mail (UK) | Chicago Tribune | CBC News | DigitalJournal | Globe and Mail (Canada) | Jezebel | USA TODAY | io9 | Examiner | BBC Mundo | XOJane | United Press International

Feces fossils lend new insights into link between Natives, diabetes

Tuesday, July 24th, 2012

Antelope Cave

Why do Native Americans experience high rates of diabetes? A common theory is that they possess fat-hoarding “thrifty genes” left over from their ancestors – genes that were required for survival during ancient cycles of feast and famine, but that now contribute to the disease in a modern world of more fatty and sugary diets.

A newly published analysis of fossilized feces from the American Southwest, however, suggests this “thrifty gene” may not have developed because of how often ancient Natives ate. Instead, researchers said, the connection may have come from precisely what they ate.

The research, which appears in the latest edition of the journal Current Anthropology, suggests that the prehistoric hunter-gatherer civilizations of the Southwest lived on a diet very high in fiber, very low in fat and dominated by foods extremely low on the glycemic index, a measure of the effect foods have on blood sugar levels. This diet, researchers said, could have been sufficient to give rise to the fat-storing “thrifty genes.”

“What we’re saying is we don’t really need to look to feast or famine as a basis for (the genes),” said Karl Reinhard, professor of forensic sciences at the University of Nebraska-Lincoln’s School of Natural Resources and the study’s lead author. “The feast-or-famine scenario long hypothesized to be the pressure for ‘thrifty genes’ isn’t necessary, given the dietary evidence we’ve found.”

Natives have some of the highest rates of Type 2 diabetes of any group and are more than twice as likely as Caucasians to develop the disease. The notion that the gene’s origin goes back to feast-and-famine cycles among prehistoric hunter-gatherer ancestors has been discussed for nearly a half-century.

To fully understand the basis of the high rates, Reinhard said, “one has to look at the best dietary data one can find. That comes from coprolites (the official term for fossilized feces). By looking at coprolites, we’re seeing exactly what people ate.”

The coprolites are from Antelope Cave, a deep cavern in northern Arizona that, over several thousand years, was home to various cultures. That includes the Ancestral Puebloan peoples, who are believed to have lived there seasonally for at least 450 years.

Reinhard and Keith Johnson, an archeologist at California State University, Chico, studied 20 coprolites found in the cave and combined their findings with analyses from other sites for hints of ancient Natives’ diets. They found clues to a food regimen dominated by maize and high-fiber seeds from sunflowers, wild grasses, pigweed and amaranth.

Prickly pear, a desert succulent, was also found repeatedly in the samples. By volume, about three-quarters of the Antelope Cave coprolites were made up of insoluble fiber. The foods also were low on the glycemic index; some research suggests that high-GI foods may increase risk of obesity and diabetes.

The analysts’ findings led them to deduce that the nature of the feast, and not necessarily its frequency, was enough to lock the “thrifty” genes in place – and leave modern Natives more susceptible to diabetes as their diets evolved to lower-fiber, higher-GI foods.

“These were not just famine foods,” the authors wrote. “These were the foods eaten on a day-by-day basis during all seasons in both feast and famine. They continued to be eaten even after agriculture was developed. Antelope Cave coprolites show that this high-fiber diet was eaten during the warmer seasons of food abundance.”

In addition to UNL’s Reinhard and California State Chico’s Johnson, the study was authored by Isabel Teixeira-Santos and Monica Viera of the Escola Nacional de Saude Publica in Rio de Janeiro, Brazil.

Contact: Karl Reinhard, professor of forensic sciences, (402) 875-2863,

Contributing: Kevin Stacey, University of Chicago Press

Coverage: International Business Times | Science Daily | Examiner | LiveScience | Discovery News | Yahoo! News | MSNBC | Huffington Post

Study: IPO firms can shatter ‘growth ceilings’ by investing in people, knowledge

Tuesday, July 17th, 2012

When a company goes public, it’s a sign that the firm has hit its stride. By choosing to be publicly traded, it’s signaling that it’s at the forefront of innovation, that it will use its IPO proceeds to make the next big leap forward, continue growing and make its new shareholders money.


Actually, according to new research, many young companies hit what researchers have identified as an “entrepreneurial growth ceiling” shortly before they go public. The ceiling is the point at which a growing firm has accumulated a number of growth-related problems and needs cash from an IPO to continue moving forward. Smashing against that growth ceiling is often a big reason companies decide to go public in the first place, the study says.

The newly published research, which analyzed more than 360 new venture firms over time, found that the cash resulting from a company’s IPO can help it break through an “entrepreneurial growth ceiling,” but only if those resources are used in specific ways that tackle many issues at once, said lead author Theresa Welbourne, University of Nebraska-Lincoln professor of management and director of the Center for Entrepreneurship at UNL’s College of Business Administration.

Breaking through within a year of going public is critical for a firm’s long-term success, the authors wrote.

“Young firms are resource-starved, and an influx of large amounts of cash, as in the case of an IPO, can be a dream turned nightmare for many new ventures because there is little strategic direction of how to spend the cash,” Welbourne and her co-authors wrote. “Without a short-term strategic direction for allocating resources immediately following an IPO, can a long-term competitive advantage ever be achieved?”

Welbourne and her co-authors analyzed the prospectuses of the companies, examined how each one’s IPO proceeds were to be spent and then factored in the companies’ post-IPO performance data.

Their findings? Firms that assigned their IPO proceeds to human resources and innovation – specifically, research and development – broke through their “entrepreneurial growth ceilings” much faster than those that chose to invest in other areas, such as sales and marketing or physical plant and equipment. Those firms’ stocks also performed better over the long term than those of other new ventures that invested IPO proceeds elsewhere, the study showed.

Why? The authors said the most strategic and valuable resources that can be gained from IPO proceeds are those that can be used to solve multiple problems at once, problems that become more critical as a company continues to grow.

Issues related to human resources involve managerial shortcomings, employee-related issues and the need to recruit and hire more employees to handle company expansion, according to the study.

By pushing IPO proceeds toward human resources, a company can address problems relating to management capacity, training and development, organization structure, knowledge capacities, compensation and motivation, the authors wrote.

Meanwhile, investing quickly in research and development can support the firm’s need to expand its product line, develop next-generation products and improve production processes – which ultimately affect sales, marketing, production and retention.

Conventional wisdom and actual practice seem to suggest the opposite of the researchers’ findings, the study says, as layoffs and staff reductions along with decreases in R&D spending made headlines over the last several years.

“In efforts to increase productivity and produce positive earnings to shine for investors and the stock market, perhaps, management is overlooking the new fundamentals of business – building people and knowledge,” the authors wrote.

The study appears in the journal Management Decision. In addition to UNL’s Welbourne, it was authored by Heidi Neck of Babson College and G. Dale Meyer of the University of Colorado Boulder.

Officials: Record amounts of U.S. now experiencing drought

Thursday, July 5th, 2012

More of the United States is in moderate drought or worse than at any other time in the 12-year history of the U.S. Drought Monitor, officials from the National Drought Mitigation Center at the University of Nebraska-Lincoln said today.

Analysis of the latest drought monitor data revealed that 46.84 percent of the nation’s land area is in moderate drought or worse, up from 42.8 percent a week ago. The previous records were 45.87 percent on Aug. 26, 2003, and 45.64 percent on Sept. 10, 2002.

Looking only at the 48 contiguous states, 55.96 percent of the country’s land area is in moderate drought or worse – also the highest percentage on record, officials said. The previous highs had been 54.79 percent on Aug. 26, 2003, and 54.63 percent on Sept. 10, 2002.

“The recent heat and dryness is catching up with us on a national scale,” said Michael J. Hayes, director of the National Drought Mitigation Center at UNL. “Now, we have a larger section of the country in these lesser categories of drought than we’ve previously experienced in the history of the Drought Monitor.”

The monitor uses a ranking system that begins at D0 (abnormal dryness) and moves through D1 (moderate drought), D2 (severe drought), D3 (extreme drought) and D4 (exceptional drought).

Moderate drought’s telltale signs are some damage to crops and pastures, with streams, reservoirs or wells getting low. At the other end of the scale, exceptional drought includes widespread crop and pasture losses, as well as shortages of water in reservoirs, streams and wells, creating water emergencies. So far, just 8.64 percent of the country is in either extreme or exceptional drought.

“During 2002 and 2003, there were several very significant droughts taking place that had a much greater areal coverage of the more severe and extreme drought categories,” Hayes said. “Right now we are seeing pockets of more severe drought, but it is spread out over different parts of the country.

“It’s early in the season, though. The potential development is something we will be watching.”

The U.S. Drought Monitor is a joint endeavor by the National Drought Mitigation Center at UNL, the National Oceanic and Atmospheric Administration, the U.S. Department of Agriculture and drought observers across the country.

To examine the monitor’s current and archived national, regional and state-by-state drought maps and conditions, go to

Contact: Michael Hayes, director of the National Drought Mitigation Center at UNL, (402) 472-4271 or

UNL physicist Bloom: ‘A great thrill’

Wednesday, July 4th, 2012

Ken Bloom, associate professor of physics and astronomy at UNL, is in Melbourne for the International Conference on High Energy Physics, where Wednesday’s announcement about the discovery of a new particle is being presented in more detail.

Bloom is a member of UNL’s experimental high-energy physics team, which has been collaborating in the hunt for the Higgs boson particle since the early 1990s. Read more about UNL’s involvement in the Higgs boson project here.

Prof. Bloom sends us his thoughts this morning:

To be here at the biggest gathering of particle physicists this year was a great thrill. We had hundreds of physicists watching the live broadcast from CERN on what we knew would likely be an historic day for our field. And our expectations paid off — two completely independent experiments, CMS and ATLAS, had come up with essentially identical results.

We can now say quite firmly that we have discovered a new particle, and, while there is still a lot of work to do to verify this claim, it seems like this very well could be the Higgs boson that we have been anticipating for a half century.

It is especially gratifying to see how important the contributions from our Nebraska team were in this discovery. Our Tier-2 computing center was where many of the simulations of the Higgs-search data samples were carried out, and some of the Higgs candidate events are on our disks in Lincoln.

We have to thank UNL’s leaders, especially Vice Chancellor Prem Paul, for their help in winning this center for Nebraska. The silicon pixel detector that we have helped to construct and operate has been the workhorse of the CMS particle detector, crucial for identifying the particles that are produced when Higgs bosons decay. And our postdocs and students have made fabulous contributions to make the CMS experiment really work. While watching the presentations, I was happy to see how many young people there were in the CERN auditorium; it is their energy and talent that have made this experiment a success.

But as noted, this is just the start — there is still a tremendous amount of work to do to understand exactly what this new particle is. Is it really the source of the mass of all particles? Does it actually have all the properties that we expect it to have? We can’t wait to move into this next phase, and we are looking forward to sharing what we learn with our friends throughout Nebraska.

Prof. Bloom also has been live-blogging the news for the blog Quantum Diaries today. Check out his entries here.

Expert alert: The “God Particle” announcement

Tuesday, July 3rd, 2012

Physicists from the University of Nebraska-Lincoln who have collaborated in the hunt for the Higgs boson particle will be available to comment on the results set to be announced at CERN laboratory in Switzerland early July 4.

The Higgs boson is the central part of a hypothesis that could illuminate the very fabric of the universe and influence the understanding of all matter. It could explain why some particles have mass and others do not.

CERN’s Large Hadron Collider has been generating high-energy collisions of protons in its search for the final piece of the theoretical framework, which is known as the Standard Model of particles and forces. Without the Higgs boson, however, the Standard Model can’t explain how most of these particles acquire their mass, a key ingredient in the formation of the universe.

For the past several years, nearly 1,700 scientists from around the world have been narrowing the search for the Higgs boson with the help of the LHC. About 15 UNL researchers and graduate students have been involved in this and other Higgs-hunting projects.

Starting at 2 a.m. CDT July 4, scientists will give statements at CERN about the latest development. After those conclude, UNL physicists will be available to comment:

– Gregory Snow, professor of physics and astronomy and associate dean for research in UNL’s College of Arts and Sciences. A founding member of UNL’s experimental high-energy physics group, Snow has aided in the search for the Higgs boson since the early 1990s. He has collaborated on UNL’s development of Large Hadron Collider detectors used in the Higgs search. Snow, who is currently in Lincoln, can be reached via email at or by request on July 4. Contact Kelly Bartling, manager of news, (402) 366-4271 or Steve Smith, national news editor, at (402) 217-2774 after 7 a.m. CDT on July 4. Here’s a video featuring Snow from August 2011 about UNL’s role in the experiment.

– Aaron Dominguez, associate professor of physics and astronomy. Dominguez is a member of UNL’s experimental high-energy physics team, which played an important role in building the LHC detectors and analyzing the data that comes from the experiments. He can be reached at or via telephone by request after 7 a.m. CDT on July 4. Here’s a 2011 video of Dominguez discussing his work with the CMS experiment and the hunt for the Higgs boson. Dominguez is at CERN with fellow team member Ilya Kravchenko of UNL.

– Ken Bloom, associate professor of physics and astronomy. Bloom is at the International Conference on High Energy Physics in Melbourne, Australia, where CERN researchers also plan to share their latest findings. Follow his live-blogging of the ICHEP seminar on the Higgs boson developments at the physics blog Quantum Diaries at

– Dan Claes, chairman of the department of physics and astronomy. Claes, a member of the high-energy physics team, has collaborated in research and data analysis in the search for the Higgs boson both in the U.S. and at CERN. He is in Lincoln on July 4 and can be reached by phone via request or via email at

More on UNL’s experimental high-energy physics team can be found here.

Updated 7:17 a.m.: Here is UNL’s official news release regarding the university’s involvement in the search for the Higgs boson particle.