Archive for the Breaking News Category

Feb 29 2016

Fish and Pregnancy: Mercury Exposure Outweighed by Beneficial Effects

— Posted with permission of SEAFOODNEWS.COM. Please do not republish without their permission. —

Copyright © 2016 Seafoodnews.com

Seafood News


 

[Note: Although this article is concerned only with measurement, not causes, its findings support the hypothesis that selenium, which is also found in high concentrations in seafood, acts to prevent mercury from having toxic effects at low levels. The hypothesis is that selenium binds with mercury, rendering the mercury more inert in biological processes. This idea has been put forward to explain longitudinal studies, such as those in the Seychelles, in which populations with extremely high levels of fish consumption, and therefore higher exposure to mercury, show no mercury toxicity effects. -JS]

Among its myriad health benefits, fish contains nutrients that are important for developing fetuses, which is why pregnant women are advised to eat two or three servings of fish each week. However, concerns over the detrimental effects of mercury – found in nearly all fish – have given pregnant women reason to be cautious. Now, a new study suggests the negative effects of ingesting low levels of mercury through fish are outweighed by the beneficial effects of fish for newborns.

The study, led by Kim Yolton, PhD, from the Cincinnati Children’s Hospital Medical Center in Ohio, is published in the journal Neurotoxicology and Teratology.

According to the researchers, previous studies examining the effect of low-level gestational mercury exposure from fish intake on neurobehavioral outcomes of newborns have been limited.

As such, they conducted an in-depth study of 344 infants, assessing them at 5 weeks of age with the NICU Network Neurobehavioral Scale (NNNS).

The researchers measured gestational mercury exposure through maternal blood and infant umbilical cord blood. They also collected information on maternal fish intake and estimated polyunsaturated fatty acid consumption based on the type and amount of fish the pregnant women ate.

In total, 84% of the mothers reported eating fish during pregnancy, but they only averaged about 2 ounces per week.

In 2014, both the Food and Drug Administration (FDA) and the Environmental Protection Agency (EPA) revised their advice to pregnant women regarding fish consumption; they advise consuming 8-12 ounces per week, as well as selecting fish with the lowest levels of mercury.

Fish with low mercury levels include salmon, shrimp, pollock, light canned tuna, tilapia, catfish and cod, whereas high-mercury fish include tilefish, shark, swordfish and mackerel.

Fish consumption counteracted neurotoxic effects of mercury

According to the World Health Organization (WHO), mercury may have toxic effects on the nervous, digestive and immune systems, and also on the lungs, kidneys, skin and eyes.

It is on the WHO’s list of top 10 chemicals that are of major public health concern.

However, results from the latest study yielded little evidence of harm in newborns whose mothers consumed low amounts of fish and who had low exposure to mercury.

Interestingly, the infants whose mothers had higher mercury exposure during pregnancy and who also consumed more fish displayed better attention and required less special handling.

The researchers say this is likely due to the positive nutritional effects of consuming fish.

Infants with higher prenatal mercury exposure did show asymmetric reflexes, but after the researchers took fish consumption into account, they found that infants whose mothers consumed more fish still displayed better attention.

Commenting on their findings, Yolton says:

“The better neurobehavioral performance observed in infants with higher mercury biomarkers should not be interpreted as a beneficial effect of mercury exposure, which is clearly neurotoxic.

It likely reflects the benefits of polyunsaturated fatty acid intake that also comes from fish and has been shown to benefit attention, memory and other areas of development in children.”

Most people do not eat recommended two to three servings per week

According to the FDA, nearly all fish contain at least traces of mercury because as they feed, they absorb it. Mercury typically builds up more in certain types of fish, particularly in larger fish with longer life spans.

Although fish confers health benefits for the general public, many people do not currently eat the recommended amount of fish, which is two to three servings per week.

“The important thing for women to remember is that fish offers excellent nutritional qualities that can benefit a developing baby or young child,” says Yolton. “Moms just need to be thoughtful about which fish they eat or provide to their child.”

She adds that in their study, mercury exposure was low – likely due to the mothers consuming fish low in mercury – “so the detrimental effects might have been outweighed by the beneficial effects of fish nutrition.”


Subscribe to SEAFOODNEWS.COM

Feb 23 2016

Seas Are Rising at Fastest Rate in Last 28 Centuries

 Juan Carlos Sanchez paddled a kayak with his shoes on a flooded street in Miami Beach last year. Credit Lynne Sladky/Associated Press



The worsening of tidal flooding in American coastal communities is largely a consequence of greenhouse gases from human activity, and the problem will grow far worse in coming decades, scientists reported Monday.

Those emissions, primarily from the burning of fossil fuels, are causing the ocean to rise at the fastest rate since at least the founding of ancient Rome, the scientists said. They added that in the absence of human emissions, the ocean surface would be rising less rapidly and might even be falling.

The increasingly routine tidal flooding is making life miserable in places like Miami Beach; Charleston, S.C.; and Norfolk, Va., even on sunny days.

Though these types of floods often produce only a foot or two of standing saltwater, they are straining life in many towns by killing lawns and trees, blocking neighborhood streets and clogging storm drains, polluting supplies of freshwater and sometimes stranding entire island communities for hours by overtopping the roads that tie them to the mainland.

Such events are just an early harbinger of the coming damage, the new research suggests.

“I think we need a new way to think about most coastal flooding,” said Benjamin H. Strauss, the primary author of one of two related studies released on Monday. “It’s not the tide. It’s not the wind. It’s us. That’s true for most of the coastal floods we now experience.”

In the second study, scientists reconstructed the level of the sea over time and confirmed that it is most likely rising faster than at any point in 28 centuries, with the rate of increase growing sharply over the past century — largely, they found, because of the warming that scientists have said is almost certainly caused by human emissions.

They also confirmed previous forecasts that if emissions were to continue at a high rate over the next few decades, the ocean could rise as much as three or four feet by 2100.

Experts say the situation would then grow far worse in the 22nd century and beyond, likely requiring the abandonment of many coastal cities.

The findings are yet another indication that the stable climate in which human civilization has flourished for thousands of years, with a largely predictable ocean permitting the growth of great coastal cities, is coming to an end.

“I think we can definitely be confident that sea-level rise is going to continue to accelerate if there’s further warming, which inevitably there will be,” said Stefan Rahmstorf, a professor of ocean physics at the Potsdam Institute for Climate Impact Research, in Germany, and co-author of one of the papers, published online Monday by an American journal, Proceedings of the National Academy of Sciences.

In a report issued to accompany that scientific paper, a climate research and communications organization in Princeton, N.J., Climate Central, used the new findings to calculate that roughly three-quarters of the tidal flood days now occurring in towns along the East Coast would not be happening in the absence of the rise in the sea level caused by human emissions.


The lead author of that report, Dr. Strauss, said the same was likely true on a global scale, in any coastal community that has had an increase of saltwater flooding in recent decades.

The rise in the sea level contributes only in a limited degree to the huge, disastrous storm surges accompanying hurricanes like Katrina and Sandy. Proportionally, it has a bigger effect on the nuisance floods that can accompany what are known as king tides.

The change in frequency of those tides is striking. For instance, in the decade from 1955 to 1964 at Annapolis, Md., an instrument called a tide gauge measured 32 days of flooding; in the decade from 2005 to 2014, that jumped to 394 days.

Flood days in Charleston jumped from 34 in the earlier decade to 219 in the more recent, and in Key West, Fla., the figure jumped from no flood days in the earlier decade to 32 in the more recent.

A motorist driving through seawater in Charleston, S.C., last year. In the decade from 1955 to 1964, Charleston registered 34 days with flooding; in the decade from 2005 to 2014, the number jumped to 219. Credit Stephen B. Morton/Associated Press

The new research was led by Robert E. Kopp, an earth scientist at Rutgers University who has won respect from his colleagues by bringing elaborate statistical techniques to bear on longstanding problems, like understanding the history of the global sea level.

Based on extensive geological evidence, scientists already knew that the sea level rose drastically at the end of the last ice age, by almost 400 feet, causing shorelines to retreat up to a hundred miles in places. They also knew that the sea level had basically stabilized, like the rest of the climate, over the past several thousand years, the period when human civilization arose.

But there were small variations of climate and sea level over that period, and the new paper is the most exhaustive attempt yet to clarify them.

The paper shows the ocean to be extremely sensitive to small fluctuations in the Earth’s temperature. The researchers found that when the average global temperature fell by a third of a degree Fahrenheit in the Middle Ages, for instance, the surface of the ocean dropped by about three inches in 400 years. When the climate warmed slightly, that trend reversed.

“Physics tells us that sea-level change and temperature change should go hand-in-hand,” Dr. Kopp said. “This new geological record confirms it.”

In the 19th century, as the Industrial Revolution took hold, the ocean began to rise briskly, climbing about eight inches since 1880. That sounds small, but it has caused extensive erosion worldwide, costing billions.

Due largely to human emissions, global temperatures have jumped about 1.8 degrees Fahrenheit since the 19th century. The sea is rising at what appears to be an accelerating pace, lately reaching a rate of about a foot per century.
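
As a rough back-of-envelope check (not part of either study), the two rates quoted above can be put on a common scale. The short Python sketch below simply converts the article's own figures, a drop of about three inches over roughly 400 years versus a rise of about a foot per century, to millimeters per year.

    # Back-of-envelope conversion of the sea-level rates quoted in the article
    # to a common unit (millimeters per year).
    MM_PER_INCH = 25.4
    MM_PER_FOOT = 304.8

    medieval_rate = 3 * MM_PER_INCH / 400  # ~3 inches over ~400 years
    modern_rate = 1 * MM_PER_FOOT / 100    # ~1 foot per century

    print(f"Medieval-era change: about {medieval_rate:.2f} mm per year")
    print(f"Recent rate of rise: about {modern_rate:.2f} mm per year")
    print(f"Magnitude ratio:     roughly {modern_rate / medieval_rate:.0f} to 1")

On that crude comparison, the current rate of rise is more than an order of magnitude larger than the medieval-era change the researchers reconstructed.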

One of the authors of the new paper, Dr. Rahmstorf, had previously published estimates suggesting the sea could rise as much as five or six feet by 2100. But with the improved calculations from the new paper, his latest upper estimate is three to four feet.

That means Dr. Rahmstorf’s forecast is now more consistent with calculations issued in 2013 by the Intergovernmental Panel on Climate Change, a United Nations body that periodically reviews and summarizes climate research. That body found that continued high emissions might produce a rise in the sea of 1.7 to 3.2 feet over the 21st century.

In an interview, Dr. Rahmstorf said the rise would eventually reach five feet and far more — the only question was how long it would take. Scientists say the recent climate agreement negotiated in Paris is not remotely ambitious enough to forestall a significant melting of Greenland and Antarctica, though if fully implemented, it may slow the pace somewhat.

“Ice simply melts faster when the temperatures get higher,” Dr. Rahmstorf said. “That’s just basic physics.”



Read the original post: http://www.nytimes.com/

Feb 18 2016

Study: Fish Prevents Alzheimer’s; Don’t Sweat Mercury Levels

— Posted with permission of SEAFOODNEWS.COM. Please do not republish without their permission. —

Copyright © 2016 Seafoodnews.com

Seafood News


SEAFOODNEWS.COM [Atlanta Journal] By Larry Clifton  February 17, 2016

Atlanta – A new study challenges the dietary populism that in the past suggested the consumption of fish poses more of a health risk to the brain than a benefit.

According to the study, the mercury taken in through weekly consumption of seafood is not enough to damage the brain. Conversely, the new findings show that the omega-3 fatty acids found in fish play a role in defending the brain against Alzheimer’s and other forms of dementia. Previous studies questioned whether increased mercury levels in the brain would cancel out such benefits, an issue specifically addressed in the new study, as reported by CNN.

Researchers questioned the study group about their diet every year starting in 1997. Furthermore, they performed brain autopsies on 286 participants who died between 2004 and 2013 to examine the levels of mercury and to determine whether there was neurological damage associated with dementia.

“The findings were very striking,” said Martha Clare Morris, director of nutrition and nutritional epidemiology at Rush University Medical Center.

“Our hypothesis was that seafood consumption would be associated with less neuropathology, but that if there were higher levels of mercury in the brain, that would work against that. But we didn’t find that at all,” said Morris, who is lead author of the study, which was published Tuesday in the Journal of the American Medical Association.

The researchers added only one caveat: they observed the benefit only among participants who had a strong genetic risk factor for Alzheimer’s. These participants carried a version of the APOE gene called APOE-4, which is associated with a higher risk of developing Alzheimer’s. Nevertheless, even for those who have not been specifically tested for the APOE-4 gene, consuming more fish is decidedly healthy and, according to the study, does not result in the accumulation of toxic amounts of mercury.

Even people who test negative for the APOE-4 gene likely gain some lesser degree of protection from Alzheimer’s through weekly consumption of seafood, but the current study was not able to answer that question, Morris said.

“One theory is that seafood consumption may be more beneficial in older age because, as we age, we lose DHA in the brain,” a molecule that is important to maintain brain health, Morris said. DHA is one of the main fatty acids that can be obtained from fish. People with APOE-4 are thought to lose even more DHA in the brain, so seafood consumption could be even more beneficial to them, Morris added.

Still, Morris maintained that individuals sustaining a steady diet of certain kinds of seafood could experience a downside to brain health. “Our findings can’t be generalized to people who are really high consumers of seafood,” Morris said. In the Midwest population in the study, very few ate seafood every day.

The latest study augments findings of previous surveys as well as the professional opinions of doctors who treat patients and study Alzheimer’s disease.

“The evidence is quite clear that people who consume healthier forms of fish [which are baked or broiled rather than fried] are going to end up with healthier brains,” said James T. Becker, professor of psychiatry and associate director of the Alzheimer’s Disease Research Center at the University of Pittsburgh, who was not involved in the current study.

As for whether mercury increases the risk of dementia, “I personally don’t think there’s evidence for it. I think these heavy metals are going to do other things first,” such as causing nerve pain, itching or burning, Becker said.


Subscribe to SEAFOODNEWS.COM

Feb 12 2016

Where is promised ‘Godzilla’ El Nino? Update says slimmer monster lying in wait

Left: a satellite image of El Niño from Jan. 23, 2016. Right: the latest image, from Feb. 4, 2016.

Southern Californians using newly purchased umbrellas as parasols and wearing sunscreen instead of rain slickers have been asking: Where is El Niño?

The answer from scientists, climatologists and weather forecasters is: Right here.

“No, it hasn’t gone away. It is still as strong as it ever was,” said Ken Clark, meteorologist with Accuweather.com in Southern California.

Huh?

With sunny skies and a week’s worth of summer-like weather spiking the mercury into the high 80s, how can that be true? Isn’t El Niño supposed to bring wet, stormy weather?

Be patient, says El Niño expert and climatologist Bill Patzert of NASA’s Jet Propulsion Laboratory in La Cañada Flintridge. It’s coming.

“A month from now you’ll be writing about the March Miracle or the April Apocalypse,” he said, responding to what he calls the media’s unquenchable thirst for colossal storms and massive mudslides, neither of which has happened as predicted for Southern California.

Even Josh Willis says that local rains are coming. He’s the JPL project scientist for oceanography satellite Jason-3, the newest sea-temperature and sea-level reading tool used by climatologists to identify the current El Niño as the largest ever.

“Don’t throw out that umbrella just yet,” he said Tuesday.

All three scientists say El Niño will perform, but its arrival into Southern California has been delayed. They’re expecting a conveyor belt of squalls to enter stage left in late February and continue through March, possibly into April. This is a month or so later than original predictions for heavy rains.

Patzert says sometimes El Niños take their sweet time.

“It is not unusual for El Niños, with regards to Southern California rain, to be slow starters,” Patzert explained. “When they hook up, they are fast and furious finishers.”

Also, in cases of El Niño, size matters. This one is too big — about 2½ times the size of the continental United States — and is having trouble maneuvering. But new satellite data from Feb. 4 show the El Niño has shrunk nearly 40 percent since the last capture on Jan. 23, Patzert said. The El Niño has receded east of Hawaii, whereas last month’s image showed it west of the islands, he said.

“As this signal shrinks, the jet stream should pull farther south” tracking storms into Southern California, he said.

The National Weather Service’s Climate Prediction Center in College Park, Maryland, released its monthly El Niño forecast on Thursday. It predicts a 50 percent to 60 percent probability of above-average rainfall in Southern California for March and April, with only 40 percent to 50 percent for Central California and 30 percent to 40 percent for Northern California, said John Gottschalck, chief of the center’s operational prediction branch.

In short, it reiterated a prediction of El Niño-patterned weather for Southern California. Namely, weakened westward-blowing trade winds and a warming of the upper ocean in the central and eastern Pacific will alter the jet stream, pulling storms into the West Coast, much like what happened during the last El Niño in February and March 1998.

Another reason for El Niño’s slow start is its entanglement with a high-pressure system over Utah that brought dry, hot weather into Southern California over the past week and a half. More importantly, the high-pressure dome pushed the jet stream north, sending El Niño-fueled storms into Central and Northern California in January and the Pacific Northwest and western Canada in February. This has increased the Sierra snowpack to 105 percent of normal, a positive sign for breaking the drought.

Calling it a “short-term” phenomenon, Clark says the high pressure has begun to fade, with the National Weather Service predicting “a slight chance” of rain on Thursday. However, serious rain is not expected until the end of the month. “There may be more stormy patterns as we get into the end of February and into March,” he said.

Of course, no one can say for sure what El Niño will do. But scientists are hardly reprimanding El Niño, Spanish for “the child.” They are giving it a second chance.

“The ball game is not over yet. We do have a lot of innings left in the game,” Gottschalck said. “We still have March.”


Read the original post: http://www.dailybreeze.com/

Feb 8 2016

Federal disaster loans offered to commercial Dungeness crab fishermen


Monterey – Help is on the way in the form of federal disaster loans for commercial fishermen who have suffered financial losses from California’s canceled commercial Dungeness crab season this year.

That’s according to the U.S. Small Business Administration (SBA), which announced this week that it is offering low-interest federal disaster loans to small-business commercial fishermen. With a “disaster” declaration made by Gov. Jerry Brown, fishermen can receive immediate access to loans of up to $2 million at an interest rate of 4 percent (2.625 percent for private, non-profit organizations) with terms of up to 30 years. Those eligible include any small business owner or worker engaged in crab fishing in the waters affected by the delay of the Dungeness season itself or by the closure of the rock crab fishery. That includes the suppliers of fishing gear and fuel, docks and boatyards, processors, wholesalers, shippers and retailers. The declaration covers 39 counties in California, including Monterey and Santa Cruz counties, as well as two counties in Nevada and two counties in Oregon.

“If anybody feels like they’re impacted in any way, shape or form, then they should apply,” said Susheel Kumar, public information officer for the Small Business Administration.

It was in November that state officials closed the Dungeness crab fishing season after finding unsafe levels of a toxin called domoic acid, caused by a massive coastal algae bloom fueled by El Nino.

The season was originally set to kick off on Nov. 15.

“We run crab combos from November until April where we put the pots out in the water and go pull out the crab pots in the afternoon,” said Chris Arcoleo of Chris’s Fishing Trips and Whale Watching in Monterey. “So without having the crabs to make the trip worthwhile for those people, it really affects you. And once you get into January and February, all we can fish for is flat fish, and it affects the amount of fish you can take trips out for.”

The closed crab fishing season has also impacted Phil’s Fish Market in Moss Landing, according to owner Phil DiGirolama.

“We didn’t have any local crab so we’re bringing crab up from Oregon and the price reflects that,” said DiGirolama.

But for Mike Ricketts of the Monterey Commercial Fishing Association, who has made his living from crab and salmon fishing for the last 40 years, it is the combination of a recent bad salmon season and now the lost crab season that has hit him and other fishermen hard.

“No one saw it coming so nobody could prepare for it,” said Ricketts. “Fishermen had already spent a considerable amount of money without having any way to repay it.”

Ricketts said that probably 80 percent of the season’s income from crab fishing has already been lost.

“Most guys are living off their credit cards,” he said. “And now the further we go into the season, the less demand there is for crab.”

Ricketts is also leery of the SBA loans.

“That doesn’t help a lot of people because of the collateral they want,” he said. “If you own a boat and you can’t pay it back, they’ll want your boat and that scares a lot of fishermen. If we don’t get to go fishing, there are no ways for these to be repaid.”

But Kumar said that help is there if needed and those interested can learn more at two different information sessions in the Monterey area, one at the Moss Landing Harbor District at 7881 Sandholdt Road on Wednesday and the other at the Monterey Harbor Office at 250 Figueroa St. on Friday, Feb. 12. Applicants can also go online at https://disasterloan.sba.gov/ela.

Last year, California crab fishermen caught 16.8 million pounds of Dungeness, worth $58.3 million. That was considered a “strong” season, according to the 2014 Dungeness Crab Report put out by the Pacific States Marine Fisheries Commission.

This year, the commercial season is scheduled to end June 30.

Carly Mayberry can be reached at 726-4363.


Read the original post: http://www.marinij.com/

Feb 1 2016

Harmful Algal Blooms: A Sign of Things to Come?

Pseudo-nitzschia, a marine algae that produces a toxin called domoic acid. Excessive growth of Pseudo-nitzschia can result in a harmful algal bloom such as the one that shut down shellfish fisheries along the U.S. West Coast last year. Photo: NOAA Fisheries.

Scientists from NOAA Fisheries and the University of Washington Applied Physics Lab have developed the Environmental Sample Processor, a robot that monitors the water for harmful algal blooms. This one is being deployed in Puget Sound, just north of Seattle. Credit: NOAA Fisheries/Stephanie Moore.

Last year, a vast mass of poisonous algae bloomed off the West Coast, from California to Alaska. One of the largest of its kind ever recorded in the region, the bloom shut down shellfish fisheries all along the coast, impacted the livelihoods of fishermen, and threatened the health of many marine mammals.

Harmful algal blooms happen when a species of algae that produces toxins grows out of control. Although last year’s bloom has begun to subside, and Dungeness crab and other valuable fisheries have begun to re-open, climate change is still warming the ocean. Warmer water means faster-growing algae, and if climate projections are correct, it’s likely that we’ll see more of these blooms in the future.

Vera Trainer is an oceanographer with NOAA’s Northwest Fisheries Science Center (NWFSC) in Seattle. In this podcast, Trainer describes some of the measures that NOAA Fisheries and other agencies are taking to help coastal communities adapt to a changing future.

Transcript: Harmful Algal Blooms, A Sign of Things to Come?


Listen to the podcast: http://www.nmfs.noaa.gov/podcasts/2016/01/habs_with_vera_trainer.html

Feb 1 2016

Fish toxins at lowest levels in decades

Fishermen unload their catch on April 16, 2015 in the Baja California town of San Felipe. / photo by Misael Virgen * U-T


Fish in today’s oceans contain far lower levels of mercury, DDT and other toxins than at any time in the past four decades, according to a major review by scientists at the Scripps Institution of Oceanography in La Jolla.

In what’s billed as the first analysis of its kind, the researchers looked at nearly 2,700 studies of pollutants found in fish samples taken from all over the world between 1969 and 2012. They saw steady, significant drops in the concentrations of a wide range of contaminants known to accumulate in fish — from about 50 percent for mercury to more than 90 percent for polychlorinated biphenyls, or PCBs.

At high enough concentrations, these toxins can cause cancer, neurological disorders, birth defects, thyroid problems and other ailments in people who consume tainted fish.

Echoing what marine experts have said over the years, the Scripps scientists conclude that clean-water regulations, lawsuits and other forms of public pressure have led to bans or sharp reductions in the use of industrial and agricultural contaminants that end up in creeks, rivers and oceans.

But they also tempered the good news with a sobering reminder: Many fish in the wild still have pollutants at levels considered unsafe for frequent human consumption.

And forget the longstanding belief that smaller fish — those lower on the food chain — generally contain fewer toxins than large predator fish, or that fish caught in certain areas of the world are safer than those harvested in other zones.

“Maybe there’s a little pattern of those predators through bioaccumulation having more [contamination], but it wasn’t a common pattern. So it doesn’t mean you’re safe as long as you eat a halibut,” said Stuart Sandin, a marine biologist at Scripps and co-author of the new analysis.

“There’s just a lot more complexity out there that washes away that very clean and easy signal of, ‘Eat something that’s not a predator and you’ll be OK,’” he added.

The new report found no predictable pattern of contamination.

Some mackerel and sardines had far greater pollution than some swordfish and shark. In addition, oceanographers have come to realize that an unexpectedly high number of fish species migrate for thousands of miles. Even within a school of fish caught in the same location, the degree of toxins can vary greatly from one individual fish to another.

To further complicate matters, today’s increasingly international seafood market means that U.S. stores typically sell fish shipped from Southeast Asia, South America and elsewhere.

“Some fish [surprisingly] showed low concentrations and some showed very high concentrations” of toxins, Sandin said. “So it’s a little bit of a gamble.”

Rita Kampalath, science and policy director for the environmental group Heal the Bay, said that while the decrease in contaminant levels is welcome news, it isn’t particularly surprising. “It’s great that that’s documented. I would hope that they would drop since many of these contaminants haven’t been produced for a long time,” she said.

The report, published Thursday in the journal PeerJ, focused on five contaminants or classes of contaminants that are widespread in the oceans: chlordane, DDT, mercury, PCBs and polybrominated diphenyl ethers, or PBDEs.

Sandin and his colleagues didn’t try to assess the health effects of these chemicals because that field of science has long been established. Instead, they sought to identify geographic and other patterns in the chemical contamination of fish worldwide.

Their analysis showed that on average, pollutant concentrations now meet federal safety guidelines in the United States for occasional fish consumption — two or three servings per week, based on various marine groups’ standards. For example, mercury and PCBs were found at levels acceptable for occasional human consumption and DDT was consistently below the established threshold for concern.

Still, with millions of people worldwide relying on fish as their main source of protein, the threat of contamination continues to loom large.

“We label a lot of things like whether it’s sustainable or whether it’s wild-caught or farmed, but one of the things that we still haven’t figured out how to do is to address whether it’s contaminated or not. I think that’s something most people want to know,” said Amro Hamdoun, a biology professor at Scripps and co-author of the new report.

Attempts to track contamination by species, geographic region and other factors revealed a complex set of circumstances that researchers have yet to fully understand.

“The pollution does not stay in one place,” Sandin said. “We thought that we would find something like that. But when you start increasing the scale, that predictability goes away. We didn’t find any evidence that when you’re near shore versus 100 miles off shore that you had more or less chemicals in the seafood.”

Currently, chemical testing for seafood is limited and expensive.

“One of the things that we’re working on right now in the lab is new technologies that could be used to rapidly screen fish and other types of food for these different types of contaminants,” Hamdoun said.

He and his Scripps colleagues also are collecting data about yellowfin tuna from around the world, in part to develop techniques for studying how various contaminants interact with one another. They said such testing is needed because one specimen of fish often contains several types of pollutants.

“For a consumer, if you’re eating something that has a high chemical concentration of one you’re getting a bunch, you’re getting a cocktail of these chemicals,” Sandin said. “And we’ve yet to explore the health consequences of mixing these chemicals.”

Kampalath of Heal the Bay agreed with the Scripps team that further research is needed.

“I think the authors acknowledged there are confounding factors that would make the conclusions less clean,” Kampalath said. “The coarseness of the scale presents a challenge in identifying patterns in [pollutant] concentration levels.”


 

CONSUMPTION GUIDELINES

Some recommendations from California’s Office of Environmental Health Hazard Assessment:

• In the same location, some fish species can have higher chemical levels than others. If possible, eat smaller amounts of several types of fish rather than a large amount of one type.

• Eat only the fillet portions. Don’t eat the guts and liver because chemicals usually concentrate in those parts. Also, avoid frequent consumption of any reproductive parts such as eggs and roe.

• Many contaminants are stored in fish fat, so skin the fish when possible and trim any visible fat.

• Use a cooking method that allows juices to drain away from the fish — baking, broiling, grilling or steaming. The juices contain chemicals from fish fat and should be thrown away. If you make stews or chowders, use fillet parts.

• Raw fish may be infested by parasites, so cook fish thoroughly to destroy them. This also helps to reduce the level of many contaminants.

• Pregnant women should talk with their doctors about fish-consumption warnings given by the U.S. Food and Drug Administration and the U.S. Environmental Protection Agency.

 


Read the original post: http://www.sandiegouniontribune.com/

Feb 1 2016

Fishing Not to Blame for Sea Lion Deaths

Complex, proactive management efforts have been in place for decades to prevent overfishing in California, and those efforts are working.

Buellton, California (PRWEB) February 01, 2016

Recent media accounts report that sea lion pups are stranding on shore and dying in record numbers because there aren’t enough sardines and anchovies – sea lions’ favorite food – to support their growing population. The Orange County Register, for example, even after acknowledging the cyclical nature of the ocean and marine species, points to overfishing as the problem.

“While it seems all too common these days to blame the ocean’s woes on overfishing, the truth is far different in California. Fortunately, we do have an accurate picture. It’s a graph that shows natural sardine booms and busts for the past 1,400 years,” said Diane Pleschner-Steele, executive director for the California Wetfish Producers Association. “Oceanic core samples were extracted from an anaerobic trench in the Santa Barbara Channel, and the study findings were reported by Dr. Tim Baumgartner and others in a 1992 California Cooperative Oceanic Fisheries Investigations (CalCOFI) report. The study correlated alternating periods of sardine and anchovy population recruitments and collapses with warm and cold water oceanic cycles. Sardines tend to favor warm water cycles while anchovy favor cold.

“It’s important to note that most collapses in this timeframe occurred when there was virtually no commercial fishing. The great fluctuations experienced by sardines and anchovies have been known for a long time to be part of a natural cycle,” said Pleschner-Steele.

Fishery scientists can confirm that the recent sardine decline was not the result of overfishing. The fewer than 8,000 tons of sardines harvested in California in 2014 were still below the federally mandated overfishing limit.

When sardines began returning to abundance in the 1980s, they became perhaps the best-managed fishery in the world – the poster fish for effective ecosystem-based management. The current harvest control rule – established 17 years ago and updated in 2014 with more precautionary science – sets a strict harvest guideline every year that considers ocean conditions and automatically reduces the catch limit as the biomass declines.

Since federal management began in 2000, the sardine biomass estimate has declined more than 70 percent from the 2006 high of 1.3 million mt, and harvest limits have fallen from 152,564 mt in 2007 to a U.S. catch target of 23,293 mt in 2014 – an 85 percent decline. In 2015, the estimated sardine biomass fell below the established “cutoff,” the biomass level above which sardine fishing is allowed, and the directed fishery was closed. This is a perfect example of our fishery management at work to prevent overfishing.
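
For readers unfamiliar with how such a control rule works, the short Python sketch below illustrates the general form described above: no directed fishing when biomass falls below a cutoff, and otherwise a catch limit proportional to the surplus above that cutoff, scaled by an exploitation fraction tied to ocean conditions and by the share of the stock in U.S. waters. The parameter values shown are assumptions chosen for illustration, not necessarily the official figures used by managers.

    # Illustrative sketch of a sardine-style harvest control rule. The cutoff,
    # exploitation fraction and distribution values below are assumed for the
    # example, not taken from the official management documents.
    def harvest_guideline(biomass_mt, cutoff_mt=150_000, fraction=0.15, distribution=0.87):
        """Return an annual catch limit in metric tons (0 when biomass is at or below the cutoff)."""
        if biomass_mt <= cutoff_mt:
            return 0.0  # below the cutoff, the directed fishery is closed
        return (biomass_mt - cutoff_mt) * fraction * distribution

    # A declining biomass automatically shrinks the catch limit.
    for biomass in (1_300_000, 500_000, 200_000, 140_000):
        print(f"biomass {biomass:>9,} mt -> catch limit {harvest_guideline(biomass):>9,.0f} mt")

Under a rule of this shape, the steep fall in harvest limits described above is the intended response to a shrinking biomass rather than a sign of management failure.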

Compare this to the 1940s and ’50s when the fishery harvest averaged 43 percent or more of the standing sardine stock with little regulatory oversight and no limit on the annual catch. The historic sardine fishery collapse devastated Monterey’s Cannery Row.

But that was nearly 70 years ago. The current sardine fishery harvest is less than a quarter of the rate observed during the historical sardine collapse. The current harvest regulations leave close to 90 percent of sardines in the ocean as forage for marine life.

As for anchovy, harvests have actually averaged less than 8,000 tons annually over the last 14 years. Reports often cite a new study alleging a recent anchovy “collapse,” but the study period stopped in 2011 and excluded nearshore egg and larval data where young anchovies always live and spawn. In fact, field surveys in 2015 recorded recruitment of sardine and anchovy larvae and juveniles as “the highest ever” in Central California and relatively high in Southern California. Fishermen in both Monterey and Southern California attest to the abundance of anchovy and sardine along the coast.

“Complex, proactive management efforts have been in place for decades to prevent overfishing in California, and those efforts are working,” said Pleschner-Steele.


About the California Wetfish Producers Association
The California Wetfish Producers Association is a nonprofit dedicated to research and to the promotion of sustainable wetfish resources. More info at http://www.californiawetfish.org.


Read the original post: http://www.prweb.com/

Jan 29 2016

U.S. Fisheries Management Clears High Bar for Sustainability Based on New Assessment


January 28, 2016 — The following was released by NOAA Fisheries:

Today, NOAA Fisheries announced the publication of a peer-reviewed self-assessment that shows the standards of the United States fishery management system under the Magnuson-Stevens Act more than meet the criteria of the United Nations Food and Agriculture Organization’s ecolabelling guidelines. These same guidelines serve as a basis for many consumer seafood certification and ranking schemes. The assessment demonstrates that the U.S. fisheries management system is particularly strong on the responsiveness and science-based criteria. Beyond the biological and ecosystem criteria, the assessment also pointed out that the U.S. system incorporates the social and economic components of fisheries essential for effective long-term stewardship.

This assessment was authored by Dr. Michelle Walsh, a former NOAA Fisheries Knauss Fellow and current member of the Marine Science Faculty at Florida Keys Community College. Walsh evaluated the sustainability of how U.S. federal fisheries are managed using the FAO’s Guidelines for the Ecolabelling of Fish and Fishery Products from Marine Capture Fisheries. These guidelines are a set of internationally recognized criteria used to evaluate the sustainability of fisheries around the world.

“While the performance of U.S. fisheries clearly illustrates that the U.S. management system is effective, my colleagues and I wanted to evaluate the U.S. approach to fisheries management as a whole against these international guidelines for ecolabelling seafood,” said Walsh.

Walsh found that the U.S. federal fisheries management system meets all of the FAO guidelines for sustainability. In particular, the assessment highlighted some key strengths of the U.S. system, including:

  • Complying with national and international laws
  • Developing and abiding by documented management approaches with frameworks at national or regional levels
  • Incorporating uncertainty into stock reference points and catch limits while taking actions if those limits are exceeded
  • Taking into account the best scientific evidence in determining suitable conservation and management measures with the goal of long-term sustainability
  • Restoring stocks within reasonable timeframes

Evaluating Sustainability

“Sustainability” is about meeting the needs and wants of current generations without compromising those of future generations (WCED, 1987; United Nations, 1987). However, evaluating sustainability can become considerably more complex in the context of wild-caught fisheries in the dynamic ocean environment, where population trends and environmental conditions are often unclear or unknown.

Due to this complexity, many certification schemes assess sustainability on a fishery-by-fishery basis by evaluating discrete management approaches (such as gear type) and current stock status at a snapshot in time. This assessment, on the other hand, evaluates the U.S. management system as a whole against the FAO guidelines for ecolabels. It evaluates the capacity of the management system to respond to changes in stock levels and adapt to changing conditions via management measures that maintain sustainability over the long term.

Jan 23 2016

CFOOD: DO “CATCH RECONSTRUCTIONS” REALLY IMPLICATE OVERFISHING?


January 22, 2016—The following is commentary from Michel J. Kaiser of Bangor University and David Agnew of the Marine Stewardship Council concerning the recently published article, “Catch Reconstructions Reveal that Global Marine Fisheries Catches are Higher than Reported and Declining,” by Daniel Pauly and Dirk Zeller in Nature Communications.

A new paper led by Daniel Pauly of the University of British Columbia found that global catch data, as reported to the FAO, are significantly lower than the true catch numbers. “Global fish catches are falling three times faster than official UN figures suggest, according to a landmark new study, with overfishing to blame.”

Some 400 researchers spent the last decade accumulating missing global catch data from small-scale fisheries, sport fisheries, illegal fishing activity and fish discarded at sea, which FAO statistics “rarely include.”

“Our results indicate that the decline is very strong and is not due to countries fishing less. It is due to countries having fished too much and having exhausted one fishery after another,” Pauly says.

Despite these findings, Pauly doesn’t expect countries to realize the need to rebuild stocks, primarily because the pressures to continue current fishing effort are too strong in the developing world. But this study will allow researchers to see the true problems more clearly and hopefully inform policy makers accordingly.

Comment by Michel J. Kaiser, Bangor University, @MicheljKaiser

Catch and stock status are two distinct measurement tools for evaluating a fishery, and suggesting inconsistent catch data is a definitive gauge of fishery health is an unreasonable indictment of the stock assessment process. Pauly and Zeller surmise that declining catches since 1996 could be a sign of fishery collapse. While they do acknowledge management changes as another possible factor, the context is misleading and important management efforts are not represented. The moratorium on cod landings is a good example – zero cod landings in the Northwest Atlantic does not mean there are zero cod in the water. Such distinctions are not apparent in the analysis.

Another key consideration missing from this paper is varying management capacity. European fisheries are managed more effectively and provide more complete data than Indian Ocean fisheries, for example. A study that aggregates global landings data is suspect because indeed landings data from loosely managed fisheries are suspect.

Finally, the authors’ estimated catch seems to mirror the official FAO catch data, ironically proving its legitimacy. “Official” FAO data is not considered to be completely accurate, but rather a proportionate depiction of global trends. Pauly’s trend line is almost identical, just shifted up the y-axis, and thus fails to significantly alter our perception of global fisheries.

Michel J. Kaiser is a Professor of Marine Conservation Ecology at Bangor University. Find him on twitter here.

Comment by David Agnew, Director of Standards, Marine Stewardship Council

The analysis of such a massive amount of data is a monumental task, and I suspect that the broad conclusions are correct. However, as is usual with these sorts of analyses, when one gets to a level of detail where the actual assumptions can be examined, in an area in which one is knowledgeable, it is difficult to follow all the arguments. The Antarctic catches “reconstruction” apparently is based on one Fisheries Centre report (2015 Volume 23 Number 1) and a paper on fishing down ecosystems (Polar Record; Ainley and Pauly 2014). The only “reconstruction” appears to be the addition of IUU and discard data, all of which are scrupulously reported by CCAMLR anyway, so they are not unknown. But there is an apparent 100,000 t “unreported” catch in the reconstruction in Figure 3, Atlantic, Antarctic (48). This cannot include the Falklands (part of the Fisheries Centre paper) and it is of a size that could only be an alleged misreporting of krill catch in 2009. This is perhaps an oblique reference to concerns that CCAMLR has had in the past about conversion factors applied to krill products, or perhaps unseen (net-impact) mortality, but neither of these elements has been substantiated, nor referenced in the supporting documentation that I have seen (although I could not access the Polar Record paper).

The paper does not go into much detail on the reasons for the observed declines in catches and discards, except to attribute them both to reductions in fishing mortality attendant on management action to reduce mortality and generate sustainability, and to declines in areas that are not managed. It is noteworthy that the peak of the industrial catches – in the late 1990s/early 2000s – coincidentally aligns with the start of the recovery of many well managed stocks. This point of recovery has been documented previously (Costello et al. 2012; Rosenberg et al. 2006; Gutierrez et al. 2012) and particularly relates to the recovery of large numbers of stocks in the north Pacific, the north Atlantic and around Australia and New Zealand, and mostly to stocks that are assessed by analytical models. For stocks that need to begin recovery plans to achieve sustainability, this most often entails an overall reduction in fishing effort, which would be reflected in the reductions in catches seen here. So, one could attribute some of the decline in industrial catch in these regions to a correct management response to rebuild stocks to a sustainable status, although I have not directly analyzed the evidence for this. This is therefore a positive outcome worth reporting.

The above-reported inflection point is also coincident with the launch of the MSC’s sustainability standard. These standards have now been used to assess almost 300 fisheries, and have generated environmental improvements in most of them (MSC 2015). Stock sustainability is part of the requirements of the standard, and previous analyses (Gutierrez et al. 2012; Agnew et al. 2012) have shown that certified fisheries have improved their stock status and achieved sustainability at a higher rate than uncertified fisheries. The MSC program does not claim responsibility for the turn-around in global stocks, but along with other actions – such as those taken by global bodies such as FAO, by national administrations, and by industry and non-Governmental Organisations – it can claim to have provided a significant incentive for fisheries to become, and then remain, certified.

David Agnew is the Director of Standards at the Marine Stewardship Council, the largest fishery sustainability ecolabel in the world. You can follow MSC on twitter.

Read the commentary at CFOOD


Read the original post: http://www.savingseafood.org/