Presents, a Life with a Plan. My name is Karen Anastasia Placek, and I am the author of this Google Blog. This is the story of my journey, a quest to understand more than myself. The title of this blog, "The Secret of the Universe is Choice!; know decision," will be the next global slogan. Placed on T-shirts, jackets, sweatshirts, it really doesn't matter, because a picture with my slogan is worth more than a thousand words. It's worth... Know Conversation!!!
46 million turkeys are eaten each Thanksgiving, 22 million on Christmas and 19 million turkeys on Easter.
In 2011, 736 million pounds of turkey were consumed in the United States.
Turkey consumption has increased 104% since 1970.
Since 1970, turkey production in the United States has increased nearly 110%.
In 2013, 242 million turkeys are expected to be raised in the United States.
In 2012, 253,500,000 turkeys were produced in the United States.
The turkey industry employs 20,000 to 25,000 persons in the United States.
In 1970, 50% of all turkey consumed was during the holidays, now just 29% of all turkey consumed is during the holidays as more turkey is eaten year-round.
In 2012, turkey was the #4 protein choice for American consumers, behind chicken, beef and pork.
The top three turkey products sold in 2011 were whole birds, ground turkey and cooked white meat (deli meat).
Turkey hens are usually sold as whole birds. Toms are processed into turkey sausage, turkey franks, tenderloins, cutlets and deli meats.
In 2011, 47.4% of turkeys were sold to grocery stores and other retail outlets, 30% sold in commodity outlets, 15.5% sold to foodservice outlets and 6.2% were exported.
In 2011, 703.3 million pounds of turkey were exported.
The average weight of a turkey purchased at Thanksgiving is 15 pounds.
The heaviest turkey ever raised was 86 pounds, about the size of a large dog.
A 15 pound turkey usually has about 70 percent white meat and 30 percent dark meat.
The wild turkey is native to northern Mexico and the eastern United States.
The male turkey is called a tom.
The female turkey is called a hen.
The turkey was domesticated in Mexico and brought to Europe in the 16th century.
Tom turkeys have beards: tufts of black, hair-like feathers on their breasts.
Canadians consumed 142 million kilograms of turkey in 2012.
Turkeys can see movement almost 100 yards away.
Turkeys lived almost ten million years ago.
Baby turkeys are called poults and are tan and brown.
Turkey eggs are tan with brown specks and are larger than chicken eggs.
It takes 75-80 pounds of feed to raise a 30 pound tom turkey.
In 1920, U.S. turkey growers produced one turkey for every 29 persons in the U.S. Today growers produce nearly one turkey for every person in the country.
Male turkeys gobble; hens do not, making a clicking noise instead.
Minnesota, North Carolina, Arkansas, Missouri, Virginia, Indiana, California, South Carolina, Pennsylvania and Ohio were the leading producers of turkeys in 2011-2012.
Minnesota raised 46 million turkeys in 2012.
Illinois farmers produce close to 3 million turkeys each year.
A 16 week old turkey is called a fryer. A 5 to 7 month old turkey is called a young roaster and a yearling is a year old. Any turkey 15 months or older is called mature.
The ballroom dance the "Turkey Trot" was named for the short, jerky steps that turkeys take.
Turkeys do not really have ears like ours, but they have very good hearing.
Turkeys can see in color.
A large group of turkeys is called a flock.
Turkeys do not see well at night.
A domesticated male turkey can reach a weight of 30 pounds within 18 weeks after hatching.
Wild turkeys spend the night in trees. They prefer oak trees.
Wild turkeys were almost wiped out in the early 1900s. Today there are wild turkeys in every state except Alaska.
Wild turkeys can fly for short distances up to 55 mph and can run 20 mph.
Turkeys are believed to have been brought to Britain in 1526 by Yorkshire man William Strickland. He acquired six turkeys from American Indian traders and sold them for tuppence in Bristol.
Henry VIII was the first English King to enjoy turkey and Edward VII made turkey eating fashionable at Christmas.
200 years ago in England, turkeys were walked to market in herds. They wore booties to protect their feet. Turkeys were also walked to market in the United States.
For 87% of people in the UK, Christmas wouldn't be Christmas without a traditional roast turkey.
Turkey breeding has caused turkey breasts to grow so large that the turkeys fall over.
June is National Turkey Lover’s Month.
Since 1947, the National Turkey Federation has presented a live turkey and two dressed turkeys to the President. The President does not eat the live turkey. He "pardons" it and allows it to live out its days on a historical farm.
The National Thanksgiving Turkey has been the Grand Marshal in the Thanksgiving Day Parade at both Disneyland Resort in California and Walt Disney World in Florida for the past four years.
The five most popular ways to serve leftover turkey are in a sandwich, stew, chili or soup, casseroles and as a burger.
Eating turkey does not cause you to feel sleepy after your Thanksgiving dinner. Carbohydrates in your Thanksgiving dinner are the likely cause of your sleepiness.
According to the 2007 Census, there were 8,284 turkey farms in the United States.
Turkey is low in fat and high in protein.
Turkey has more protein than chicken or beef.
White meat has fewer calories and less fat than dark meat.
Turkeys have about 3,500 feathers at maturity.
Turkeys have been bred to have white feathers. White feathers have no spots under the skin when plucked.
Most turkey feathers are composted.
Turkey feathers were used to stabilize arrows and adorn ceremonial dress, and the spurs on the legs of wild tom turkeys were used as projectiles on arrowheads.
Turkey skins can be tanned and used to make cowboy boots and belts.
The costume that "Big Bird" wears on Sesame Street is rumored to be made of turkey feathers.
The caruncle is a red-pink fleshy growth on the head and upper neck of the turkey.
Turkeys have a long, red, fleshy growth called the snood from the base of the beak that hangs down over the beak.
The bright red fleshy growth under a turkey’s throat is called a wattle.
The beard is a lock of hair found on the chest of the male turkey.
Giblets are the heart, liver, and gizzard of a poultry carcass. Although often packaged with them, the neck of the bird is not a giblet.
Turkey eggs hatch in 28 days.
The Native Americans hunted wild turkey for its sweet, juicy meat as early as 1000 A.D.
There are a number of towns in the United States named after the holiday’s traditional main course. Turkey, Texas, was the most populous in 2005, with 492 residents; followed by Turkey Creek, Louisiana (357); and Turkey, North Carolina (269). There are also 9 townships around the country named “Turkey,” 3 of them in Kansas.
IT was a full house at the much anticipated English songwriter and singer James Blunt’s Moon Landing Tour Concert which took place at the Harare International Conference Centre (HICC) in Harare on Tuesday evening.
BY TINASHE SIBANDA
While Blunt consistently churned out hit after hit that kept the audience singing along, the organisers also appeared to have done their homework on security.
Dressed in a pair of jeans and a T-shirt, Blunt and his band gave the audience a good run for their money, with plenty of interaction; he took time to crack a few jokes that left them in stitches during the two-hour performance.
He performed hits including Where Are You Now, Beautiful Dawn, Satellites, These Are The Words and Postcards, among many others that made the audience forget it was a midweek show.
Blunt took time to tell the audience that he had been very impressed by the crowd, considering it was his first time performing in Zimbabwe.
It was when he sang the hits Bonfire Heart, Goodbye My Lover and You’re Beautiful that the audience got into an even bigger frenzy.
“This is my favourite guitar which I even take to my bedroom. I use it to make me feel bigger,” he said, showing the audience a tiny guitar which he went on to play just as easily as the others.
He also took time off the stage to get even closer to his audience by crowd-surfing, which made the night even more memorable.
Blunt rose to prominence in 2004 with the release of his debut studio album Back To Bedlam, before achieving worldwide fame with the singles You’re Beautiful and Goodbye My Lover.
The album sold over 11 million copies worldwide, topping the UK album charts and peaking at number two in the United States.
Blunt’s second album, All The Lost Souls, was released in 2007, topped the charts in over 20 countries and produced the hit single 1973.
His third album, Some Kind of Trouble, was released in 2010, after its lead single Stay the Night.
A deluxe edition was then released the following year, titled Some Kind of Trouble: Revisited.
Blunt has sold over 20 million albums and 17 million singles worldwide, with his debut album, Back to Bedlam, being listed as the best-selling album of the 2000s in the UK.
Blunt has also received several awards and nominations, having won two Brit Awards, two MTV Video Music Awards, two Ivor Novello Awards as well as receiving five Grammy Award nominations.
Ebola: Namibians told to avoid Zimbabwe
THE Namibian government has warned against travel to Zimbabwe following reports that Harare had quarantined over 100 people who returned from countries affected by the deadly Ebola outbreak in West Africa.
Speaking to local media from New York, the country’s health minister Richard Kamwi said he was not aware of the quarantine in Zimbabwe but warned Namibians not to visit that country.
“The incubation period for Ebola is 21 days and until they are over and the country has been declared safe, I advise Namibians not to visit and Zimbabweans not to come to Namibia,” he said.
Kamwi also maintained that no Namibian is yet allowed to visit Nigeria and Senegal.
This is despite the WHO on Monday declaring that the Ebola virus had been “pretty much contained” in Nigeria and Senegal, and pronouncing the two countries Ebola-free.
“We do not know without doubt that it is contained, so we still need to take care of ourselves,” he said.
Last month the minister advised visitors from Ebola-hit countries not to visit Namibia until further notice and Namibians not to travel to such countries.
He said this advice does not affect Namibian citizens who are entitled to come back since they are guaranteed such rights in terms of the country's Constitution.
“However, such Namibians will be subjected to strict screening at the port of entry to ensure that they are not infected with Ebola,” Kamwi said.
Affected countries include Guinea, Liberia, Nigeria and Sierra Leone, and now Senegal and Zimbabwe, as well as the Democratic Republic of Congo.
The Ebola virus has killed more than 2,800 people since March this year.
Lockdown in Harare over Ebola scare
WILKINS Infectious Diseases Hospital was on Thursday forced to close following suspicions that one of its patients was suffering from the deadly Ebola disease.
Patients who rely on the hospital for HIV/AIDS drugs and cancer therapy were left stranded after the early morning lockdown.
Wilkins’ medical superintendent, Hilda Bara, tried in vain to pacify the restive patients, though she was clear about what was happening.
“We are managing a suspected Ebola case. The decision to shut this hospital down was made in order to protect you from contracting the disease if it is proven that indeed the patient is suffering from Ebola,” she said.
“We would like to ask you to understand the situation and to seek treatment at Beatrice hospital (Nazareth). Those who want to undergo cervical cancer screening should visit their local clinics. We have also sent our nurses to Nazareth and the officials at the institution are aware of the changes so they will assist you.”
The instruction was greeted with grumbling by the patients who expressed fears they might default on their therapy.
Others said they were left with only a day’s or a week’s supply, while some had booked to see doctors at the clinic and feared they could not get the same treatment elsewhere.
With the situation increasingly looking desperate, Health Minister David Parirenyatwa later told journalists at Parirenyatwa Hospital, where he was visiting, that there was no need to panic.
“The issue here is that of public health versus public panic. We need to curb that. Let me emphasise that there is still no Ebola in Zimbabwe and we hope there will never be,” he said. “There have been several scares and I think we need to control it by educating not only the public but also our health professionals so that they don’t get scared.
“(The patient) came to this hospital with fever and high temperature, and was vomiting and bleeding, but our doctors did malaria tests and she was positive, so she has malaria. To us it’s still a scare.
“We are sending the specimens to South Africa to test the effectiveness of our system, in case we have a case. However we are still treating the case as suspected Ebola while we wait for the results. We don’t have Ebola in Zimbabwe,” Parirenyatwa said.
While Parirenyatwa said the patient, suspected to be a Harare Polytechnic student, had first been admitted at Parirenyatwa hospital after visiting her home country, the DRC, officials at the health institution contradicted the minister.
Acting clinical director, Noah Madziva, said: “We never transferred any patient to Wilkins. All our patients are here.”
With social media abuzz with information that the patient was a student at the Harare Polytechnic and was from the central African country, city health director, Prosper Chonzi, could neither confirm nor deny the claims.
He however, confirmed the student had been under surveillance before she fell sick.
“It is highly unlikely that it is Ebola but we have a patient who is at Wilkins, someone from DRC who went to Parirenyatwa Hospital and they thought it was Ebola after she got there with a high temperature, vomiting and some nose bleed and yesterday was her 21st day in Zimbabwe. She was in Lubumbashi which is like 3 000km away from where Ebola is.
“We did rapid malaria tests and she tested positive so we are managing it as malaria but we are using this opportunity to try out the system, in case we have the virus in Zimbabwe,” Chonzi said.
He added: “We have already collected specimen samples to be tested in South Africa. The results will tell us in four hours whether it is Ebola or not, but we have so far ruled out Ebola.
“The action is only precautionary and meant to test how efficient our systems are in terms of speed and effectiveness.”
The disease has so far claimed over 3 000 lives in West Africa and has now caused panic in Spain, the United States and Australia.
Southern Africa has only had two confirmed cases of the disease in the DRC.
According to the World Health Organisation, five people are being infected every hour and the figure could double by November.
Questions have been raised over Zimbabwe’s preparedness with intermittent claims of patients having been diagnosed with the virus. All the claims have so far proved to be hoaxes.
Government Mismanagement, Corruption Risks Lives of Millions
NOVEMBER 19, 2013
(Harare) – The water and sanitation crisis in Zimbabwe’s capital, Harare, places millions of residents at risk of waterborne disease, Human Rights Watch said in a report released today. Five years after cholera killed over 4,000 people and sickened 100,000 more, the conditions that allowed the epidemic to flourish persist in Harare’s high-density suburbs.
The 60-page report, “Troubled Water: Burst Pipes, Contaminated Wells, and Open Defecation in Zimbabwe’s Capital,” describes how residents have little access to potable water and sanitation services, and often resort to drinking water from shallow, unprotected wells that are contaminated with sewage, and to defecating outdoors. The conditions violate their right to water, sanitation, and health. The report is based on research conducted in 2012 and 2013 in Harare, including 80 interviews with residents, mostly women, in eight high-density suburbs.
“Harare’s water and sanitation system is broken and the government isn’t fixing it,” said Tiseke Kasambala, Southern Africa director at Human Rights Watch. “In many communities there is no water for drinking or bathing, there is sewage in the streets, there is diarrhea and typhoid and the threat of another cholera epidemic.”
Many residents said that the lack of household water forced them to wait for water at boreholes for up to five hours a day, and that violence frequently erupted when lines were especially long. People believe these boreholes – 200 of which were drilled by international agencies during the cholera epidemic – are the safest water option available, yet one-third of boreholes tested in Harare by Harare Water, the city agency in charge of water, showed contamination.
Residents also said that the city charged them for municipal water even when the water flowed only sporadically or was contaminated. If people were unable to pay their bills, the city turned off their water supply.
Some residents described raw sewage flowing into their homes and streets from burst pipes, in which children frequently played. The water shortage and the lack of functioning indoor toilets or community latrines sometimes gave them no choice but to defecate outdoors.
One mother told Human Rights Watch: “We have one toilet for the whole house and there are 21 people who live here. The flushing system doesn’t work because there is no water, so we have to use buckets. When there isn’t any water for flushing we just use the bush.”
Corruption and mismanagement at the local and national levels of government exacerbate the situation, Human Rights Watch said. For example, city budget guidelines specify that most of the revenue from water must be ploughed back into the system for maintenance and improvement, but even government officials acknowledged that the money is diverted to other uses. As a result, key parts of the service delivery system, like purchasing water treatment chemicals, are not adequately funded, leading the city to produce less potable water.
Until the late 1980s, Zimbabwe had a functioning water system, with access to potable water for 85 percent of the population. In Harare, remnants of this system are visible in a complex piped water and sewage system to which many residents are connected. The piped infrastructure has not been maintained, however. The result of the deterioration of the system, combined with a significant increase in the population, is that the water now runs only sporadically and is often contaminated.
“Everyone has a right to access a minimum amount of potable water,” Kasambala said. “The government’s inability to maintain the water system and its practice of disconnecting those unable to pay forces people to drink water from contaminated taps or from unprotected wells.”
The government should take a number of steps to ease Harare’s water and sanitation crisis, including investing in low-cost sanitation and water strategies. These include providing community toilets and pit latrines, and drilling and maintaining boreholes so that residents do not have to rely on contaminated sources. A sliding fee scale for municipal water should be put in place to provide affordable water for low-income families, and no home should be disconnected from the city water supply for lack of payment.
Earlier in 2013, the government announced a US$144 million loan from the Chinese government, with 46 Chinese engineers coming to Harare, to upgrade the water infrastructure, primarily by improving the sewage treatment plants. While the government has promoted the loan as the solution to Harare’s water crisis, its terms have not been made public. Critics have decried the loan as exemplifying the lack of transparency and corruption in water and sewage services.
The government of Zimbabwe is obliged under international law to protect the right to water and sanitation, Human Rights Watch said. In 2010, Zimbabwe voted for a United Nations General Assembly resolution establishing the right to water and sanitation. The recognition of this right acknowledges that water and sanitation are crucial not only for health, but also for other key aspects of development, such as gender equality, education and economic growth. Zimbabwe’s constitution and domestic laws protect the right to water and, through protections concerning the environment, the right to sanitation. In May, Zimbabwe ratified a new constitution that includes an explicit right to water.
“Harare’s water and sanitation system has been destroyed by decades of neglect and by ongoing mismanagement and corruption,” Kasambala said. “The 2008 cholera epidemic was a visible catastrophe, but less visible suffering, deaths, and indignity continue.”
Monte Carlo methods (or Monte Carlo experiments) are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. They are often used in physical and mathematical problems and are most useful when it is difficult or impossible to use other mathematical methods. Monte Carlo methods are mainly used in three distinct problem classes: optimization, numerical integration, and generation of draws from a probability distribution.
In physics-related problems, Monte Carlo methods are quite useful for simulating systems with many coupled degrees of freedom, such as fluids, disordered materials, strongly coupled solids, and cellular structures (see cellular Potts model). Other examples include modeling phenomena with significant uncertainty in inputs such as the calculation of risk in business and, in math, evaluation of multidimensional definite integrals with complicated boundary conditions. In application to space and oil exploration problems, Monte Carlo–based predictions of failure, cost overruns and schedule overruns are routinely better than human intuition or alternative "soft" methods.[1]
The modern version of the Monte Carlo method was invented in the late 1940s by Stanislaw Ulam, while he was working on nuclear weapons projects at the Los Alamos National Laboratory. Immediately after Ulam's breakthrough, John von Neumann understood its importance and programmed the ENIAC computer to carry out Monte Carlo calculations.
[Figure: Monte Carlo method applied to approximating the value of π. After placing 30,000 random points, the estimate for π is within 0.07% of the actual value; this happens with an approximate probability of 20%.]
Monte Carlo methods vary, but tend to follow a particular pattern:
Define a domain of possible inputs.
Generate inputs randomly from a probability distribution over the domain.
Perform a deterministic computation on the inputs.
Aggregate the results.
For example, consider a circle inscribed in a unit square. Given that the circle and the square have a ratio of areas that is π/4, the value of π can be approximated using a Monte Carlo method:[2]
Draw a square on the ground, then inscribe a circle within it.
Uniformly scatter some objects of uniform size (grains of rice or sand) over the square.
Count the number of objects inside the circle and the total number of objects.
The ratio of the two counts is an estimate of the ratio of the two areas, which is π/4. Multiply the result by 4 to estimate π.
In this procedure the domain of inputs is the square that circumscribes our circle. We generate random inputs by scattering grains over the square then perform a computation on each input (test whether it falls within the circle). Finally, we aggregate the results to obtain our final result, the approximation of π.
If the grains are not uniformly distributed, the approximation will be poor. There must also be a large number of inputs: the approximation is generally poor if only a few grains are randomly dropped into the whole square. On average, the approximation improves as more grains are dropped.
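To make the procedure concrete, here is a minimal sketch in Python, assuming the standard random module as the source of pseudo-random inputs (the point count of 30,000 mirrors the figure above; it is otherwise an arbitrary choice):

```python
import random

def estimate_pi(num_points: int) -> float:
    """Estimate pi by scattering random points over the unit square
    and counting how many fall inside the inscribed quarter circle."""
    inside = 0
    for _ in range(num_points):
        x, y = random.random(), random.random()  # a point in [0, 1) x [0, 1)
        if x * x + y * y <= 1.0:                 # inside the quarter circle?
            inside += 1
    # The ratio of areas is pi/4, so scale the observed ratio by 4.
    return 4.0 * inside / num_points

print(estimate_pi(30_000))  # typically within about 1% of 3.14159...
```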
Before the Monte Carlo method was developed, simulations tested a previously understood deterministic problem, and statistical sampling was used to estimate uncertainties in the simulations. Monte Carlo simulations invert this approach, solving deterministic problems using a probabilistic analog (see Simulated annealing).
An early variant of the Monte Carlo method can be seen in the Buffon's needle experiment, in which π can be estimated by dropping needles on a floor made of parallel and equidistant strips. In the 1930s, Enrico Fermi first experimented with the Monte Carlo method while studying neutron diffusion, but did not publish anything on it.[3]
In 1946, physicists at Los Alamos Scientific Laboratory were investigating radiation shielding and the distance that neutrons would likely travel through various materials. Despite having most of the necessary data, such as the average distance a neutron would travel in a substance before it collided with an atomic nucleus, and how much energy the neutron was likely to give off following a collision, the Los Alamos physicists were unable to solve the problem using conventional, deterministic mathematical methods. Stanislaw Ulam had the idea of using random experiments. He recounts his inspiration as follows:
The first thoughts and attempts I made to practice [the Monte Carlo Method] were suggested by a question which occurred to me in 1946 as I was convalescing from an illness and playing solitaires. The question was what are the chances that a Canfield solitaire laid out with 52 cards will come out successfully? After spending a lot of time trying to estimate them by pure combinatorial calculations, I wondered whether a more practical method than "abstract thinking" might not be to lay it out say one hundred times and simply observe and count the number of successful plays. This was already possible to envisage with the beginning of the new era of fast computers, and I immediately thought of problems of neutron diffusion and other questions of mathematical physics, and more generally how to change processes described by certain differential equations into an equivalent form interpretable as a succession of random operations. Later [in 1946], I described the idea to John von Neumann, and we began to plan actual calculations.
Being secret, the work of von Neumann and Ulam required a code name. A colleague of von Neumann and Ulam, Nicholas Metropolis, suggested using the name Monte Carlo, which refers to the Monte Carlo Casino in Monaco where Ulam's uncle would borrow money from relatives to gamble.[3] Using lists of "truly random" random numbers was extremely slow, but von Neumann developed a way to calculate pseudorandom numbers using the middle-square method. Though this method has been criticized as crude, von Neumann was aware of this: he justified it as being faster than any other method at his disposal, and also noted that when it went awry it did so obviously, unlike methods that could be subtly incorrect.
Monte Carlo methods were central to the simulations required for the Manhattan Project, though severely limited by the computational tools at the time. In the 1950s they were used at Los Alamos for early work relating to the development of the hydrogen bomb, and became popularized in the fields of physics, physical chemistry, and operations research. The Rand Corporation and the U.S. Air Force were two of the major organizations responsible for funding and disseminating information on Monte Carlo methods during this time, and they began to find a wide application in many different fields.
Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers that had been previously used for statistical sampling.
There is no consensus on how Monte Carlo should be defined. For example, Ripley[5] defines most probabilistic modeling as stochastic simulation, with Monte Carlo being reserved for Monte Carlo integration and Monte Carlo statistical tests. Sawilowsky[6] distinguishes between a simulation, a Monte Carlo method, and a Monte Carlo simulation: a simulation is a fictitious representation of reality, a Monte Carlo method is a technique that can be used to solve a mathematical or statistical problem, and a Monte Carlo simulation uses repeated sampling to determine the properties of some phenomenon (or behavior). Examples:
Simulation: Drawing one pseudo-random uniform variable from the interval [0,1] can be used to simulate the tossing of a coin: If the value is less than or equal to 0.50, designate the outcome as heads; if the value is greater than 0.50, designate the outcome as tails. This is a simulation, but not a Monte Carlo simulation.
Monte Carlo method: Pouring out a box of coins on a table, and then computing the ratio of coins that land heads versus tails is a Monte Carlo method of determining the behavior of repeated coin tosses, but it is not a simulation.
Monte Carlo simulation: Drawing a large number of pseudo-random uniform variables from the interval [0,1], and assigning values less than or equal to 0.50 as heads and greater than 0.50 as tails, is a Monte Carlo simulation of the behavior of repeatedly tossing a coin.
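The third case can be sketched directly in Python (the toss count and the 0.50 threshold follow the description above; everything else is an illustrative choice):

```python
import random

def monte_carlo_coin(num_tosses: int) -> float:
    """Monte Carlo simulation of repeated coin tossing: draw pseudo-random
    uniform variables on [0, 1) and map values <= 0.50 to heads."""
    heads = sum(1 for _ in range(num_tosses) if random.random() <= 0.5)
    return heads / num_tosses  # estimated probability of heads

print(monte_carlo_coin(100_000))  # converges toward 0.5 as tosses grow
```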
Kalos and Whitlock[2] point out that such distinctions are not always easy to maintain. For example, the emission of radiation from atoms is a natural stochastic process. It can be simulated directly, or its average behavior can be described by stochastic equations that can themselves be solved using Monte Carlo methods. "Indeed, the same computer code can be viewed simultaneously as a 'natural simulation' or as a solution of the equations by natural sampling."
Monte Carlo simulation methods do not always require truly random numbers to be useful, although for some applications, such as primality testing, unpredictability is vital.[7] Many of the most useful techniques use deterministic, pseudorandom sequences, making it easy to test and re-run simulations. The only quality usually necessary to make good simulations is for the pseudo-random sequence to appear "random enough" in a certain sense.
What this means depends on the application, but typically the sequence should pass a series of statistical tests. One of the simplest and most common checks is testing that the numbers are uniformly distributed, or follow another desired distribution, when a large enough number of elements of the sequence are considered. Weak correlations between successive samples are also often desirable or necessary.
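As an illustration of such a test, here is a rough chi-square check for uniformity; the bin count, sample size, and the use of Python's own random module are arbitrary choices for this sketch, not a prescribed standard:

```python
import random

def chi_square_uniformity(samples, num_bins=10):
    """Chi-square statistic for the hypothesis that samples on [0, 1)
    are uniformly distributed across equal-width bins."""
    counts = [0] * num_bins
    for s in samples:
        counts[min(int(s * num_bins), num_bins - 1)] += 1
    expected = len(samples) / num_bins
    return sum((c - expected) ** 2 / expected for c in counts)

samples = [random.random() for _ in range(10_000)]
stat = chi_square_uniformity(samples)
# For 9 degrees of freedom the 5% critical value is about 16.9;
# a much larger statistic would cast doubt on uniformity.
print(stat)
```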
Sawilowsky lists the characteristics of a high quality Monte Carlo simulation:[6]
the (pseudo-random) number generator has certain characteristics (e.g., a long "period" before the sequence repeats)
the (pseudo-random) number generator produces values that pass tests for randomness
there are enough samples to ensure accurate results
the proper sampling technique is used
the algorithm used is valid for what is being modeled
Low-discrepancy sequences are often used instead of random sampling from a space as they ensure even coverage and normally have a faster order of convergence than Monte Carlo simulations using random or pseudorandom sequences. Methods based on their use are called quasi-Monte Carlo methods.
Monte Carlo simulation versus "what if" scenarios
There are ways of using probabilities that are definitely not Monte Carlo simulations, for example, deterministic modeling using single-point estimates. Each uncertain variable within a model is assigned a “best guess” estimate. Scenarios (such as best, worst, or most likely case) for each input variable are chosen and the results recorded.[8]
By contrast, Monte Carlo simulations sample from a probability distribution for each variable to produce hundreds or thousands of possible outcomes. The results are analyzed to get probabilities of different outcomes occurring.[9] For example, a comparison of a spreadsheet cost construction model run using traditional “what if” scenarios, and then run again with Monte Carlo simulation and triangular probability distributions, shows that the Monte Carlo analysis has a narrower range than the “what if” analysis. This is because the “what if” analysis gives equal weight to all scenarios (see quantifying uncertainty in corporate finance), while the Monte Carlo method rarely samples in the very low probability regions. The samples in such regions are called "rare events".
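A hypothetical cost model can make the contrast concrete. In this sketch the three cost items and their triangular parameters are invented purely for illustration:

```python
import random

# Three uncertain cost items, each as (low, most likely, high).
ITEMS = [(8, 10, 15), (4, 5, 9), (1, 2, 4)]

# "What if" analysis: only a handful of hand-picked scenarios.
best = sum(low for low, mode, high in ITEMS)     # 13
worst = sum(high for low, mode, high in ITEMS)   # 28
likely = sum(mode for low, mode, high in ITEMS)  # 17
print("what-if range:", best, "to", worst, "| most likely:", likely)

# Monte Carlo: sample each item from a triangular distribution.
totals = sorted(
    sum(random.triangular(low, high, mode) for low, mode, high in ITEMS)
    for _ in range(10_000)
)
# The middle 90% of simulated outcomes is typically much narrower than
# the what-if extremes, because jointly extreme draws are rare events.
print("5th-95th percentile:", totals[500], "to", totals[9500])
```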
Monte Carlo methods are especially useful for simulating phenomena with significant uncertainty in inputs and systems with a large number of coupled degrees of freedom. Areas of application include:
Monte Carlo methods are widely used in engineering for sensitivity analysis and quantitative probabilistic analysis in process design. The need arises from the interactive, co-linear and non-linear behavior of typical process simulations. For example,
In wind energy yield analysis, the predicted energy output of a wind farm during its lifetime is calculated giving different levels of uncertainty (P90, P50, etc.)
impacts of pollution are simulated[16] and diesel compared with petrol.[17]
In telecommunications, when planning a wireless network, design must be proved to work for a wide variety of scenarios that depend mainly on the number of users, their locations and the services they want to use. Monte Carlo methods are typically used to generate these users and their states. The network performance is then evaluated and, if results are not satisfactory, the network design goes through an optimization process.
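A toy version of that workflow might look like the following sketch, where the map size, station sites, coverage radius and user counts are all invented numbers:

```python
import math
import random

BASE_STATIONS = [(2.0, 2.0), (7.0, 7.0)]  # fixed sites on a 10 x 10 km map
RADIUS_KM = 4.0                           # assumed coverage radius

def simulate_coverage(num_users: int) -> float:
    """Scatter users uniformly over the map and report the fraction
    that fall within range of at least one base station."""
    covered = 0
    for _ in range(num_users):
        x, y = random.uniform(0, 10), random.uniform(0, 10)
        if any(math.hypot(x - bx, y - by) <= RADIUS_KM
               for bx, by in BASE_STATIONS):
            covered += 1
    return covered / num_users

# Average over many random user populations; if the result is
# unsatisfactory, the station placement would be re-optimized.
print(sum(simulate_coverage(1_000) for _ in range(100)) / 100)
```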
Monte Carlo methods are used in various fields of computational biology, for example for Bayesian inference in phylogeny, or for studying biological systems such as genomes, proteins,[20] or membranes.[21] The systems can be studied in the coarse-grained or ab initio frameworks depending on the desired accuracy. Computer simulations allow us to monitor the local environment of a particular molecule to see if some chemical reaction is happening for instance. We can also conduct thought experiments when the physical experiments are not feasible, for instance breaking bonds, introducing impurities at specific sites, changing the local/global structure, or introducing external fields.
Path Tracing, occasionally referred to as Monte Carlo Ray Tracing, renders a 3D scene by randomly tracing samples of possible light paths. Repeated sampling of any given pixel will eventually cause the average of the samples to converge on the correct solution of the rendering equation, making it one of the most physically accurate 3D graphics rendering methods in existence.
In applied statistics, Monte Carlo methods are generally used for two purposes:
To compare competing statistics for small samples under realistic data conditions. Although Type I error and power properties of statistics can be calculated for data drawn from classical theoretical distributions (e.g., normal curve, Cauchy distribution) for asymptotic conditions (i.e., infinite sample size and infinitesimally small treatment effect), real data often do not have such distributions.[22]
To provide implementations of hypothesis tests that are more efficient than exact tests such as permutation tests (which are often impossible to compute) while being more accurate than critical values for asymptotic distributions.
Monte Carlo methods are also a compromise between approximate randomization and permutation tests. An approximate randomization test is based on a specified subset of all permutations (which entails potentially enormous housekeeping of which permutations have been considered). The Monte Carlo approach is based on a specified number of randomly drawn permutations (exchanging a minor loss in precision, if a permutation is drawn twice or more frequently, for the efficiency of not having to track which permutations have already been selected).
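A sketch of the Monte Carlo variant for a two-sample test of means (the data are invented; note that random relabelings may repeat, which is exactly the trade-off described above):

```python
import random

def monte_carlo_permutation_test(a, b, num_draws=10_000):
    """Approximate the permutation-test p-value for a difference in means
    by randomly re-assigning group labels num_draws times."""
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    extreme = 0
    for _ in range(num_draws):
        random.shuffle(pooled)                  # one random relabeling
        new_a, new_b = pooled[:len(a)], pooled[len(a):]
        diff = abs(sum(new_a) / len(new_a) - sum(new_b) / len(new_b))
        if diff >= observed:
            extreme += 1
    return extreme / num_draws                  # estimated p-value

a = [12.1, 9.8, 11.4, 10.9, 12.7]
b = [9.2, 10.1, 8.8, 9.9, 10.4]
print(monte_carlo_permutation_test(a, b))
```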
Monte Carlo methods have been developed into a technique called Monte Carlo tree search that is useful for searching for the best move in a game. Possible moves are organized in a search tree and a large number of random simulations are used to estimate the long-term potential of each move. A black box simulator represents the opponent's moves.[23]
The Monte Carlo Tree Search (MCTS) method has four steps:[24]
Starting at the root node of the tree, select optimal child nodes until a leaf node is reached.
Expand the leaf node and choose one of its children.
Play a simulated game starting with that node.
Use the results of that simulated game to update the node and its ancestors.
The net effect, over the course of many simulated games, is that the value of a node representing a move will go up or down, hopefully corresponding to whether or not that node represents a good move.
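The four steps can be sketched compactly for a toy game; the game itself, the iteration count, and the UCT exploration constant below are illustrative choices, not part of the method's definition:

```python
import math
import random

# Toy game: players 0 and 1 alternately take 1 or 2 stones; taking the
# last stone wins. A state is (stones_left, player_to_move).

def legal_moves(state):
    stones, _ = state
    return [m for m in (1, 2) if m <= stones]

def apply_move(state, move):
    stones, player = state
    return (stones - move, 1 - player)

def winner(state):
    stones, player = state
    return (1 - player) if stones == 0 else None  # previous mover won

class Node:
    def __init__(self, state, parent=None):
        self.state, self.parent = state, parent
        self.children = {}  # move -> child Node
        self.visits = 0
        self.wins = 0.0     # from the viewpoint of the player who moved into this state

def uct_select(node):
    # Select the child maximizing the UCT score (exploitation + exploration).
    return max(node.children.values(),
               key=lambda c: c.wins / c.visits
               + math.sqrt(2 * math.log(node.visits) / c.visits))

def mcts(root_state, iterations=5_000):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # 1. Selection: descend while the node is fully expanded.
        while node.children and len(node.children) == len(legal_moves(node.state)):
            node = uct_select(node)
        # 2. Expansion: add one unexplored child, if the game is not over.
        untried = [m for m in legal_moves(node.state) if m not in node.children]
        if untried:
            m = random.choice(untried)
            node.children[m] = Node(apply_move(node.state, m), parent=node)
            node = node.children[m]
        # 3. Simulation: play random moves to the end of the game.
        state = node.state
        while winner(state) is None:
            state = apply_move(state, random.choice(legal_moves(state)))
        result = winner(state)
        # 4. Backpropagation: update the node and its ancestors.
        while node is not None:
            node.visits += 1
            if result == 1 - node.state[1]:  # the player who moved into this node
                node.wins += 1
            node = node.parent
    # Recommend the most-visited move from the root.
    return max(root.children.items(), key=lambda kv: kv[1].visits)[0]

print(mcts((10, 0)))  # should prefer taking 1, leaving 9 (a losing count)
```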
Monte Carlo methods are also efficient in solving coupled integral differential equations of radiation fields and energy transport, and thus these methods have been used in global illumination computations that produce photo-realistic images of virtual 3D models, with applications in video games, architecture, design, computer generated films, and cinematic special effects.[30]
In general, Monte Carlo methods are used in mathematics to solve various problems by generating suitable random numbers (see also Random number generation) and observing that fraction of the numbers that obeys some property or properties. The method is useful for obtaining numerical solutions to problems too complicated to solve analytically. The most common application of the Monte Carlo method is Monte Carlo integration.
[Figure: Monte Carlo integration works by comparing random points with the value of the function.]
[Figure: Errors reduce by a factor of 1/√N.]
Deterministic numerical integration algorithms work well in a small number of dimensions, but encounter two problems when the functions have many variables. First, the number of function evaluations needed increases rapidly with the number of dimensions. For example, if 10 evaluations provide adequate accuracy in one dimension, then 10^100 points are needed for 100 dimensions, far too many to be computed. This is called the curse of dimensionality. Second, the boundary of a multidimensional region may be very complicated, so it may not be feasible to reduce the problem to a series of nested one-dimensional integrals.[31] 100 dimensions is by no means unusual, since in many physical problems a "dimension" is equivalent to a degree of freedom.
Monte Carlo methods provide a way out of this exponential increase in computation time. As long as the function in question is reasonably well-behaved, it can be estimated by randomly selecting points in 100-dimensional space and taking some kind of average of the function values at these points. By the central limit theorem, this method displays 1/√N convergence, i.e., quadrupling the number of sampled points halves the error, regardless of the number of dimensions.[31]
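A sketch of the basic estimator over the 100-dimensional unit hypercube (the integrand here is an arbitrary smooth function chosen for illustration; since the hypercube has volume 1, the integral is simply the mean of the sampled values):

```python
import math
import random

DIM = 100

def f(x):
    """An illustrative well-behaved integrand on the unit hypercube."""
    return math.exp(-sum(xi * xi for xi in x) / DIM)

def mc_integrate(num_samples: int) -> float:
    """Estimate the integral of f over [0, 1]^DIM as the average of f
    at uniformly random points (the hypercube's volume is 1)."""
    total = 0.0
    for _ in range(num_samples):
        x = [random.random() for _ in range(DIM)]
        total += f(x)
    return total / num_samples

# Error shrinks like 1/sqrt(N): quadrupling samples roughly halves it.
print(mc_integrate(10_000))
```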
A refinement of this method, known as importance sampling in statistics, involves sampling the points randomly, but more frequently where the integrand is large. To do this precisely one would have to already know the integral, but one can approximate the integral by an integral of a similar function or use adaptive routines such as stratified sampling, recursive stratified sampling, adaptive umbrella sampling[32][33] or the VEGAS algorithm.
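In one dimension the idea can be sketched as follows; the integrand x³ and the proposal density g(x) = 2x are invented for illustration, with g sampled by inverse transform:

```python
import random

def f(x):
    return x ** 3   # integrand; its exact integral on [0, 1] is 0.25

def plain_mc(n):
    """Uniform sampling: many draws land where f is tiny."""
    return sum(f(random.random()) for _ in range(n)) / n

def importance_mc(n):
    """Sample from g(x) = 2x (larger where f is larger) via the inverse
    transform x = sqrt(u), then weight each draw by f(x)/g(x)."""
    total = 0.0
    for _ in range(n):
        x = (1.0 - random.random()) ** 0.5  # draw from g; x in (0, 1]
        total += f(x) / (2 * x)             # importance weight removes the bias
    return total / n

# Both estimates are near 0.25, but the importance-sampled one
# has lower variance because more samples land where f is large.
print(plain_mc(10_000), importance_mc(10_000))
```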
A similar approach, the quasi-Monte Carlo method, uses low-discrepancy sequences. These sequences "fill" the area better and sample the most important points more frequently, so quasi-Monte Carlo methods can often converge on the integral more quickly.
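A sketch using the van der Corput sequence, one of the simplest low-discrepancy constructions, to redo the earlier π estimate with a two-dimensional Halton-style point set (bases 2 and 3):

```python
def van_der_corput(n: int, base: int = 2) -> float:
    """n-th element of the van der Corput low-discrepancy sequence:
    reflect the base-`base` digits of n about the radix point."""
    q, denom = 0.0, 1.0
    while n:
        n, digit = divmod(n, base)
        denom *= base
        q += digit / denom
    return q

def quasi_mc_pi(num_points: int) -> float:
    """Estimate pi with a 2-D point set (bases 2 and 3), which covers
    the unit square more evenly than pseudo-random points."""
    inside = 0
    for i in range(1, num_points + 1):
        x, y = van_der_corput(i, 2), van_der_corput(i, 3)
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / num_points

print(quasi_mc_pi(30_000))  # typically closer to pi than plain Monte Carlo
```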
Another powerful and very popular application for random numbers in numerical simulation is in numerical optimization. The problem is to minimize (or maximize) functions of some vector that often has a large number of dimensions. Many problems can be phrased in this way: for example, a computer chess program could be seen as trying to find the set of, say, 10 moves that produces the best evaluation function at the end. In the traveling salesman problem the goal is to minimize distance traveled. There are also applications to engineering design, such as multidisciplinary design optimization. Monte Carlo methods have also been applied to quasi-one-dimensional particle dynamics simulation models in order to explore large configuration spaces efficiently.
The traveling salesman problem is what is called a conventional optimization problem. That is, all the facts (distances between each destination point) needed to determine the optimal path to follow are known with certainty, and the goal is to run through the possible travel choices to come up with the one with the lowest total distance. However, suppose that instead of wanting to minimize the total distance traveled to visit each desired destination, we wanted to minimize the total time needed to reach each destination. This goes beyond conventional optimization, since travel time is inherently uncertain (traffic jams, time of day, etc.). As a result, to determine our optimal path we would want to use simulation-optimization: first understand the range of potential times it could take to go from one point to another (represented by a probability distribution in this case rather than a specific distance), and then optimize our travel decisions to identify the best path to follow, taking that uncertainty into account.
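A sketch of that simulation-optimization idea on a tiny instance (the four cities and the triangular travel-time parameters are invented; a real problem would use a heuristic search rather than enumerating every route):

```python
import itertools
import random

# Hypothetical travel-time model: each leg (i, j) has a triangular
# distribution (optimistic, typical, pessimistic), in minutes.
CITIES = range(4)
TIMES = {(i, j): (10 + 3 * abs(i - j), 15 + 4 * abs(i - j), 30 + 6 * abs(i - j))
         for i in CITIES for j in CITIES if i != j}

def simulated_route_time(route, num_samples=200):
    """Average total time over random draws of each leg's travel time."""
    legs = list(zip(route, route[1:]))
    total = 0.0
    for _ in range(num_samples):
        total += sum(random.triangular(lo, hi, mode)
                     for lo, mode, hi in (TIMES[leg] for leg in legs))
    return total / num_samples

# Simulation-optimization by exhaustive search over this tiny instance:
# simulate each candidate route starting from city 0, then keep the one
# with the lowest expected (simulated) time.
best = min((perm for perm in itertools.permutations(CITIES) if perm[0] == 0),
           key=simulated_route_time)
print(best, round(simulated_route_time(best), 1), "minutes expected")
```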
Probabilistic formulation of inverse problems leads to the definition of a probability distribution in the model space. This probability distribution combines prior information with new information obtained by measuring some observable parameters (data). As, in the general case, the theory linking data with model parameters is nonlinear, the posterior probability in the model space may not be easy to describe (it may be multimodal, some moments may not be defined, etc.).
When analyzing an inverse problem, obtaining a maximum likelihood model is usually not sufficient, as we normally also wish to have information on the resolution power of the data. In the general case we may have a large number of model parameters, and an inspection of the marginal probability densities of interest may be impractical, or even useless. But it is possible to pseudorandomly generate a large collection of models according to the posterior probability distribution and to analyze and display the models in such a way that information on the relative likelihoods of model properties is conveyed to the spectator. This can be accomplished by means of an efficient Monte Carlo method, even in cases where no explicit formula for the a priori distribution is available.
The best-known importance sampling method, the Metropolis algorithm, can be generalized, and this gives a method that allows analysis of (possibly highly nonlinear) inverse problems with complex a priori information and data with an arbitrary noise distribution.[34][35]
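A minimal random-walk Metropolis sampler for a one-parameter inverse problem might look like this; the forward model g(m) = m², the data values, the noise level and the prior bounds are all invented for illustration:

```python
import math
import random

# Invented setup: forward model g(m) = m**2, Gaussian noise, and three
# observed data values generated by some true m near 2.
DATA = [3.8, 4.3, 4.1]
SIGMA = 0.5  # assumed noise standard deviation

def log_posterior(m: float) -> float:
    """Log of (uniform prior on [0, 5]) times (Gaussian likelihood)."""
    if not 0.0 <= m <= 5.0:
        return -math.inf  # outside the prior support
    return -sum((d - m * m) ** 2 for d in DATA) / (2 * SIGMA ** 2)

def metropolis(num_steps=50_000, step=0.2):
    """Random-walk Metropolis: propose m' = m + N(0, step); accept with
    probability min(1, posterior(m') / posterior(m))."""
    m, samples = 1.0, []
    lp = log_posterior(m)
    for _ in range(num_steps):
        m_new = m + random.gauss(0.0, step)
        lp_new = log_posterior(m_new)
        # Compare on the log scale (tiny epsilon guards against log(0)).
        if math.log(random.random() + 1e-300) < lp_new - lp:
            m, lp = m_new, lp_new
        samples.append(m)
    return samples

samples = metropolis()[5_000:]      # discard burn-in
print(sum(samples) / len(samples))  # posterior mean, near 2.0 here
```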
Monte Carlo methods are very popular in hydrocarbon reservoir management in the context of nonlinear inverse problems. This includes generating computational models of oil and gas reservoirs for consistency with observed production data. For the goal of decision making and uncertainty assessment, Monte Carlo methods are used for generating multiple geological realizations.....