New Legislation Poised to Expand Computer Science

New legislation headed to Connecticut Gov. Ned Lamont’s desk aims to expand computer science education in schools across the state, in a bid to equip students with much-needed skills for tomorrow.

The question “Why do you think students should study computer science?” sparked a discussion in Chinma Uche’s class at the CREC Academy of Aerospace and Engineering in Windsor on Thursday.

Having all taken computer science themselves, the students saw it as a necessary skill for other students to acquire. “I think computer science should be in all schools because computers are everywhere in this world,” said student Nya Bentley.

“Computer science seems to be becoming a new fundamental skill. You learn, read, write, and soon code,” said Adittya Patil, a student in Uche’s class. The legislation would expand access to computer science from kindergarten through 12th grade in schools throughout the state.

Chinma Uche, a math and computer science teacher, says she has seen interest grow among her students. “One of the things I saw in my class was students doing what they like and learning while doing it,” she said.

She says that there is a high demand for people with computer science skills. “I have students who graduated from our school and have gone ahead to work for big industries. Some have gone to work on projects for Apple, Google, just because they had a chance to take computer science in this school,” she said.

One of the legislation’s main goals is to introduce students to computer science early. Shannon Marimon, executive director of the Connecticut Council for Education Reform, played a vital role in pushing the bill.

Source: https://www.wfsb.com/news/new-legislation-will-expand-computer-science-education/article_84d261c0-8dfe-11e9-a6a0-83dfcadfd91c.html

 


The Key to Computer Science Is Teachers, Not Online Courses, Says Governor

Having a teacher in the classroom teaching computer science, rather than relying on online courses, is crucial to building students’ interest in the subject, according to Gov. Asa Hutchinson. He made the remarks during an event on Monday, June 10, celebrating Arkansas’ leadership in computer science.

The event brought together educators from 30 states, plus the governors of South Carolina and Iowa. During his 2014 race, Hutchinson vowed to introduce computer science in every school. His inspiration came from a project by his granddaughter, who built an app for his campaign.

The proposal became law during the 2015 legislative session. The law requires schools to offer computer science for either a math or a science credit. To support this, the state provides $5 million every two years, which covers cash prizes, teacher training, and grants for equipment, among other things.

Since the law’s passage, the number of students taking the course has risen from 1,100 to 8,000, while the number of teachers teaching the discipline has grown from 20 to 370. Sixty-three percent of Arkansas schools have a student taking the course, compared with 35% of schools in other states that offer it.

This embrace of the course has made Arkansas a leader in the student coding movement. Anthony Owen, the state’s director of computer science, who oversees the effort, was also in attendance.

Despite the success, 37% of Arkansas schools do not yet offer the course because no students have expressed interest. Hutchinson said it is students’ belief that they cannot succeed in computer science that keeps them from pursuing the discipline.

Source: https://talkbusiness.net/2019/06/governor-teachers-not-online-course-the-key-to-computer-science-growth/

 

Scientific reproducibility does not equate to scientific truth, mathematical model finds

According to a mathematical model produced by a team from the University of Idaho, reproducible scientific results are not always true, and true scientific results are not always reproducible.

Researchers investigated the relationship between reproducibility and the discovery of scientific truths by building a mathematical model that represents a scientific community working toward finding a scientific truth. In each simulation, the scientists are asked to identify the shape of a specific polygon.

The modeled scientific community included multiple scientist types, each with a different research strategy, such as performing highly innovative experiments or simple replication experiments. Berna Devezer and her colleagues studied whether factors like the makeup of the community, the complexity of the polygon and the rate of reproducibility influenced how quickly the community settled on the true polygon shape as the scientific consensus, and how persistently that consensus held.
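
The paper’s actual simulations are considerably more elaborate, but the general idea can be sketched in a few lines of code. The toy script below, with invented agent counts, strategy accuracies and consensus rule (none of it taken from the published model), has a community of simulated scientists repeatedly running noisy experiments and tracks how long it takes a majority to land on the true hypothesis.

```python
# Toy agent-based sketch in the spirit of the model described above. All
# numbers, strategy names and rules are illustrative assumptions, not the
# published model.
import random

def time_to_consensus(n_scientists=20, n_hypotheses=5, share_innovators=0.5,
                      seed=0, max_steps=10_000):
    accuracy = {"innovator": 0.70, "replicator": 0.55}   # chance an experiment points to the truth
    rng = random.Random(seed)
    truth = 0                                            # index of the true hypothesis (the "true polygon")
    strategies = ["innovator" if i < share_innovators * n_scientists else "replicator"
                  for i in range(n_scientists)]
    beliefs = [rng.randrange(n_hypotheses) for _ in range(n_scientists)]
    for step in range(1, max_steps + 1):
        i = rng.randrange(n_scientists)                  # a random scientist runs an experiment
        if rng.random() < accuracy[strategies[i]]:
            beliefs[i] = truth                           # the experiment supports the truth...
        else:
            beliefs[i] = rng.randrange(n_hypotheses)     # ...or a random (possibly wrong) hypothesis
        if sum(b == truth for b in beliefs) * 2 > n_scientists:
            return step                                  # a majority now holds the true hypothesis
    return max_steps

def average_time(share, trials=50):
    return sum(time_to_consensus(share_innovators=share, seed=s) for s in range(trials)) / trials

for share in (0.9, 0.5, 0.1):
    print(f"innovator share {share:.1f}: ~{average_time(share):.0f} steps to a majority on the truth")
```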

Within the model, the rate of reproducibility did not always track the probability of identifying the truth, how quickly the community identified it, or whether the community stuck with it once identified. These findings indicate reproducible results are not synonymous with finding the truth, Devezer said.

Compared to other research strategies, highly innovative research tactics resulted in a quicker discovery of the truth.

“We found that, within the model, some research strategies that lead to reproducible results could actually slow down the scientific process, meaning reproducibility may not always be the best — or at least the only — indicator of good science,” said Erkan Buzbas, U of I assistant professor in the College of Science, Department of Statistical Science and a co-author on the paper. “Insisting on reproducibility as the only criterion might have undesirable consequences for scientific progress.”

Reference: https://www.sciencedaily.com/releases/2019/05/190515144008.htm

 

Statistical model could predict future disease outbreaks

A team from the University of Georgia has created a statistical method that may allow public health and infectious disease forecasters to better predict disease re-emergence.

In recent years, the reemergence of measles, mumps, polio, whooping cough and other vaccine-preventable diseases has sparked a refocus on emergency preparedness.

The researchers focused on “critical slowing down,” or the loss of stability that occurs in a system as a tipping point is reached. This slowing down can result from pathogen evolution, changes in contact rates of infected individuals, and declines in vaccination. All these changes may affect the spread of a disease, but they often take place gradually and without much consequence until a tipping point is crossed.

“We saw a need to improve the ways of measuring how well-controlled a disease is, which can be difficult to do in a very complex system, especially when we observe a small fraction of the true number of cases that occur,” said Eamon O’Dea, a postdoctoral researcher focusing on disease ecology in John Drake’s laboratory at the University of Georgia.

The team created a visualization that looks like a series of bowls with balls rolling in them. In the model, vaccine coverage affects the shallowness of the bowl and the speed of the ball rolling in it.
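
The bowl metaphor corresponds to a standard picture in dynamical systems: a state pulled back toward equilibrium with some restoring strength while being constantly perturbed by noise. The sketch below is only an illustrative toy, not the study’s model; it simulates such a process and shows how variance and lag-1 autocorrelation, two common early-warning statistics of critical slowing down, grow as the restoring strength weakens (a shallower bowl, e.g. falling vaccine coverage).

```python
# Minimal sketch of critical slowing down using the ball-in-a-bowl picture:
# x is pulled back toward 0 with strength k and kicked by noise. Smaller k
# (a shallower bowl) gives larger, more sluggish fluctuations.
import random
import statistics

def simulate(k, steps=20_000, noise=1.0, seed=1):
    rng = random.Random(seed)
    x, series = 0.0, []
    for _ in range(steps):
        x += -k * x + rng.gauss(0.0, noise)   # restoring force plus a random kick
        series.append(x)
    return series

def lag1_autocorrelation(series):
    mean = statistics.fmean(series)
    num = sum((a - mean) * (b - mean) for a, b in zip(series, series[1:]))
    den = sum((a - mean) ** 2 for a in series)
    return num / den

for k in (0.5, 0.1, 0.02):                    # progressively shallower bowls
    s = simulate(k)
    print(f"k = {k:0.2f}: variance = {statistics.pvariance(s):6.1f}, "
          f"lag-1 autocorrelation = {lag1_autocorrelation(s):0.3f}")
```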

The researchers said that, very often, the conceptual side of science is not emphasized as much as it should be, and that they were pleased to find the right visuals to help others understand the science.

If a computer model of a particular disease was sufficiently detailed and accurate, it would be possible to predict the course of an outbreak using simulation, researchers say.

“But if you don’t have a good model, as is often the case, then the statistics of critical slowing down might still give us early warning of an outbreak.”

Reference: https://www.sciencedaily.com/releases/2019/05/190521124653.htm

Mathematicians revive abandoned approach to Riemann Hypothesis

Over the last 50 years, there have been many proposed approaches to the Riemann Hypothesis, but none of them has led to conquering the most famous open problem in mathematics. A new paper in the Proceedings of the National Academy of Sciences (PNAS) builds on the work of Johan Jensen and George Pólya, two of the most important mathematicians of the 20th century. It reveals a method to calculate the Jensen-Pólya polynomials — a formulation of the Riemann Hypothesis — not one at a time, but all at once.
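
For readers unfamiliar with the objects involved, the classical Jensen-Pólya criterion (stated here in its standard textbook form, paraphrased rather than quoted from the new paper) attaches to a sequence $\alpha(0), \alpha(1), \dots$ the Jensen polynomial of degree $d$ and shift $n$,

$$ J_{\alpha}^{d,n}(X) \;=\; \sum_{j=0}^{d} \binom{d}{j}\,\alpha(n+j)\,X^{j}, $$

and asserts that the Riemann Hypothesis is equivalent to every $J_{\gamma}^{d,n}$ having only real roots, where the coefficients $\gamma(n)$ come from the Taylor expansion $(-1+4z^{2})\,\Lambda\!\left(\tfrac{1}{2}+z\right)=\sum_{n\ge 0}\frac{\gamma(n)}{n!}\,z^{2n}$ of the completed zeta function $\Lambda$.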

Although the paper falls short of proving the Riemann Hypothesis, its consequences include previously open assertions which are known to follow from the Riemann Hypothesis, as well as some proofs of conjectures in other fields.

The idea for the paper was sparked two years ago by a “toy problem” that co-author Ken Ono presented as a “gift” to entertain Don Zagier during the lead-up to a math conference celebrating Zagier’s 65th birthday. A toy problem is a scaled-down version of a bigger, more complicated problem that mathematicians are trying to solve.

The hypothesis is a vehicle to understand one of the greatest mysteries in number theory — the pattern underlying prime numbers. Although prime numbers are simple objects defined in elementary math (any whole number greater than 1 with no positive divisors other than 1 and itself), their distribution remains hidden.

For the PNAS paper, the authors devised a conceptual framework that combines the polynomials by degrees. This method enabled them to confirm the criterion for each degree 100 percent of the time, eclipsing the handful of cases that were previously known.

The results do not rule out the possibility that the Riemann Hypothesis is false, and the authors believe a complete proof of the famous conjecture is still far off.

Reference: https://www.sciencedaily.com/releases/2019/05/190521162441.htm

 

Better together: human and robot co-workers

Many processes are currently being automated and digitised. Self-driving delivery vehicles, such as driverless forklifts, are finding their way into many areas, and many companies report potential time and cost savings.

However, an interdisciplinary research team from the universities of Göttingen, Duisburg-Essen and Trier has observed that cooperation between humans and machines can work much better than teams made up of humans or robots alone. The results were published in The International Journal of Advanced Manufacturing Technology.

The research team simulated a process from production logistics, such as the typical supply of materials for use in the car or engineering industries. A team of human drivers, a team of robots and a mixed team of humans and robots were assigned transport tasks using vehicles, and the time they needed was measured. The result was that the mixed team of humans and robots beat the other teams: its coordination of processes was the most efficient and caused the fewest accidents. This was quite unexpected, as the highest levels of efficiency are often assumed to belong to systems that are completely automated.

“This brings a crucial ray of hope when considering efficiency in all discussions involving automation and digitisation,” says the first author of the study, Professor Matthias Klumpp from the University of Göttingen.

The researchers from the various disciplines of business administration, computer science and sociology of work and industry highlighted the requirements for successful human-machine interaction. In many corporate and business situations, decisions will continue to be driven by people.

In conclusion, researchers say that companies should pay more attention to their employees in the technical implementation of automation.

Reference: https://www.sciencedaily.com/releases/2019/05/190524113529.htm

 

Exploring the Mathematical Universe

A team of mathematicians from 12 countries has begun charting the terrain of rich, new mathematical worlds. The mathematical universe is filled with both familiar and exotic items, many of which are now catalogued in the “L-functions and Modular Forms Database,” abbreviated LMFDB — a sophisticated web interface that allows both experts and amateurs to easily navigate its contents.

According to Benedict Gross, an emeritus professor of mathematics at Harvard University, “Number theory is a subject that is as old as written history itself. Throughout its development, numerical computations have proved critical to discoveries, including the prime number theorem, and more recently, the conjecture of Birch and Swinnerton-Dyer on elliptic curves. The LMFDB pulls together all of the amazing computations that have been done with these objects.”

Prime numbers have fascinated mathematicians throughout the ages. Their distribution is believed to behave randomly, but proving this remains beyond the grasp of mathematicians to date. The distribution of primes is intimately related to the Riemann zeta function, which is the simplest example of an L-function. The LMFDB contains more than twenty million L-functions, each of which has an analogous Riemann hypothesis that is believed to govern the distribution of a wide range of more exotic mathematical objects. Patterns found in the study of these L-functions also arise in complex quantum systems, and there is conjectured to be a direct connection to quantum physics.
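
For reference, the Riemann zeta function mentioned above is defined for $\mathrm{Re}(s) > 1$ by

$$ \zeta(s) \;=\; \sum_{n=1}^{\infty} \frac{1}{n^{s}} \;=\; \prod_{p\ \text{prime}} \frac{1}{1 - p^{-s}}, $$

and the Riemann hypothesis asserts that every non-trivial zero of its analytic continuation has real part $\tfrac{1}{2}$; the Euler product on the right is what ties the zeta function, and L-functions more generally, to the primes.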

A recent contribution by Andrew Sutherland at MIT used 72,000 cores of Google’s Compute Engine to complete in one weekend a tabulation that would have taken more than a century on a single computer. The application of large-scale cloud computing to research in pure mathematics is just one of the ways in which the project is pushing forward the frontier of mathematics.

Reference: https://www.sciencedaily.com/releases/2016/05/160510084152.htm

 

Supercomputing For a Superproblem: A Computational Journey into Pure Mathematics

One of the world’s most respected mathematicians, known for solving one of the subject’s most challenging problems, has published his latest work as a University of Leicester research report.

This follows a visit by the famed mathematician Yuri Matiyasevich to the Department of Mathematics, where he spoke about his pioneering work. He visited the UK at the invitation of the Isaac Newton Institute for Mathematical Sciences.

In 1900, twenty-three unsolved mathematical problems, known as Hilbert’s Problems, were compiled as a definitive list by mathematician David Hilbert.

A century later, the seven most important unsolved mathematical problems to date, known as the ‘Millennium Problems’, were listed by the Clay Mathematics Institute. Solving any one of these Millennium Problems carries a reward of US $1,000,000, and so far only one has been resolved: the famous Poincaré Conjecture, which was proved only recently by Grigori Perelman.

Yuri Matiyasevich found a negative solution to one of Hilbert’s problems — the tenth, on Diophantine equations. Now he is working on one of the most challenging maths problems of all — and the only one that appears on both lists — the Riemann Hypothesis on the zeros of the zeta function.

Professor Alexander Gorban, from the University of Leicester, said: “His visit was a great event for our mathematics and computer science departments.

“Matiyasevich has now published a paper through the University concerning the zeros of the Riemann Zeta Function (RZF). This is a mathematical function which has been studied for over a hundred years.

“There is previous experience of attacking famous pure mathematical problems using massive computations. Unfortunately, the Riemann hypothesis is not reduced to a finite problem and, therefore, the computations can disprove it but cannot prove it. Computations here provide the tools for guessing and for disproving the guesses only.”
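
To give a flavour of what such computations look like in practice, here is a minimal, generic sketch (not taken from Matiyasevich’s report) that uses the mpmath library to locate the first few non-trivial zeros on the critical line and check that the zeta function really does vanish there. As Gorban notes, checks of this kind can only support or refute individual guesses, never prove the hypothesis.

```python
# Generic illustration of numerically checking non-trivial zeros of the
# Riemann zeta function on the critical line Re(s) = 1/2.
# Requires the third-party mpmath library.
from mpmath import mp, zeta, zetazero

mp.dps = 30  # 30 decimal digits of working precision

for n in range(1, 6):
    rho = zetazero(n)            # n-th non-trivial zero, located on the critical line
    residual = abs(zeta(rho))    # should be ~0 up to the working precision
    print(f"zero #{n}: {rho}   |zeta(rho)| = {residual}")
```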

Reference: https://www.sciencedaily.com/releases/2012/11/121106125558.htm

 

Computers Unlock More Secrets of the Mysterious Indus Valley Script

Numerous artifacts left by an urban civilization that lived in what is now the border region between Pakistan and India have been discovered. Now a team of Indian and American researchers is using mathematics and computer science to try to piece together information about the still-undeciphered script.

The team used computers to extract patterns in ancient Indus symbols. The study shows distinct patterns in the symbols’ placement in sequences and creates a statistical model for the unknown language.

Despite dozens of attempts, nobody has yet deciphered the Indus script. The symbols are found on tiny seals, tablets and amulets left by people inhabiting the Indus Valley from about 2600 to 1900 B.C. Each artifact is inscribed with a sequence that is typically five to six symbols long.

The new study shows that the order of symbols is meaningful; taking one symbol from a sequence found on an artifact and changing its position produces a new sequence that has a much lower probability of belonging to the hypothetical language.

Seals with sequences of Indus symbols have been found as far away as West Asia, specifically in Mesopotamia, the region of modern-day Iraq. The statistical results showed that the West-Asian sequences are ordered differently from sequences on artifacts found in the Indus Valley. This supports earlier theories that the script may have been used by Indus traders in West Asia to represent different information than it did in the Indus region.

They used a Markov model, a statistical method that estimates the likelihood of a future event based on past patterns.

One application described in the paper uses the statistical model to fill in missing symbols on damaged archaeological artifacts. Such filled-in texts can increase the pool of data available for deciphering the writings of ancient civilizations.
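
As a rough illustration of the approach, the sketch below builds a first-order (bigram) Markov model over a tiny invented alphabet: it learns symbol-to-symbol transition probabilities from example sequences, shows that reordering a sequence lowers its likelihood, and guesses a missing symbol from its neighbours. The sign names, corpus and smoothing scheme are made up for illustration and are not the researchers’ data or model.

```python
# Toy bigram Markov model over an invented sign alphabet (illustration only).
from collections import Counter, defaultdict
import math

corpus = [["fish", "jar", "man"], ["fish", "jar", "arrow"],
          ["jar", "man", "arrow"], ["fish", "jar", "man", "arrow"]]

symbols = sorted({s for seq in corpus for s in seq})
counts = defaultdict(Counter)
for seq in corpus:
    for a, b in zip(["<s>"] + seq, seq + ["</s>"]):   # include start/end markers
        counts[a][b] += 1

def prob(a, b, alpha=0.1):
    # add-alpha smoothing so unseen transitions keep a small probability
    return (counts[a][b] + alpha) / (sum(counts[a].values()) + alpha * (len(symbols) + 1))

def log_likelihood(seq):
    return sum(math.log(prob(a, b)) for a, b in zip(["<s>"] + seq, seq + ["</s>"]))

# Reordering the symbols lowers the score, mirroring the finding that order matters.
print("in attested order:", round(log_likelihood(["fish", "jar", "man"]), 2))
print("reordered:        ", round(log_likelihood(["man", "fish", "jar"]), 2))

# Fill in a damaged position by choosing the most probable symbol in context.
best = max(symbols, key=lambda s: prob("fish", s) * prob(s, "man"))
print("best guess for 'fish ? man':", best)
```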

Reference: https://www.sciencedaily.com/releases/2009/08/090803185836.htm

 

Computer Scientist Reveals the Math and Science behind Blockbuster Movies

It’s clear that the computer-generated special effects in films such as Pirates of the Caribbean breathe life into such fantasies. Amazingly, the amount of math and science behind such blockbusters baffles even adept scientists.

Computer graphics (CG) experts used to have to make a Catch-22 decision: they could run inferior algorithms on many processors, or run the best algorithm on only one processor, because many algorithms do not scale well to larger numbers of processors. But about a year and a half ago, Ron Fedkiw, a Stanford computer science professor who has consulted for Industrial Light & Magic (ILM) for six years, figured out how to run a star algorithm on many processors, resulting in special effects unprecedented in their realism.

He designs new algorithms for diverse applications such as computational fluid dynamics and solid mechanics, computer graphics, computer vision and computational biomechanics. The algorithms may rotate objects, simulate textures, generate reflections or mimic collisions. Or they may mathematically stitch together slices of a falling water drop, rising smoke wisp or flickering flame to weave realism into CG images.

Fedkiw received screen credits for his work on Poseidon, on Terminator 3: Rise of the Machines for the liquid terminator and the nuclear explosions, and on Star Wars: Episode III—Revenge of the Sith for explosions in space battle scenes.

Most of Fedkiw’s students double-major in math and computer science. “Graphics itself is a bit less important, and many of them don’t take their first graphics class until their junior or senior year of college,” he says.

Fedkiw’s favorite movie employing CG is Revenge of the Sith. “When I watched the first [Star Wars film] at 9 years old, I never dreamed that I’d eventually be helping to make the last one,” he says.

Reference: https://news.stanford.edu/news/2007/april4/fed-040407.html