
MINI John Cooper Works GP is a two-seater hot hatch that shouts its 306 HP – SlashGear

Small, but perfectly formed. MINI isn't short on punchy versions in its history books, but even then the new 2020 MINI John Cooper Works GP stands out from the crowd. Offering the most powerful engine in a MINI to date, the 306 horsepower two-door doesn't hide its unique nature.

In fact it positively shouts it, thanks to a distinctive body kit. MINI starts with the 3-door hatchback, and then adds a number of functional aero details. There's a big roof spoiler, for example, with double wing contour, plus a new front apron surround and front spoiler lip. They help cut down on lift.

Larger air intakes are on the front, and there are carbon fiber-reinforced plastic arch trim flares. They mean MINI could use larger wheels, 18 inches shod in 225/35 R 18 tires, as well as optimize the air ducting on the sides. It's actually a first for BMW Group, which takes recycled CFRP fleece material from the production of the BMW i3 and i8, and then reworks it into body parts for the John Cooper Works GP. They get a matte CFRP coating, too, and are individually numbered.

Racing Grey Metallic paint sits alongside Melting Silver for the roof and side mirror caps. There's high-gloss Chili Red highlighting on the grille, the lower air intakes, and inside the roof spoiler, while the badging is in metallic matte Rosso Red. The headlamps and Union Jack taillamp clusters are dark-tinted.

The changes aren't just skin-deep, mind. Compared to the regular car, MINI has added a new engine mount, a solid support for the changed rear axle member, and a strut brace for the front suspension, all to improve stiffness. There's a mechanical differential lock in the 8-speed Steptronic transmission, which can be locked by up to 31 percent to improve traction in cornering.

As for the engine, that's something special too. The 4-cylinder 2.0-liter twin-turbo engine delivers 306 horsepower (75 hp more than in the MINI John Cooper Works) and 332 lb-ft of torque. 0-60 mph arrives in 5.2 seconds, and the top speed is an unrestricted 165 mph.

MINI throws in a model-specific engine oil sump, capable of holding up in even more aggressive cornering. The sports brake system has 4-piston fixed-caliper ventilated-disc brakes on the front wheels and single-piston floating-caliper brakes on the rear. The car is lowered by 10mm versus the standard John Cooper Works model.

Inside, there are two seats with Dinamica/leather trim and red belts. A special John Cooper Works steering wheel has 3D printed detailing and is leather-wrapped, and there are 3D printed metal shift paddles. Automatic climate control is optional, as is an infotainment system with navigation; the Connected Media system with a 6.5-inch screen is standard, and there's a digital instrument cluster on the steering column. That 5-inch color LCD shows things like speed and other metrics.

Now for the sad news. MINI will only be making 3,000 of the John Cooper Works GP model when production kicks off in March 2020, and that number will have to be shared with would-be drivers around the world. Pricing will be confirmed closer to release, but it's fair to say that this stands a good chance of becoming a modern MINI classic.

See the rest here:

MINI John Cooper Works GP is a two-seater hot hatch that shouts its 306 HP - SlashGear


The Universe Speaks in Numbers: The deep relationship between math and physics – The Huntington News

From the breakthroughs of Einstein and Dirac to contemporary physicists and mathematicians who are shedding light on the blossoming and revolutionary interaction between mathematics and physics, Graham Farmelo's new book takes readers on an adventure through the two fields' relationship.

Farmelo, a renowned physicist and writer, led a discussion surrounding topics covered in his book, The Universe Speaks in Numbers: How Modern Maths Reveals Nature's Deepest Secrets, Nov. 14 at Snell Engineering Center. Farmelo is a fellow at Churchill College at the University of Cambridge as well as an affiliated professor at Northeastern.

Nima Arkani-Hamed, one of the nation's leading theoretical physicists and a professor in the School of Natural Sciences at the Institute for Advanced Study in Princeton, New Jersey, was Farmelo's guest speaker for the event. He arrived a couple of minutes late to the event and still received a round of applause from the audience as he walked through the door.

The talk began with Farmelo introducing Arkani-Hamed as "the best theoretical physicist ever produced by McDonald's," referencing the fact that Arkani-Hamed had worked two summers at the fast-food chain in his youth.

Arkani-Hamed explained that alongside the independent development of physics and mathematics, there's more and more understanding of the deep and mysterious relationship between them. He said order can be made of our seemingly chaotic world and captured in succinct mathematical language.

Farmelo shared some of the foundational history of the field, beginning with Sir Isaac Newton, best known for his development of the three laws of motion. Farmelo explained that Newton would never have described himself as a theoretical physicist, yet he became one of the first to think that physicists should aim to make predictions about the world and that they could do so using defined mathematical calculations.

This proposal was a radical agenda for the scientific community at the time, though it has since laid the foundation for the impetus of physics that was to follow, Farmelo said.

Arkani-Hamed then interjected, saying one of his pet peeves is when people talk about theorists who are proven to be wrong about a certain theory, as if they are completely dumb, irrational or illogical for thinking that way.

He gave the example of the theory of luminiferous ether, a hypothetical medium for transmitting light and radiation, filling all unoccupied space. Ultimately, Einstein's theory of relativity eliminated the need for a light-transmitting medium, disproving the existence of the ether.

The luminiferous ether was a concept proposed by Newton, who is, as aforementioned, a world-renowned physicist. "Though Newton built the wrong scaffolding, it still led the way to the correct equations eventually," Arkani-Hamed said.

According to Arkani-Hamed, quantum mechanics is the most revolutionary theory of the 20th century. Put simply, quantum mechanics is the application of quantum theory: the theoretical basis of modern physics that explains the nature and behavior of matter and energy on the atomic and subatomic level.

To give an example of how mathematics and physics are intrinsically intertwined, Farmelo and Arkani-Hamed talked about how British physicist Paul Dirac's work gave way to something once thought unfeasible.

In 1928, Dirac wrote an equation that combined quantum theory and special relativity to describe the behavior of an electron moving at a relativistic speed. This equation posed a problem for classical physics practice. Even Einstein is known to have said he couldn't imagine any points of intersection between these two fundamental theories.
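
For reference, the equation Dirac wrote down in 1928 is usually presented in the compact form below (modern notation, added here as an illustration rather than taken from the talk), where \(\psi\) is the electron's wavefunction, \(m\) its mass and \(\gamma^{\mu}\) the Dirac matrices:

\[
\left( i\hbar\,\gamma^{\mu}\partial_{\mu} - mc \right)\psi = 0
\]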

Dirac proposed the concept of antiparticles, which are particles that have all the same qualities as another particle but an opposite electrical charge. This remains the only way physicists currently know to successfully marry the theories of quantum mechanics and special relativity.

This intersection between two outrageously different theories is a relatively new revelation in the field of physics, and speaks to the deep, intricate and seemingly ever-growing connection between the worlds of mathematics and physics.

"There is a giant Truth with a capital T of the world out there that physics is constantly working towards, that is also somehow enlaced with a giant Truth with a capital T of the mathematical world," Arkani-Hamed said.

Both Farmelo and Arkani-Hamed said they are excited by the impact of these two fields' work in collaboration and believe there is much more to discover from their combination.

"We don't actually know what reality is about. We are still learning, and we have to go into it with an open mind. Often, what we predict or assume something to be can more easily be proven wrong than proven right," Arkani-Hamed said.

Read more:

The Universe Speaks in Numbers: The deep relationship between math and physics - The Huntington News


The San Francisco Gay Men's Chorus Toured the Deep South – SF Weekly

About 15 minutes into Gay Chorus Deep South, members of the San Francisco Gay Men's Chorus are gathered around an office phone. The phone emits a brief screech before a voicemail marred with static plays: "So, you're going to bring a fight to the South?"

What follows next is a slurry of homophobia from the voicemail leaver. Chris Verdugo, the executive director of the chorus, just shakes his head.

"There's no amount of singing that's going to fix that," Verdugo says.

This fight was actually the 2017 fall Lavender Pen Tour of the American South, when 300 members of the San Francisco Gay Mens Chorus and guests from the Oakland Interfaith Gospel Choir drove through Mississippi, Alabama, Tennessee, South Carolina, and North Carolina to sing. They made about two dozen appearances, spoke on a conservative radio show, and found other members of the LGBTQ+ community, hoping to find unity in some places with the worst anti-LGBTQ laws in the country.

It's all captured in an upcoming documentary directed by David Charles Rodrigues, Gay Chorus Deep South, which won the Documentary Audience Award at the 2019 Tribeca Film Festival.

The documentary isn't a simple story of spreading joy and peace through the power of music. For one, it challenges conceptions we have about the South. Queer South historian Josh Burford calls the South a myth, "a distancing technique."

"There are many people who want this tour to happen," Burford says in the film. "Let's bring attention to the needs of the local communities. That's a great idea," he tells SF Weekly. "But then the other side of it is, after being ignored for so long, the idea of a prominent, national organization doing a goodwill tour in the South feels very paternalistic and condescending."

The situation is a lot more complex than anticipated. Moreover, Gay Chorus Deep South unfolds the limitations and strengths of what art can do. Tim Seelig, the artistic director of the San Francisco Gay Men's Chorus, knew that there was no chance of changing the voicemail sender's mind. (That person would actually go on to leave several more hate-filled messages, not seen on screen.)

But Seelig isn't trying to reach people like that.

"The chorus, in general, not just on this tour, has people on one side of the spectrum who are totally supportive, accepting allies. On the other side there are people who are never going to listen to what we're going to say," Seelig says. "But there's a huge pool of people in the middle."

Singer Jimmy White's father fell into that pool. White, who is battling cancer during the tour, hasn't talked to his homophobic father in half a dozen years. But when the tour visits Mississippi, where White is from, White's father is in the audience. It's unclear if they ever fully repair their relationship, but for a single concert, White's father listens to his son sing in a gay men's chorus.

Changing hearts was one goal of the tour. Another was to support other LGBTQ people in the South. "We created a safe space in those places that are not always safe to be out," Seelig says.

It's not always easy to do. That visibility in itself is activism, according to producer Bud Johnston. It's something that the makers of Gay Chorus Deep South are trying to amplify through documentation by shining a light on organizations like the San Francisco Gay Men's Chorus.

"I think every one of the chorus members has trauma and has stories that are hard, whether it be their coming out stories, or how they were raised," Johnston says. "But they're the ones who are going out on a stage under a spotlight and singing their hearts out, putting themselves in front of audiences that can be discriminatory. That's courage."

Opens Nov. 22.

Grace Li covers arts and culture for SF Weekly. You can reach her at gli@sfweekly.com.

Read more:

The San Francisco Gay Men's Chorus Toured the Deep South - SF Weekly


Tremor patients can be relieved of the shakes for THREE YEARS after having ultrasound waves – Herald Publicist

People whose lives are blighted by constant shakes could be spared painful surgery by zapping their brains with ultrasound waves, research suggests.

Scientists found that treating essential tremor patients with high-frequency sound waves kept their symptoms at bay for three years without any serious side effects.

Thousands of patients with severe tremors currently rely on deep brain stimulation, which involves surgically implanting electrodes into the brain.

However, it can trigger a slew of nasty side effects including migraines, nausea and trouble concentrating.

The shaking is caused by faulty circuits in the thalamus, a small area at the base of the brain. The ultrasound treatment targets it with high-frequency sound waves.

These ultrasound beams generate heat that breaks the abnormal circuit causing the tremor.

The treatment was approved by the UK watchdog NICE last June and is currently being trialled at Imperial College Healthcare NHS Trust.

It has also been given the green light in the US and is going through clinical trials to prove it works.

For their latest study, researchers from Stanford University in California looked at 76 essential tremor patients with an average age of 71.

Some 56 of the patients received the treatment. The other 20 had a placebo. All of them were then followed for three years.

Hand tremors, level of disability and quality of life were measured at the start of the study, after six months, one year, two years and then finally three years.

Essential tremor is a nerve disorder which causes uncontrollable shaking in different parts of the body.

Areas affected often include the hands, arms, head, larynx (voice box), tongue and chin. The lower body isn't affected.

It affects around a million Britons and seven million people in the US.

ET is not a life-threatening disorder, unless it prevents a person from caring for him or herself.

Most people are able to live normal lives with this condition, although they may find everyday activities like eating, dressing or writing difficult.

It is only when the tremors become severe that they truly cause disability.

What Causes Essential Tremor?

The true cause is still not understood, but it is thought that the abnormal electrical brain activity that causes tremor is processed through the thalamus.

The thalamus is a structure deep in the brain that coordinates and controls muscle activity.

Genetics is responsible for causing ET in half of all people with the condition.

A child born to a parent with ET has up to a 50 per cent chance of inheriting the responsible gene, but may never actually experience symptoms.

Although ET is more common in the elderly, and symptoms become more pronounced with age, it is not part of the natural ageing process.

Source: WebMD

At the end of the study, participants in the ultrasound group saw hand tremors improve by 50 per cent, disability by 56 per cent, and quality of life by 42 per cent.

All side effects in the study were mild or moderate. They included numbness and tingling, imbalance and unsteadiness.

Compared to the scores six months after treatment, hand tremors and disability increased slightly after three years. On a scale of zero to 32, hand tremor scores were initially an average of 20.

At six months, average scores were 9, and by three years they were 10. For disability, on a scale of zero to 32, scores were initially an average of 16. At six months, scores averaged 4 and at three years they averaged 6.

However, the study didn't compare the participants to a placebo group.

Lead author Casey Halpern, assistant professor of neurosurgery at Stanford, said: "For people who have disabling essential tremor that isn't responding to medication, this treatment should be considered as a safe and effective option."

He said that compared with deep brain stimulation, the new technique was much less invasive.

Professor Halpern added: "It's performed in a single session; there is no need for follow-up visits."

He noted that, because the people in the study and the researchers all knew that everyone was receiving the treatment, more research is needed with a placebo group to confirm the results.

A further limitation of the study was that 23 people, or 31 per cent, didn't complete the full three years.

The researchers determined that those who later dropped out of the study weren't responding as well to the treatment after three months as those who completed the study.

The findings were published in the medical journal of the American Academy of Neurology.

A million Britons and seven million people in the US suffer from uncontrollable shaking, which can make even simple tasks like eating and using a phone difficult.

Essential tremor, as it's known, affects the hands, arms, head and voice box. It mostly strikes people in middle age.

If symptoms are mild, the condition is treated with beta-blockers, heart-slowing drugs which block the production of adrenaline, or epilepsy medication.

But these drugs only reduce symptoms in around half of patients, according to the NHS.

The rest is here:

Tremor patients can be relieved of the shakes for THREE YEARS after having ultrasound waves - Herald Publicist


To Understand The Future of AI, Study Its Past – Forbes

Dr. Claude Shannon, one of the pioneers of the field of artificial intelligence, with an electronic mouse designed to navigate its way around a maze after only one 'training' run. May 10, 1952 at Bell Laboratories. (Photo by Keystone/Getty Images)

A schism lies at the heart of the field of artificial intelligence. Since its inception, the field has been defined by an intellectual tug-of-war between two opposing philosophies: connectionism and symbolism. These two camps have deeply divergent visions as to how to "solve" intelligence, with differing research agendas and sometimes bitter relations.

Today, connectionism dominates the world of AI. The emergence of deep learning, which is a quintessentially connectionist technique, has driven the worldwide explosion in AI activity and funding over the past decade. Deep learning's recent accomplishments have been nothing short of astonishing. Yet as deep learning spreads, its limitations are becoming increasingly evident.

If AI is to reach its full potential going forward, a reconciliation between connectionism and symbolism is essential. Thankfully, in both academic and commercial settings, research efforts that fuse these two traditionally opposed approaches are beginning to emerge. Such synthesis may well represent the future of artificial intelligence.

Symbolic approaches to AI seek to build systems that behave intelligently through the manipulation of symbols that map directly to concepts: for instance, words and numbers. Connectionist approaches, meanwhile, represent information and simulate intelligence via massive networks of interconnected processing units (commonly referred to as neural networks), rather than explicitly with symbols.

In many respects, connectionism and symbolism represent each other's yin and yang: each approach has core strengths which for the other are important weaknesses. Neural networks develop flexible, bottom-up intuition based on the data they are fed. Their millions of interconnected "neurons" allow them to be highly sensitive to gradations and ambiguities in input; their plasticity allows them to learn in response to new information.

But because they are not explicitly programmed by humans, neural networks are "black boxes": it is generally not possible to pinpoint, in terms that are meaningful to humans, why they make the decisions that they do. This lack of explainability is a fundamental impediment to the widespread use of connectionist methods in high-stakes real-world environments.

Symbolic systems do not have this problem. Because these systems operate with high-level symbols to which discrete meanings are attached, their logic and inner workings are human-readable. The tradeoff is that symbolic systems are more static and brittle. Their performance tends to break down when confronted with situations that they have not been explicitly programmed to handle. The real world is complex and heterogeneous, full of fuzzily defined concepts and novel situations. Symbolic AI is ill-suited to grapple with this complexity.
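
To make the contrast concrete, here is a deliberately toy sketch (not drawn from the article; the loan-approval framing, the features and every number are invented) that places a hand-written symbolic rule next to a small learned model:

```python
# Illustrative contrast between a symbolic rule and a connectionist model.
# The "loan approval" framing and all numbers are invented for this sketch.
from sklearn.neural_network import MLPClassifier

def symbolic_approve(income_k, debt_k):
    # Symbolic AI: an explicit, human-readable if-then rule.
    # Easy to explain, but brittle outside the cases it anticipates.
    return income_k > 50 and debt_k / income_k < 0.4

# Connectionist AI: a small neural network learns a similar decision
# from past examples instead of hand-written rules.
X = [[60, 10], [30, 20], [80, 5], [25, 15]]   # [income, debt] in thousands
y = [1, 0, 1, 0]                              # past approve/deny outcomes

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X, y)

print(symbolic_approve(55, 10))   # rule fires: the reasoning is traceable
print(net.predict([[55, 10]]))    # learned answer, but a "black box"
```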

At its inception, the field of artificial intelligence was dominated by symbolism. As a serious academic discipline, artificial intelligence traces its roots to the summer of 1956, when a small group of academics (including future AI icons like Claude Shannon, Marvin Minsky and John McCarthy) organized a two-month research workshop on the topic at Dartmouth College. As is evident in the group's original research proposal from that summer, these AI pioneers' conception of intelligence was oriented around symbolic theories and methods.

Throughout the 1960s and into the 1970s, symbolic approaches to AI predominated. Famous early AI projects like Eliza and SHRDLU are illustrative examples. These programs were designed to interact with humans using natural language (within carefully prescribed parameters). For instance, SHRDLU could successfully respond to human queries like: "Is there a large block behind a pyramid?" or "What does the box contain?"

At the same time that symbolic AI research was showing early signs of promise, nascent efforts to explore connectionist paths to AI were shut down in dramatic fashion. In 1969, in response to early research on artificial neural networks, leading AI scholars Marvin Minsky and Seymour Papert published a landmark book called Perceptrons. The book set forth mathematical proofs that seemed to establish that neural networks were not capable of executing certain basic mathematical functions.

Perceptrons' impact was sweeping: the AI research community took the analysis as authoritative evidence that connectionist methods were an unproductive path forward in AI. As a consequence, neural networks all but disappeared from the AI research agenda for over a decade.

Yet despite its early momentum, it would soon become clear that symbolic AI had profound shortcomings of its own.

Symbolic AI reached its mainstream zenith in the early 1980s with the proliferation of what were called expert systems: computer programs that, using extensive if-then logic, sought to codify the knowledge and decision-making of human experts in particular domains. These systems generated tremendous expectations and hype: startups like Teknowledge and Intellicorp raised millions and Fortune 500 companies invested billions in attempts to commercialize the technology.

Expert systems failed spectacularly to deliver on these expectations, due to the shortcomings noted above: their brittleness, inflexibility and inability to learn. By 1987 the market for expert systems had all but collapsed. An "AI winter" set in that would stretch into the new century.

Amid the ashes of the discredited symbolic AI paradigm, a revival of connectionist methods began to take shape in the late 1980s, a revival that has reached full bloom in the present day. In 1986 Geoffrey Hinton published a landmark paper introducing backpropagation, a new method for training neural networks that has become the foundation for modern deep learning. As early as 1989, Yann LeCun had built neural networks using backpropagation that could reliably read handwritten zip codes for the U.S. Postal Service.
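
As a minimal sketch of the idea behind backpropagation (illustrative only, not Hinton's original formulation), the snippet below trains a tiny two-layer network on XOR, a function Minsky and Papert showed a single-layer perceptron cannot compute:

```python
import numpy as np

# Minimal two-layer network trained with backpropagation on XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # hidden layer parameters
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # output layer parameters
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10_000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the output error back through each layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent update
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(3))  # approaches [0, 1, 1, 0] as training converges
```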

But these early neural networks were impractical to train and could not scale. Through the 1990s and into the 2000s, Hinton, LeCun and other connectionist pioneers persisted in their work on neural networks in relative obscurity. In just the past decade, a confluence of technology developments (exponentially increased computing capabilities, larger data sets, and new types of microprocessors) has supercharged these connectionist methods first devised in the 1980s. These forces have catapulted neural networks out of the research lab to the center of the global economy.

Yet for all of its successes, deep learning has meaningful shortcomings. Connectionism is at heart a correlative methodology: it recognizes patterns in historical data and makes predictions accordingly, nothing more. Neural networks do not develop semantic models about their environment; they cannot reason or think abstractly; they do not have any meaningful understanding of their inputs and outputs. Because neural networks' inner workings are not semantically grounded, they are inscrutable to humans.

Importantly, these failings correspond directly to symbolic AI's defining characteristics: symbolic systems are human-readable and logic-based.

Recognizing the promise of a hybrid approach, AI researchers around the world have begun to pursue research efforts that represent a reconciliation of connectionist and symbolic methods.

DARPA

To take one example, in 2017 DARPA launched a program called Explainable Artificial Intelligence (XAI). XAI is providing funding to 13 research teams across the country to develop new AI methods that are more interpretable than traditional neural networks.

Some of these research teams are focused on incorporating symbolic elements into the architecture of neural networks. Other teams are going further still, developing purely symbolic AI methods.

Autonomous vehicles

Another example of the merits of a dual connectionist/symbolic approach comes from the development of autonomous vehicles.

A few years ago, it was not uncommon for AV researchers to speak of pursuing a purely connectionist approach to vehicle autonomy: developing an "end-to-end" neural network that would take raw sensor data as input and generate vehicle controls as output, with everything in between left to the opaque workings of the model.

As of 2016, prominent AV developers like Nvidia and Drive.ai were building end-to-end deep learning solutions. Yet as research efforts have progressed, consensus has developed across the industry that connectionist-only methods are not workable for the commercial deployment of AVs.

The reason is simple: for an activity as ubiquitous and safety-critical as driving, it is not practicable to use AI systems whose actions cannot be closely scrutinized and explained. Regulators across the country have made clear that an AV system's inability to account for its own decisions is a non-starter.

Today, the dominant (perhaps the exclusive) technological approach among AV programs is to combine neural networks with symbolically-based features in order to increase model transparency.

Most often, this is achieved by breaking the overall AV cognition pipeline into modules: e.g., perception, prediction, planning, actuation. Within a given module, neural networks are deployed in targeted ways. But layered on top of these individual modules is a symbolic framework that integrates the various components and validates the system's overall output.
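
A heavily simplified sketch of that modular layout might look like the following; the module names mirror the list above, but the classes, the obstacle check and every number are invented for illustration:

```python
# Toy sketch of a modular AV pipeline: learned models inside each module,
# a symbolic rule layer validating the combined output. All values invented.
from dataclasses import dataclass

@dataclass
class Plan:
    target_speed_mps: float
    steering_angle_deg: float

def perception(sensor_frame):
    # In practice a neural network; here a stub returning detected objects.
    return {"obstacle_ahead_m": 12.0}

def prediction(world):
    # Another learned model would forecast how detected objects will move.
    return {"obstacle_closing": True}

def planning(world, forecast):
    # A planner (often learned) proposes a manoeuvre.
    return Plan(target_speed_mps=15.0, steering_angle_deg=0.0)

def symbolic_validator(plan, world):
    # Explicit, human-readable rules layered on top of the learned modules.
    if world["obstacle_ahead_m"] < 20.0 and plan.target_speed_mps > 10.0:
        return Plan(target_speed_mps=5.0,
                    steering_angle_deg=plan.steering_angle_deg)
    return plan

def actuation(plan):
    print(f"speed={plan.target_speed_mps} m/s, steer={plan.steering_angle_deg} deg")

world = perception(sensor_frame=None)
forecast = prediction(world)
plan = symbolic_validator(planning(world, forecast), world)
actuation(plan)  # the rule layer caps speed near the obstacle: 5.0 m/s
```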

Academia

Finally, at leading academic institutions around the world, researchers are pioneering cutting-edge hybrid AI models to capitalize on the complementary strengths of the two paradigms. Notable examples include a 2018 research effort at DeepMind and a 2019 program led by Josh Tenenbaum at MIT.

In a fitting summary, NYU professor Brenden Lake said of the MIT research: "Neural pattern recognition allows the system to see, while symbolic programs allow the system to reason. Together, the approach goes beyond what current deep learning systems can do."

Taking a step back, we would do well to remember that the human mind, that original source of intelligence that has inspired the entire AI enterprise, is at once deeply connectionist and deeply symbolic.

Anatomically, thoughts and memories are not discretely represented but rather distributed in parallel across the brain's billions of interconnected neurons. At the same time, human intelligence is characterized at the level of consciousness by the ability to express and manipulate independently meaningful symbols. As philosopher Charles Sanders Peirce put it, "We think only in signs."

Any conception of human intelligence that lacked either a robust connectionist or a robust symbolic dimension would be woefully incomplete. The same may prove to be true of machine intelligence. As dazzling as the connectionist-driven advances in AI have been over the past decade, they may be but a prelude to what becomes possible when the discipline more fully harmonizes connectionism and symbolism.

Go here to see the original:

To Understand The Future of AI, Study Its Past - Forbes


Health strategies of Google, Amazon, Apple, and Microsoft – Business Insider

Dr. David Feinberg, the head of Google Health. Reuters

Over the past year, Google has gotten deeper into healthcare, hiring Dr. David Feinberg to head up the Google Health division.

A big Google health project is now drawing scrutiny. Google teamed up with the health system Ascension on "Project Nightingale," in which the hospital operator is using Google as its cloud provider and also working with the tech giant to build out tools the health system can use.

Business Insider reported on Monday that by 2020, records on 50 million Ascension patients will be on Google's cloud network. About 150 Google employees are able to access the data, according to documents seen by Business Insider.

The project drew concern from those inside the company, lawmakers, and the US Department of Health and Human Services about how the data is being handled. Google and Ascension said that the relationship followed health-privacy laws.

More broadly, Feinberg's team is now responsible for coordinating health initiatives across Google, including in the company's search-engine and map products, its Android smartphone operating system, and its more futuristic offerings in areas like artificial intelligence.

In his speech at a conference in October, Feinberg said one of his first main goals for the team would be to oversee how health-related searches come up and work to improve that with the Google Search team. According to documents reviewed by Business Insider, it appears the team has been building a Patient Search tool to help medical providers sift through patient information.

Read more: We just got our first look at what Google's grand plans are for healthcare after it brought in a top doctor to lead its health team

Google Health is just one aspect of the healthcare strategy of its parent company, Alphabet. Within Google, Google Cloud is working to sign cloud contracts with healthcare systems. Mayo Clinic in September signed Google as its cloud and AI partner.

There's also Verily, the life-sciences arm of Alphabet, as well as Calico, its life-extension spin-off. Verily has its hands in projects spanning robotics, blood-sugar-tracking devices, and work on addiction treatment. The company has also made investments in healthcare through its venture funds GV and Capital G, as well as through Alphabet itself.

Google in November also reached a $2.1 billion deal to acquire Fitbit. The brand, best known for its fitness watches, also has a big business selling a health platform that combines coaching and fitness tracking with employers and health plans.

Beyond working with existing products, Feinberg's oversight includes the health team at Google AI, hardware components, and DeepMind Health. Both Google AI and DeepMind have pursued projects that analyze medical images like eye scans and scans of breast cancer cells, with the hope of aiding medical professionals in diagnosing and treating patients.

Link:

Health strategies of Google, Amazon, Apple, and Microsoft - Business Insider


Global Cloud Security Market Size is Expected to Reach 8.9 Billion US$ with a CAGR of 23.5% During the Forecast Period 2015-2020 – Valuates Reports -…

BANGALORE, India, Nov. 19, 2019 /PRNewswire/ -- Data protection requires a set of policies and controls that tackle the cloud's security aspects by protecting software, information, and infrastructure.

Because the set of services can be configured as necessary, cloud-based security services are expected to see increased market acceptance. Managed security services protect against intruders and cyber attacks and include next-generation firewalls, content filtering, managed two-factor authentication and even security consultancy. This is expected to offer industry players ample opportunities.

Inquire for Sample @ https://reports.valuates.com/request/sample/ALLI-Auto-1R41/Cloud_Security_Market

Growing adoption of cloud services by large and medium-sized companies and increased demand for managed security services generate ample opportunities for cloud security market players.

View Full Report @ https://reports.valuates.com/market-reports/ALLI-Auto-1R41/cloud-security-market

Trends Influencing The Cloud Security Market Share:

Region Wise Cloud Security Market Analysis:

Inquire for Regional/Country @ https://reports.valuates.com/request/regional/ALLI-Auto-1R41/Cloud_Security_Market

Growing Reliance On Cloud-based Services

Increasing adoption of cloud services across diverse verticals has resulted in increased dependence on the cloud for storage and other applications. The increasing number of internet users and growing adoption of cloud services are the key factors driving adoption of cloud security solutions. Over the forecast period, the growth of online business will underline the significance of this factor.

Increasing Number Of Cyber-attacks

An increasing number of cyber-attacks due to an upsurge in digitalization is one of the driving factors of the cloud security market. Cyber-attacks have increased rapidly, thereby, resulting in a strong need for cloud security services.

The number of data theft cases has increased exponentially in the last five years, owing to the increased generation of digital content and a lack of security to protect financial and corporate data. BFSI, followed by IT & telecom and retail, are the most targeted industries. Therefore, the burgeoning number of cyber-attacks and data breach cases should boost the growth of the market in the future.

Growing Market For Managed Security Services

Managed security services offer protection against intruders and cyber-attacks and include next-generation firewalls, content filtering, managed two-factor authentication and even security consultancy. This is expected to provide ample opportunities for market players.

Cloud Security Key Segments

The market segmentation is illustrated below:

Cloud Security Market by Type

Cloud Security Market by End User

Cloud Security Market by Vertical

Cloud Security Market by Deployment

Cloud Security Market by Geography

Key Players

Key Benefits of Market Study:

Buy Report @ https://reports.valuates.com/api/directpaytoken?rcode=ALLI-Auto-1R41

Similar Reports :

Global Cloud Security Solutions Market : https://reports.valuates.com/market-reports/PROF-Auto-21U225/global-cloud-security-solutions-market

Global Cloud Security in Retail Market :

https://reports.valuates.com/market-reports/PROF-Auto-38A254/global-cloud-security-in-retail-market

Global Multi-Cloud Security Solutions Market :

https://reports.valuates.com/market-reports/QYRE-Auto-32W104/global-multi-cloud-security-solutions-market

Global Hybrid Cloud Security Solutions Market :

https://reports.valuates.com/market-reports/QYRE-Auto-8H435/global-hybrid-cloud-security-solutions-market

About Us:

Our aim is to collate unparalleled market insights and notify our customers as and when they emerge. Valuates curates premium market research reports from leading publishers around the globe. We will help you map your information needs to our repository of market research reports and guide you through your purchasing decision. We are based in the Silicon Valley of India (Bengaluru) and provide 24/7 online and offline support to all our customers; we are just a phone call away.

Contact Us:

Valuates Reports, sales@valuates.com | For U.S. Toll Free Call +1-(315)-215-3225 | For IST Call +91-8040957137 | WhatsApp: +91-9945648335 | Website: https://reports.valuates.com

SOURCE Valuates Reports

Here is the original post:
Global Cloud Security Market Size is Expected to Reach 8.9 Billion US$ with a CAGR of 23.5% During the Forecast Period 2015-2020 - Valuates Reports -...


Vodafone picks Google Cloud to develop and host its global data platform – FierceTelecom

In its quest to be a digital operator, Vodafone Group has partnered up with Google Cloud to host its cloud platform for data analytics, business intelligence, and machine learning.

Vodafone's Neuron big-data analytics platform, which "acts as a brain and driver for AI and business intelligence" for Vodafone's global business, will be hosted on Google Cloud.

Neuron serves as a single "data ocean" of analytic insights to support services and applications such as 5G optimization and smart retail. Neuron is currently being used across 11 countries where it combines data from more than 600 servers.


Vodafone will also rely on Google Cloud Platform (GCP) for hybrid infrastructure and containerization and to develop its next-generation business intelligence platform. The container approach will deliver faster insights in a more standardized way, making it easier to compare performance across departments and local markets.

RELATED: Google Cloud pumps profits into parent company Alphabet's bottom line

Neuron will leverage Google Cloud to improve its operations by making Vodafones existing software cloud-compatible, which allows local markets to tap into new platform capabilities without disrupting existing campaigns.

"The project is complex and multi-faceted," Google Cloud CEO Thomas Kurian said in a blog post. "Vodafones existing on-premises group data platform is a shared service consisting of eight clusters with more than 600 servers and is used in 11 countries. The platform relies on legacy Hadoop architecture that lacks the agility or scalability to support demands for analytics and an increasing list of innovation projects.

"To begin, Vodafone will perform a large-scale migration of its global data into our highly secure public cloud. It will also create a custom platform for data performance that lets disparate data from across the organization be aggregated into one data oceanrather than multiple data lakeswithin which analytics and business intelligence can take place."

To simplify the integration, the Neuron platform will also use other Google Cloud services including Dataflow, Dataproc, Cloud Composer, Data Fusion, BigQuery, and Google Kubernetes Engine.
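
For a rough sense of what querying such a centralised analytics platform looks like, here is a generic BigQuery client sketch; it is not Vodafone's actual Neuron code, and the project, dataset and table names are placeholders:

```python
# Minimal BigQuery example: run an aggregate query against a central
# analytics dataset. Project, dataset and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")

query = """
    SELECT market, COUNT(*) AS events
    FROM `example-analytics-project.neuron_demo.network_events`
    WHERE event_date = CURRENT_DATE()
    GROUP BY market
    ORDER BY events DESC
"""

# Submit the query job and iterate over the result rows.
for row in client.query(query).result():
    print(row.market, row.events)
```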

"We want to lead the industry in capturing the benefits of digital," said Vodafone Group CTO Johan Wibergh in a statement. "The capabilities that Google Cloud gives us will help accelerate our digital transformation."

Follow this link:
Vodafone picks Google Cloud to develop and host its global data platform - FierceTelecom


Exposed database left terabyte of travelers’ data open to the public – CNET

One of the largest travel booking companies in Europe left its data exposed on an unprotected server, researchers say.

When it comes to travel, most people are concerned with planning their trip, getting the best price and making sure they've packed everything. Now they also need to worry about whether their reservation companies have properly secured their data: Security researchers found that one of Europe's largest hotel booking companies left more than a terabyte of sensitive data exposed on a public server.

The exposed database contained travelers' information like names, home addresses, lodging, children's personal information, credit card numbers and thousands of passwords stored in plaintext, the security researchers said Wednesday. The database stores information on 140,000 clients, each of which could be an individual, a group of travelers or an organization.

The database belongs to Gekko Group, a subsidiary of France-based AccorHotels, Europe's largest hospitality company. Gekko Group handles business travel and luxury travel with more than 600,000 hotels across the world, according to its website. AccorHotels referred to Gekko Group for comment.

Fabrice Perdoncini, Gekko Group's CEO, said that the company has secured the database and is launching an internal investigation on its IT systems.

"Ensuring the adequate protection of our clients' data is of utmost importance to Gekko Group, a B2B company," Perdoncini said in a statement. "We acknowledge the seriousness of this matter and confirm that no malicious use or misuse of data has been reported so far."

The company said that it was informing its affected clients and that less than 1,000 unencrypted credit card numbers were stored on the database. But more credit card numbers could have been seen in document scans stored on the server.


The pile of leaked passwords contained the credentials for the World Health Organization, and a potential hacker could have used those credentials to book travel using the group's budget, the security researchers said. The WHO didn't respond to a request for comment.

The discovery came via independent security researchers Noam Rotem and Ran Locar, who worked with Israeli security company VPNMentor to find the exposed database. "It's unfortunately not the first time we see a data breach of this scale with that type of sensitive information. It's sadly a much more common issue than one would think," Rotem said in a statement.

The researchers found the database, which is hosted on Elasticsearch, through an online scan, while looking for servers that lacked proper protections.

"This breach represents a serious lapse in data security by Gekko Group and its subsidiaries, compromising the privacy of their customers, clients, AccorHotels, and the businesses themselves," VPNMentor said in a blog postWednesday.

As more companies move to store their data on cloud servers, they're driving cybersecurity concerns about properly protecting sensitive data. Security researchers have found volumes of sensitive data exposed online in unsecured databases as they look to warn companies to protect that data before a malicious hacker finds it.

In the past year, researchers found exposed databases showing debt from millions of people, along with open servers hosting millions of Facebook records. While security researchers found those first, hackers have also taken advantage of open servers. In July, a hacker allegedly stole the credit card applications of more than 100 million US citizens from Capital One's Amazon Web Services cloud server.

Rotem and Locar said they reported the exposed database to Gekko Group and AccorHotels on Nov. 7 and got a response on Nov. 13. The company told the researchers that it's since secured the server, according to Rotem and Locar.

Even if you've never interacted with those two companies, data from their partners was also exposed, the researchers said. The database had a significant amount of data from websites like Booking.com and Hotelbeds.com open to the public, including personal information and credit card numbers, researchers said.

Booking.com and Hotelbeds.com didn't respond to a request for comment.

VPNMentor's researchers also saw travel itineraries left on the open server, like tickets to Euro Disney and travel plans between hotels and airports with personal information.

The server was hosted in France, but the affected travelers came from several countries including Spain, the United Kingdom, the Netherlands, Portugal, France, Belgium, Italy and Israel, researchers said.

"For two companies of their respective sizes and market shares, Gekko Group and AccorHotels would be expected to have more robust data security," VPNMentor said. "By exposing such a huge amount of sensitive data, they will likely face questions over how this happened, and their wider data security policies for all brands they own."

Read the rest here:
Exposed database left terabyte of travelers' data open to the public - CNET


Fashion retailer AllSaints on using Google Cloud to handle online shopping traffic peaks – ComputerWeekly.com

Adoption of the Google Cloud Platform in the retail space is continuing apace, with the fashion retailer AllSaints outlining how its use of the technology is assisting the brand with delivering a more robust and performant online shopping experience to its customers.

The firm's website regularly receives two million visits a month, and the revenue generated by online sales of its garments and accessories is becoming increasingly important to the overall success of the company.

However, like many retailers, the firm has previously struggled with ensuring its website is equipped to cope with surprise and prolonged periods of peak traffic, and had taken to over-provisioning on-premise servers to protect against downtime or service disruptions.

"Responsive websites and fast page-load speeds are critical for the mobile, connected customer," said John Bovill, executive consultant of digital and technology at AllSaints.

As such, the company operated a 60-unit strong server farm that had more than enough capacity to cope during peak periods, but when traffic levels returned to normal almost half of these would be left idle, which the company considered to be a waste of resources.

"We make projections for rates of business in normal periods, but it's very hard to predict how sales will rise during peak demand, especially online. Our need for infrastructure often doubles at those times, but we only actually need those servers for a very short period," he added.

At the same time, provisioning additional capacity was often a slow process, which sometimes caused delays in the company's plans to expand into new geographical territories or deploy new website features, for example.

To address these myriad issues, the firm decided to overhaul its infrastructure setup by moving to the Google Cloud Platform, which it claims has provided the firm with the ability to provision compute capacity instantaneously, across multiple locations, and importantly power it down when it is no longer required.

This was not, however, the firm's first foray into the Google Cloud, having made the decision to make moving its applications and workloads to the cloud a strategic goal in 2014.

It started out by adopting Google's G Suite portfolio of cloud-based productivity tools to bolster internal collaboration and communication between the firm's employees, as part of an introductory process to get its office and store-based staff used to using cloud-based tools.

This work also coincided with a switch in its software development strategy which has since seen it favour the use of microservices, and in 2016 the adoption of Kubernetes for container orchestration purposes.

The move to microservices was part of a push by the company to overhaul its continuous integration and continuous delivery pipeline, which is now based on Jenkins on Google Cloud and Terraform, so that its in-house developers can speed up the time it takes to release new features and code changes.

As a result, the company claims it has now cut the time it takes developers to deploy new code from 20 minutes to less than five, while enabling them to test code on the same infrastructure that it will run on in production.

"Before, we couldn't confidently say a bug was fixed until we actually tested it in production. Now we can deploy code in test environments that exactly mimic production," said Andy Dean, technical operations manager at AllSaints.

"The improved CI/CD pipeline means we can update our services every day, with a shorter lifespan on bugs, and minimal disruption. That makes us more responsive to customer needs, more proactive. And that's exactly what we're trying to achieve."

The company had another cloud provider in its supplier mix at this time before deciding, on cost grounds, to oust it in favour of deepening its cloud ties with Google and embarking on a strategy that would see all new apps and services built and hosted solely in its cloud infrastructure.

At the same time, it also began laying the groundwork to begin migrating its legacy applications to the Google Cloud too.

"We were moving 60 individual services, not just one application," said Dean. "The interdependencies between them meant that it made more sense to move them all at once, and that took a lot of planning."

All in all, the migration was done and dusted within a week to avoid any network latency issues arising from running applications and workloads in two different places at once for too long.

"It was the biggest infrastructure change we'd made in the history of the company, so one of our goals was that nobody noticed the change," said Dean.

The company's technology department worked closely with the Google Cloud team during the migration to ensure it progressed as smoothly and efficiently as possible.

"We got things stable within the first week, which was crucial for us. In that week, we moved 60 individual services, including our enterprise resource planning [ERP] tills, to a microservices cloud environment."

On the back of all this work, the retailer has been able to cut the number of servers it operates from 60 to 30, cutting its infrastructure costs by 50% in the process, by handing off a portion of its infrastructure requirements to Google Compute Engine, with autoscaling support provided by Google Kubernetes Engine.

"We monitor the architecture using Stackdriver, but Google Kubernetes Engine really looks after itself," said Dean.

"The self-healing aspect of Google Kubernetes Engine means we no longer have to make time to restart some of the [virtual machines]. Scaling is now automatic, and so are key maintenance tasks."
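
For readers unfamiliar with how that kind of automatic scaling is typically expressed, here is a generic Kubernetes autoscaling sketch using the official Python client; it is not AllSaints' actual configuration, and the deployment name, namespace and thresholds are placeholders:

```python
# Create a Horizontal Pod Autoscaler with the official Kubernetes Python
# client, so a deployment scales on CPU load instead of manual resizing.
# Deployment name, namespace and limits are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="storefront-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="storefront"),
        min_replicas=3,                       # floor for quiet periods
        max_replicas=30,                      # ceiling for traffic peaks
        target_cpu_utilization_percentage=60, # scale out above 60% CPU
    ),
)

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa)
```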

The company still has some cloud migration work to do, a process it predicts will stretch into 2021, as it prepares to containerise all of its back-end and internal systems, including its ERP and point-of-sale systems, and move them to the cloud.

"Strategically we are looking to maximise our usage of Google Cloud, driving this and associated technologies to provide the best possible AllSaints experience for our customers," said Bovill.

See original here:
Fashion retailer AllSaints on using Google Cloud to handle online shopping traffic peaks - ComputerWeekly.com
