
SEC may be Looking for Ways to Regulate the Cryptocurrency ICO Market – The Merkle

It was only a matter of time until regulators caught up with cryptocurrency ICOs. These coin offerings have gone unregulated for quite some time while raising millions in funding. Anyone buying into these token sales in the US is, under the law, buying securities, which require special licenses from the company holding the ICO. With US regulators preparing to venture into the cryptocurrency ICO world, things could get quite interesting moving forward.

Various aspects of cryptocurrency will never be subject to traditional regulation. Virtually all of these projects are decentralized, with no single entity responsible for issuing the coins or controlling the majority of funds circulating on the network. The only entities regulators can go after are the third-party service providers found within the world of cryptocurrency. Wallet providers, exchanges, and investment schemes are bound to see more attention from US regulators moving forward.

That brings us to cryptocurrency ICOs, the modern-day crowdfunding efforts that operate without regulation or oversight. Everyone in the cryptocurrency world knows ICOs are growing in popularity and seemingly raising more money than ever before. Projects raising over US$10m in funding are becoming the norm rather than the exception. However, there are a lot of legal questions about ICOs and how their tokens are distributed.

It is believed the SEC is currently taking a very close look at every cryptocurrency ICO on the agenda. This does not bode well for most of the projects out there, as very few of these teams have someone with the necessary legal knowledge on board. It is only natural that US regulators want to pay close attention to what is going on here, as ICOs can, in their view, be used as a way to launder money. A group of people raising millions of dollars overnight without regulation or oversight looks suspicious, regardless of how you look at it.

The bigger question on people's minds is whether they are buying tokens or securities. Under US legislation, a cryptocurrency token can quickly turn into a security, which causes all kinds of legal issues. A security created voluntarily or by accident needs to be overseen and regulated by the SEC, regardless of its ties to cryptocurrency. This confusion needs to be avoided at all costs, but for now there are no clear regulatory guidelines whatsoever.

Rest assured, it will not take all that long for the SEC to introduce some form of cryptocurrency ICO regulation. For now, it remains anybody's guess what we can expect from such a decision. If ICOs are put on the same level as IPOs, things will look very dire for cryptocurrency companies that see this mechanism as a way to quickly secure funding. Although the SEC is apparently investigating the matter, it may take years for it to come to a conclusion.

Moreover, there is the topic of trading these ICO tokens across cryptocurrency exchanges. A lot of tokens can be traded against fiat currencies, which will pose new challenges once regulation materializes. For the time being, the cryptocurrency ICO sector has nothing to worry about just yet. However, this situation could change at any given moment, and a lot of teams would find themselves in an awkward position because of it.

If you liked this article, follow us on Twitter @themerklenews and make sure to subscribe to our newsletter to receive the latest bitcoin, cryptocurrency, and technology news.

See more here:
SEC may be Looking for Ways to Regulate the Cryptocurrency ICO Market - The Merkle

Read More..

Former People’s Bank of China Official to Give Cryptocurrency Lecture – CoinDesk

A former official of China's central bank is set to give a lecture on cryptocurrencies later this month.

Ping Xie was the first bureau chief of the People's Bank of China's Financial Stability Bureau. During his time at the PBoC, Xie worked in the central bank's Non-Banking Supervision Department and Research Bureau, while also serving as the governor of its branch in Hunan Province, according to Bloomberg. He began his stint at the PBoC in 1985, departing in 2005.

Xie's lecture on cryptocurrencies is part of a series of nine lectures he plans to give on the topic of digital finance. The first lecture will be posted online on 23rd June, according to promotional materials.

It's a notable development for a long-time fixture in China's regulatory space who played a role in the economic reforms undertaken by China's government.

Beyond his work at the central bank, Xie served as general manager of Central Huijin Investment Company, a state-owned firm that manages state-owned assets and investments on behalf of China, where he focused on overseas investments.

Xie is also the author of a book on digital finance entitled "Internet Finance in China: Introduction and Practical Approaches".

Image Credit: Southwestern University of Finance and Economics

The leader in blockchain news, CoinDesk is an independent media outlet that strives for the highest journalistic standards and abides by a strict set of editorial policies. Have breaking news or a story tip to send to our journalists? Contact us at [emailprotected].

Read more from the original source:
Former People's Bank of China Official to Give Cryptocurrency Lecture - CoinDesk

Read More..

Bitcoin prices likely to continue wild ride – USA TODAY

SAN FRANCISCO: What goes precipitously up often comes crashing down to earth.

So it was with bitcoin on Thursday, when the price of the digital currency plunged 19%, its steepest drop in more than two years, after a record run. The volatility remained on full display late Thursday and, as of Friday evening, bitcoin had rebounded to $2,484.59.

The cryptocurrency, which flirted with $3,000 on Monday, sank as low as $2,076.16 in intraday trading early Thursday amid a confluence of bad omens. Tech stocks have recently taken a thumping over concerns about their lofty valuations. Ominous reports from Goldman Sachs and Morgan Stanley suggested bitcoin was due for a price reversal and required government regulation. The Federal Reserve hiked interest rates Wednesday.

Compounding worries, digital currency exchange Coinbase experienced an outage Monday because of high trading volume. Another exchange, Bitfinex, said Tuesday it was under a DDoS attack.

Meanwhile, prices for the digital currencies Ripple and NEM declined over the past week, though Ethereum, the second-largest currency, soared 20% on speculation it will become the top currency. At $371.36, it still lags far behind bitcoin in value.


"Bitcoin and other digital currencies are experiencing rapid growth these days," says Guy Zyskind, CEO of Enigma,a start-up incryptocurrency investing."For this to be sustainable over time, the market has to correct itself from time to time."

The market's wild ride this week underscores "the ebbs and flows of an entirely new asset class," says Bill Barhydt, CEO of Abra, a peer-to-peer payment service.

"While the bitcoin price will likely recover and continue to rise, what we should see in the future is bitcoin becoming a solid store of value, much like gold," saysMihir Magudia, executive director of LEOcoin Foundation. "It will be relatively easy to liquidate but will not be used to commonly pay for goods and services."

Follow USA TODAY's San Francisco Bureau Chief Jon Swartz @jswartz on Twitter.


See the rest here:
Bitcoin prices likely to continue wild ride - USA TODAY

Read More..

Tintri Files for IPO as Flash-Based Cloud Storage Market Surges – SDxCentral

Cloud storage provider Tintri filed plans for an initial public offering that is expected to raise $100 million. The filing comes on the heels of the company posting a $105.3 million loss during its latest fiscal year.

The Mountain View, California-based company earlier this month filed its IPO intent with the Securities and Exchange Commission. There has been no date or price range set for the IPO.

Tintri provides flash-based and hybrid-flash array storage targeted at enterprise cloud deployments.

The company has secured more than $262 million in private funding since its formation in 2008. Those investors have included New Enterprise Associates, Menlo Ventures, Lightspeed Venture Partners, Insight Venture Partners, and Silver Lake Kraftwerk.

As part of its IPO filing, Tintri reported recent financial performance highlighted by revenue gains alongside growing losses. The company's revenues surged from $50 million in fiscal 2015 to more than $125 million for fiscal 2017. However, net losses also grew from $70 million to $105.3 million over the same period.

Venturedeal, which said it is not financially connected with Tintri, estimated the IPO could be valued at around $100 million. However, the firm expressed concern about the company's slowing revenue growth and high cash burn.

"The company is growing revenues at a rate that is typical of successful enterprise IT software firms at this stage," wrote Donovan Jones from Venturedeal.com via SeekingAlpha. "However, the story is also typical in that growth rates are decelerating as the company exceeds $100 million in annual revenues. Additionally, [Tintri] is burning through large amounts of cash to achieve those diminishing growth rates, indicating a less-than-efficient sales model as the company scales."

Cloud storage startup Nasuni last year raised $25 million in funding, which pushed its total haul to $80.5 million. Competitor Panzura earlier this year raised $32 million in new funding, pushing its total investment raised to more than $80 million.

IDC reported the flash-based enterprise storage segment continues to post robust growth and is driving the overall enterprise storage market. During the first quarter, flash storage solutions generated $1.4 billion in revenue, a 75.7 percent year-over-year increase. The hybrid-flash array segment generated $2 billion in revenue, accounting for 22 percent of the total enterprise storage market.

"Spending on traditional external arrays continues to slowly shrink while spending on all-flash deployments once again posted strong growth and helped to drive the overall market," wrote Liz Conner, research manager for storage systems at IDC.

Dan Meyer is a Senior Editor at SDxCentral, with a focus on containers, lifecycle service orchestration, cloud automation and DevOps. Dan has been covering the telecommunications space for more than 17 years. Prior to SDxCentral, Dan was Editor-In-Chief at RCR Wireless News.

Read more here:
Tintri Files for IPO as Flash-Based Cloud Storage Market Surges - SDxCentral

Read More..

Dell EMC Hybrid Cloud System for Microsoft Review (Azure Pack) – StorageReview.com

June 16th, 2017 by StorageReview Enterprise Lab

The Dell EMC Hybrid Cloud System for Microsoft debuted in late 2015 as the first validated hybrid cloud system that implemented Microsoft Cloud Platform System (CPS) Standard. The Dell EMC Hybrid Cloud System for Microsoft combines PowerEdge hardware, Dell EMC Networking, and a software stack built with Windows Azure Pack and System Center 2012 R2. Dell EMC's vast engineering resources focused on creating a turnkey experience for new Dell EMC hybrid cloud administrators with a unified interface and a variety of licensing schemes. This was the company's second collaboration with Microsoft on Azure-based cloud systems. Prior to CPS Standard, the companies worked together to offer Microsoft CPS Premium, targeted at much larger deployments. The Dell EMC Cloud for Microsoft Azure Stack is on deck as the company's next Azure-based offering, expected to be available later this year.

The Cloud, and people's opinion of it, has gone through quite the evolution in the last few years. At first it was seen as a source of cheap bulk storage that wasn't safe. As time went on, the security concerns began to fade. Organizations weren't just using the Cloud as a source of bulk storage or a replication target; they soon began to host several of their applications in the Cloud. Now there are hundreds of organizations that are either cloud-first (the company begins and remains mainly in the cloud) or cloud-centric (the company still has on-prem gear but runs a majority of its business through the cloud). Of the three types of clouds (private, public, and hybrid), the fastest growing seems to be the hybrid version. Capitalizing on this, Dell EMC continues to work with Microsoft to deliver an on-prem hybrid cloud for Microsoft shops. Additionally, Dell EMC feels it offers an incredible amount of value to customers deploying these large, and many times complex, solutions by finding bugs and sorting out patch issues well before a customer gets their hands on them. That way, when Microsoft's Patch Tuesday comes around, Dell EMC's Hybrid Cloud Team makes the update process painless for the end customer. Dell EMC also notes value from integrations with value-added services like backup and encryption, along with its one-call support for the complete hardware-software stack and ongoing life-cycle management.

Dell EMC provided us remote access to a Hybrid Cloud System for Microsoft hosted at the Dell Customer Solution Center in Austin, Texas. Dell EMC's Hybrid Cloud System is built on PowerEdge C6320 servers with Intel Xeon E5-2600 v3 processors that host up to 400 virtual machines. PowerEdge R730 servers provide file server functionality, while Dell EMC MD1400/1420 DAS arrays are configured with between 32TB and 128TB of raw storage space. For network connectivity, Dell EMC Networking S4048 10G switches are leveraged.

The DHCS comes with some configurability for users that want more or less of certain aspects. For example, the minimum configuration is one S4048 switch for networking, one PowerEdge C6320 server for compute, and a cluster of four PowerEdge R730 servers for storage. The minimum configuration comes with no backup Data Protection Manager (DPM) servers. On the flip side, if users need the maximum of everything, the DHCS can be configured with two S4048 switches for redundancy, four PowerEdge C6320 servers (16 sleds for compute), three backup DPM servers, and six PowerEdge R730 servers (two storage hosts and four storage enclosures, all accessible to the compute nodes).

The overall value proposition of a hybrid cloud hinges on making it straightforward to granularly allocate resources to the local private cloud, as well as to offsite public- and private-cloud hosts. An integrated hybrid cloud environment could succeed or fail based on whether it creates a consistent user and administrative experience across all of the resources it manages. At this point in the evolution of cloud services, it is also vital that cloud infrastructure integrates seamlessly with backup and disaster recovery services.

On the operating system and software end, Dell EMC's implementation is based on Windows Server 2012 R2 with System Center 2012 R2 and the Windows Azure Pack. Azure is the center of the user experience, as well as the interface for most administrative tasks. During our testing for the review we had access to Azure Backup, Azure Site Recovery, and Azure Operational Insights. Dell's PowerEdge management system includes OpenManage Integration for System Center and Dell iDRAC 7 with LifeCycle Controller.

Dell Hybrid Cloud System for Microsoft Specifications

Management

Our review focuses on the experience of using the Windows Azure Pack for management, although System Center 2012 R2 is also available for "traditional" Windows administration workflows. While we were working with Azure, we wanted to be sure to experience the process of deploying infrastructure as a service, database as a service, and Azure's disaster recovery functions.

The Azure "tenant portal" is the center of the administrative experience. After selecting Azure Pack tenant, one simply needs to log in and the userwill be able to easily provision a new VM.

Once logged in, users see everything they have created, including VMs, gallery items, and databases. The left-hand side shows resource providers that are part of the plan. For new deployments, users need to click on the +New tab in the bottom-left corner.

After choosing +New, users see a pop-up screen with a variety of options. Here we select "Standalone Virtual Machine" from the options listed.

After selecting Standalone Virtual Machine, we are brought to the Standalone Virtual Machine gallery. Given a variety of choices, we are going with A2_Full. To the right of the selection is info about the VM.

Once users select the VM they want, they will be prompted to enter a name, their username and password, as well as a product key. After that, users need to indicate where the VM will be deployed. Once this is complete, users need only to click the checkmark in the bottom right-hand corner to finish the deployment.
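For administrators who prefer scripting to the portal wizard, the same provisioning flow can in principle be driven through the Windows Azure Pack tenant API. The Python sketch below is purely illustrative: the endpoint host, subscription ID, URL path, payload fields, and certificate-based authentication are all assumptions standing in for a real deployment's values, not documented API details.

```python
# Hypothetical sketch: creating a standalone VM through a Windows Azure
# Pack-style tenant REST endpoint. The URL path, payload fields, and
# client-certificate auth shown here are illustrative assumptions only;
# consult the actual tenant API documentation for a real deployment.
import requests

TENANT_API = "https://wap-tenant.example.local:30005"     # assumed host
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

def create_standalone_vm(name, template, username, password, product_key):
    """Mirror the portal wizard inputs: name, credentials, and product key."""
    payload = {
        "Name": name,                # VM name from the wizard
        "VMTemplateName": template,  # e.g. the A2_Full gallery item
        "LocalAdminUserName": username,
        "LocalAdminPassword": password,
        "ProductKey": product_key,
    }
    resp = requests.post(
        f"{TENANT_API}/{SUBSCRIPTION_ID}/services/systemcenter/vmm/VirtualMachines",
        json=payload,
        cert=("client.pem", "client.key"),  # tenant client certificate
        verify="ca.pem",
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    vm = create_standalone_vm("demo-vm01", "A2_Full", "admin", "S3cret!", "XXXXX-XXXXX")
    print("Provisioning request accepted:", vm)
```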

Next, we look at how to provision a Database as a Service (DBaaS). Going back to the Service Management portal, users simply choose MySQL Database this time around.

Once the database is selected, users are prompted with a window asking them to name the database and choose an edition. This is followed by a second window asking for credentials.

After the database is created, users again go back to the Service Management Portal. The DB1 database created a moment ago appears, along with some general information. For the next step, we will need the server name, which can be found under the info tab at the bottom. Once we have that, we click on +New to provision the VM role.

On the top of the left-hand side is the option for creating the VM Role.

Here we will be creating a WordPress instance tied to the database, and then scaling it out to multiple instances. We select WordPressExtDB and hit the arrow on the bottom right. The wizard prompts us to name the WordPress database (WP01 in this case) and choose the version, then continues to step through actions such as setting the compute name pattern, time zone, root account credentials, DNS domain name, and SSH key.

The last step uses the information from the previous steps to finish setting up the WordPress instance. Hitting the checkmark will deploy it.

Once the WordPress instance is deployed, users can set up an account or create a web site on the WordPress web front end. A WordPress account needs to be set up before the next step. Once the account has been set up, users can go back to the Azure tenant portal and click on the WordPress instance to define its role.

Within the role of the WordPress instance, simply select "Scale" and move the slider to the desired number of instances.

Finally, we will be configuring Azure as a recovery point for disaster recovery. The DHCS comes configured to use Azure Site Recovery, which means fewer steps for users. This feature is easy enough that, rather than showing a step-by-step walkthrough, it can be summed up in a few sentences. Users need to subscribe to a plan or add-on that has VM protection enabled. Users then create a virtual network defining how the VM will connect after failover. Finally, the user creates a VM for the failover.

Azure Operational Insights aggregates log data across platforms, operating systems, and clouds to provide enterprise-wide analysis.

The Dell EMC Hybrid Cloud Team feels its biggest value-add is simplifying the process for users to make updates and patches. An automated patch and update system for firmware, BIOS, drivers, and software is offered that is designed to be non-disruptive. The update framework includes intelligent dependency analysis that tests and packages patches and updates before deployment. This makes sure users spend their time managing their own needs, versus spending time making sure updates don't break existing functionality. Moreover, Dell EMC says that it is typical to have the hybrid cloud system operational in less than three hours.

Conclusion

The Azure-centric services deployment model is a change from traditional Microsoft server administration, but one that felt intuitive and polished during our time working with the Dell EMC Hybrid Cloud System. Intuitive user experience is growing more important as the number of applications and services required for business continues to increase in most sectors.

As a cloud-native management environment, Azure has been built from the ground up with the expectation that users and administrators may need to be able to provision services and storage with granular control over whether data and compute will be hosted in the private or public cloud. Azure's backup and disaster recovery functionality provides the means to implement a variety of common configurations in both regards, while at the same time simplifying the process to do so.

Combining years of Microsoft collaboration with the EMC heritage of turnkey hybrid cloud systems (Enterprise Hybrid Cloud and Native Hybrid Cloud), Dell EMC is now focusing its Azure-based solutions efforts on the forthcoming Dell EMC Cloud for Microsoft Azure Stack. Announced just ahead of Dell EMC World 2017 in May, this solution will combine all of Dell EMC's past experience into a similar, yet new, Microsoft-based hybrid cloud offering. Taking many of these concepts further, Dell EMC notes that its Azure Stack-based hybrid cloud system will move the experience from a disaggregated storage model (as in DHCS) to a true hyper-converged model (APIs will also allow users to write applications once and run them on any Azure cloud). Dell EMC says its long history with Microsoft, combined with its experience in turnkey hybrid cloud platforms, will give it a leg up as Azure Stack hits the market later this year. We look forward to the release of the Dell EMC Cloud for Microsoft Azure Stack and the opportunity to conduct a closer examination and review.


See more here:
Dell EMC Hybrid Cloud System for Microsoft Review (Azure Pack) - StorageReview.com

Read More..

XSS Just One Part of Broad Application Threat Landscape: Report – Web Host Industry Review

Only one out of 1,000 cross-site scripting attacks (0.1 percent) progresses far enough to require a security response, according to research released Tuesday by application security company tCell.

The State of In-Production Application Security report, drawn from analysis of more than 30 major enterprise applications in production, shows that over 40 percent of organizations experience account takeover attacks unrelated to software flaws over just a 30-day period. These attacks typically leverage large credential breaches, and 85 percent of them successfully compromise a user.

More than 90 percent of organizations have orphan application routes, or API endpoints which have been forgotten and left open. More than a quarter of companies have over 100 such vulnerabilities, which represent an attack surface with no business benefit, according to tCell.
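To make the idea concrete, here is a minimal sketch of what an orphan-route audit could look like. It assumes a Python Flask application and a hand-maintained allowlist of documented endpoints; both the routes and the allowlist are hypothetical stand-ins, not part of tCell's product.

```python
# Minimal sketch: flag "orphan" API routes in a Flask app, i.e. registered
# endpoints missing from the team's documented allowlist. The routes and
# the allowlist here are hypothetical stand-ins for a real application.
from flask import Flask

app = Flask(__name__)

@app.route("/api/users")
def users():
    return "documented endpoint"

@app.route("/api/v1/debug/dump")
def legacy_dump():
    return "forgotten endpoint left open"

DOCUMENTED_ROUTES = {"/api/users"}  # assumed source of truth (e.g. API docs)

def find_orphan_routes(flask_app):
    """Return registered URL rules absent from the documented set."""
    registered = {
        str(rule)
        for rule in flask_app.url_map.iter_rules()
        if str(rule) != "/static/<path:filename>"  # skip Flask's built-in
    }
    return sorted(registered - DOCUMENTED_ROUTES)

if __name__ == "__main__":
    for route in find_orphan_routes(app):
        print("Orphan route, attack surface with no business benefit:", route)
```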


"Many enterprise organizations start out thinking they have to replicate the traditional data center security stack for cloud environments," Michael Feiertag, tCell CEO, said in a statement. "The reality is that it's a different, far more dynamic world, with a lot of effort from the cloud provider on securing that infrastructure. Organizations need to focus on protecting what's theirs, the application, which enables all of the goodness that is cloud without weighing it down."

The report findings and insights about securing production applications gathered by tCell since it began broad customer deployments last year underscore the variety of application attack vectors and types, which the company says go beyond the OWASP Top 10.

Along with the report, tCell announced expanded product functionality and platform support. The company now supports enterprise .NET applications, and its latest release adds point-of-attack instrumentation to determine if command injection attempts have breached the app, and field-level encryption for increased data security in regulated industries like healthcare and financial services.

See more here:
XSS Just One Part of Broad Application Threat Landscape: Report - Web Host Industry Review

Read More..

Get ahead in quantum computing AND attract Goldman Sachs – eFinancialCareers


40 years ago a personal computer cost around $500k in today's money and was accessible only to large corporations. Today, as the cliché goes, that kind of processing power is available to most people in an affordable mobile phone. Quantum computing, however, is a different matter. Quantum computers are stuck in the 1950s: there aren't many of them, they cost tens of millions of dollars, and they take up entire rooms.

One of today's very rare and very costly quantum machines is being developed by D-Wave Systems Inc., a company whose CEO happens to be Vern Brownell, a former CTO of Goldman Sachs. Goldman is one of several lead investors in D-Wave, which it describes as having a head start in the field. While most quantum computing rivals are still in their infancy, D-Wave has already been using its system for machine learning. Competitors are eyeing the same plot: 1QB Information Technology Systems Inc (1QBit), a Vancouver-based quantum computing company, counts derivatives exchange CME Group among its investors. An RBS banker who led 1QBit's 2015 financing round told Bloomberg quantum computing is perfect for the data-rich, time-sensitive world of financial markets.

Interestingly, therefore, an opportunity has arisen to write machine learning algorithms for quantum computers and then implement them using the D-Wave 2000Q, the company's first commercially available quantum computer. Training on the system will be made available too.

The quantum machine learning program is being run by the Creative Destruction Lab (CDL), a seed funding program for science-based companies based in Toronto. Last month, it invited applications for 40 places on an initiative intended to develop and sponsor a wave of quantum machine learning start-ups. The next (and last) round of applications closes on Monday July 24th.

Daniel Mulet, associate director of machine learning at CDL, says they've already received 42 applications, around 10% of which are oriented towards financial services. "Some are very early stage and have been submitted by students, but others are companies that have already been launched," says Mulet. "There's one that's working with a hedge fund looking for patterns in trading data."

Traditional computers use binary code to solve problems: a bit can be a 1 or a 0. Quantum computers use qubits: a qubit can be a 1, a 0, or a 1 AND a 0 at the same time. As Bloomberg points out, two qubits therefore give you four potential states: 00, 01, 10, and 11. More generally, the number of states a quantum computer can take into consideration is 2 raised to the power of the number of qubits: if you had a 50-qubit universal quantum computer, you could explore 1.125 quadrillion states simultaneously.
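A quick back-of-the-envelope check of that arithmetic, as a minimal Python sketch:

```python
# Number of basis states an n-qubit register can represent is 2**n.
def state_count(qubits: int) -> int:
    return 2 ** qubits

print(state_count(2))   # 4 states: 00, 01, 10, 11
print(state_count(50))  # 1125899906842624, roughly 1.125 quadrillion
```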

"Quantum computers are able to process much larger quantities of data much faster," says Mulet. "It's our belief that these new quantum hardware platforms built by D-Wave or 1QBit will be used for various machine learning applications in the next few years. When that happens, we want to be ready to leverage that. One day all Bloomberg terminals will be run on quantum computers."

It's not hard to see why Goldman is interested.

If you're interested too and want to apply, you have 39 days to polish your application. As a further lure to candidates, those selected will be mentored by the likes of William Tunstall-Pedoe, a Cambridge AI entrepreneur, and Barney Pell, chief strategy officer at San Francisco-based Loco-Mobi (which is applying AI to parking your car). Those graduating from the program, which begins in September, will receive $80k in funding in return for 8% of the equity in their company.

Mulet says ideal applicants will have a Master's or PhD in a quantitative subject, and be proficient in programming in Python and the use of TensorFlow, Google's open source library for machine learning.

Contact: sbutcher@efinancialcareers.com

Photo credit: "Quantum foam" by Alex Sukontsev is licensed under CC BY 2.0.

Read this article:
Get ahead in quantum computing AND attract Goldman Sachs - eFinancialCareers

Read More..

KPN CISO details Quantum computing attack dangers – Mobile World Live

EXCLUSIVE INTERVIEW: Quantum computing will present a very real threat in the next ten years and operators will have to rethink how they handle their data privacy and security, KPN chief information security officer (CISO) Jaya Baloo (pictured) told Mobile World Live.

"When there is a viable quantum computer, it will change the way we handle the current mechanism to protect our data secrecy, which is cryptography," she explained, adding operators will have to rethink every type of cryptography they use and design new algorithms capable of resisting a quantum computing attack.

When it comes to operators offering personalised services, she said it is not possible to be 100 per cent privacy preserving while offering customised services.

However, operators should willingly and knowingly and very transparently inform customers about what they are doing with user data and how they maintain or securely delete that information.

"That's more important than the technology behind it. Having that dialogue is the most fundamental thing we can do."

She also shed light on the security implications of IoT and how KPN views the EU General Data Protection Regulation as "much more in line with our current way of working" rather than a burden.

Click here to watch the full interview.

Read the original here:
KPN CISO details Quantum computing attack dangers - Mobile World Live

Read More..

May and Macron’s internet security plan will make us ‘less safe’ – The National

HEADTEACHERS are to be given more control over learning and teaching as part of a package of sweeping new powers changing how Scotland's schools are run, Education Secretary John Swinney announced yesterday.

In return for having more power to choose staff and management structures, more choice on the curriculum and more control of funding, the heads will become responsible for raising attainment and closing the gap between the poorest and richest pupils in their schools.

Swinney told MSPs the reforms were all based on a simple plan.

"We will free our teachers to teach. We will put new powers in the hands of our headteachers. We will ensure that parents, families and communities play a bigger role in school life and in their children's learning."

Though welcoming parts of the proposals, both Labour and the Greens said the government needed to do more to recruit and keep teachers.

The Deputy First Minister's reforms were the result of a review launched last September.

Changes will be included in a new Education Governance Bill, to be introduced next year.

Swinney said the revamping of how schools were run would put the power to directly change lives into the hands of those with the expertise and insight to target resources at the greatest need.

He added: "The evidence is clear that the strength and quality of leadership in our schools is crucial to delivering improvement.

"We know that headteachers want to focus on delivery of learning and teaching, not be chief administrator of their school.

"We will, therefore, give headteachers more power over decisions on learning and teaching, freeing them to make a difference to the lives of children and young people."

Perhaps the biggest change will be money from central government bypassing councils and going straight to headteachers.

Swinney said he had already ruled out a fixed national funding formula to work out what schools should get.

Instead, a consultation on fair funding has been launched to consider how to distribute the cash.

There would also be more parental involvement in the running of schools, with this being underpinned by legislation. Pupil participation would also be made more effective and consistent.

One direct lift from the London Challenge scheme, which saw a huge turnaround in the results of secondary schools in the capital, will be new regional improvement collaboratives, in which heads share best practice and pool and strengthen resources as teachers move from one school in an area to another.

Swinney told MSPs: "Improving the education and life chances of our children and young people is the defining mission of this government. While there are many strengths in Scottish education, recent Pisa and literacy scores underline that we can, and we must, achieve more."

Other measures contained in the reforms include new routes into the profession, and changes to teacher training courses.

Proposals to establish an Education Workforce Council for Scotland, bringing the General Teaching Council for Scotland together with other training bodies, have also been put forward.

However, the minister rejected a bid by a group of parents who want to see St Joseph's Catholic primary school in Glasgow removed completely from the control of the local authority.

"The reforms I am setting out today will significantly increase the autonomy of our schools, the role of parents in school life and ensure our schools are rooted in their communities," he said.

"Crucially, however, the reforms deliver this within a clear national and local framework of policy and support.

"I therefore cannot agree to pursue the specific proposals from parents at St Joseph's and elsewhere as they would remove schools from that crucial support structure."

Conservative education spokesman Liz Smith welcomed greater devolution of power to teachers but said more radical reform is needed.

She said: "We do not believe these reforms go far enough, particularly when it comes to extending choice and allowing schools to opt out of local authority rule if that's what parents and teachers want."

Labour's Iain Gray welcomed the proposals but said Swinney needed to put more money towards teaching.

"Consultation responses to the governance review from teachers, from parents, from educationalists and from councils all said the same thing: that the first reform we need is more teachers, properly paid, properly supported and properly resourced."

Ross Greer agreed, saying the reforms would not resolve the key issue in Scottish education of staff shortages.

Go here to see the original:
May and Macron's internet security plan will make us 'less safe' - The National

Read More..

Encryption Definition | Investopedia

DEFINITION of 'Encryption'

Encryption is a means of securing data using a password (key). The encryption process is simple: data is secured by translating information using an algorithm and a binary key. When the data needs to be read back, the code is decrypted using either the same key or a different key, depending on the type of encryption used.

Encryption strength is based on the length of the security key. In the latter quarter of the 20th century, 40-bit encryption, which uses a key with 2^40 possible permutations, and 56-bit encryption were standard. Those keys were breakable through brute-force attacks by the end of the century, and the 128-bit system became standard in web browsers. The Advanced Encryption Standard (AES) is a protocol for data encryption created in 2001 by the U.S. National Institute of Standards and Technology. AES uses a 128-bit block size, but key lengths of 128, 192 and 256 bits. AES uses a symmetric-key algorithm, meaning the same key is used for both encrypting and decrypting the data. 128-bit encryption is standard, but most banks, militaries, and governments use 256-bit encryption.
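As a minimal, hands-on illustration of the symmetric-key case described above, the sketch below encrypts and decrypts a message with a 256-bit AES key, using the same key for both operations. It assumes the third-party Python cryptography package is installed; the message contents are illustrative only.

```python
# Minimal sketch of symmetric (same-key) AES encryption with a 256-bit key,
# using the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit symmetric key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # must be unique per message

ciphertext = aesgcm.encrypt(nonce, b"account balance: 1,234", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)  # same key decrypts

assert plaintext == b"account balance: 1,234"
print("ciphertext:", ciphertext.hex())
```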

See the original post:
Encryption Definition | Investopedia

Read More..