
Parascript and SFORCE Partner to Leverage Machine Learning Eliminating Barriers to Automation – GlobeNewswire

Longmont, CO, Feb. 09, 2021 (GLOBE NEWSWIRE) -- Parascript, which provides document analysis software that processes over 100 billion documents each year, announced today the Smart-Force (SFORCE) and Parascript partnership to provide a digital workforce that augments operations by combining cognitive Robotic Process Automation (RPA) technology with customers' current investments for high scalability, improved accuracy and an enhanced customer experience in Mexico and across Latin America.

"Partnering with Smart-Force means we get to help solve some of the greatest digital transformation challenges in Intelligent Document Processing instead of just the low-hanging fruit. Smart-Force is forward-thinking and committed to futureproofing their customers' processes, even with hard-to-automate, unstructured documents where the application of techniques such as NLP is often required," said Greg Council, Vice President of Marketing and Product Management at Parascript. "Smart-Force leverages bots to genuinely collaborate with staff so that the staff no longer have to spend all their time finding information and performing data entry and verification, even for the most complex multi-page documents that you see in lending and insurance."

Smart-Force specializes in digital transformation by identifying processes in need of automation and implementing RPA to improve those processes so that they run faster without errors. SFORCE routinely enables increased productivity, improves customer satisfaction, and improves staff morale through leveraging the technology of Automation Anywhere, Inc., a leader in RPA, and now Parascript Intelligent Document Processing.

"As intelligent automation technology becomes more ubiquitous, it has created opportunities for organizations to ignite their staff towards new ways of working, freeing up time from manual tasks to focus on creative, strategic projects, what humans are meant to do," said Griffin Pickard, Director of Technology Alliance Program at Automation Anywhere. "By creating an alliance with Parascript and Smart-Force, we have enabled customers to advance their automation strategy by leveraging ML and accelerate end-to-end business processes."

"Our focus at SFORCE is on RPA with Machine Learning to transform how customers are doing things. We don't replace; we complement the technology investments of our customers to improve how they are working," said Alejandro Castrejón, Founder of SFORCE. "We make processes faster and more efficient, and augment their staff capabilities. In terms of RPA processes that focus on complex document-based information, we haven't seen anything approach what Parascript can do."

"We found that Parascript does a lot more than other IDP providers. Our customers need a point-to-point RPA solution. Where Parascript software becomes essential is in extracting and verifying data from complex documents such as legal contracts. Manual data entry and review produces a lot of errors and takes time," said Barbara Mair, Partner at SFORCE. "Using Parascript software, we can significantly accelerate contract execution, customer onboarding and many other processes without introducing errors."

The ability to process simple to very complex documents such as unstructured contracts and policies within RPA leveraging FormXtra.AI represents real opportunities for digital transformation across the enterprise. FormXtra.AI and its Smart Learning allow for easy configuration, and by training the systems on client-specific data, the automation is rapidly deployed with the ability to adapt to new information introduced in dynamic production environments.

About SFORCE, S.A. de C.V.

SFORCE offers services that allow customers to adopt digital transformation at whatever pace the organization needs. SFORCE is dedicated to helping customers get the most out of their existing investments in technology. SFORCE provides point-to-point solutions that combine existing technologies with next generation technology, which allows customers to transform operations, dramatically increase efficiency as well as automate manual tasks that are rote and error-prone, so that staff can focus on high-value activities that significantly increase revenue. From exploring process automation to planning a disruptive change that ensures high levels of automation, our team of specialists helps design and implement the automation of processes for digital transformation. Visit SFORCE.

About Parascript

Parascript software, driven by data science and powered by machine learning, configures and optimizes itself to automate simple and complex document-oriented tasks such as document classification, document separation and data entry for payments, lending and AP/AR processes. Every year, over 100 billion documents involved in banking, insurance, and government are processed by Parascript software. Parascript offers its technology both as software products and as software-enabled services to our partners. Visit Parascript.


How Blockchain and Machine Learning Impact the Education System – ABCmoney.co.uk

Over the years, digital transformation has modified the way people and organizations function. While research is carried out to find ways of integrating technology into the traditional sectors of a country, some noteworthy technologies have surfaced.

Amongst them are blockchain and machine learning.

What are blockchain and machine learning?

Blockchain is an immutable ledger that aids in maintaining the records of transactions and tracking assets in an organization.

The assets can be tangible or intangible, and the transactions may refer to cash inflows and outflows.

Blockchain is playing a significant role in many organizations due to several reasons.

First, with this latest technology, anything can be traded and tracked, which minimizes risk and cuts costs. As a result, a business can employ fewer accountants and efficiently manage its accounts with minimal to zero errors.

Secondly, blockchain management helps track orders, production processes, and payments that are to be made to the business itself or others.

Lastly, blockchain stores information with great secrecy, which gives more confidence and a sense of security to the business. Therefore, a business can significantly benefit from the increased efficiency, which may lead to economies of scale. As a result, decreased average costs will provide the business with more opportunities.
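To make the "immutable ledger" idea concrete, here is a minimal sketch in Python of a hash-chained record store; the field names and transactions are invented for illustration, not taken from any particular blockchain:

```python
import hashlib
import json
import time

def block_hash(block):
    # Hash the block's contents (everything except its own hash) deterministically.
    payload = json.dumps(
        {k: block[k] for k in ("index", "timestamp", "data", "prev_hash")},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(index, data, prev_hash):
    block = {"index": index, "timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

# Build a tiny chain of transaction records.
chain = [make_block(0, "genesis", "0" * 64)]
for tx in ("A pays B 10", "B pays C 4"):
    chain.append(make_block(len(chain), tx, chain[-1]["hash"]))

def verify(chain):
    # Editing any earlier block changes its hash and breaks every later link.
    return all(
        b["hash"] == block_hash(b) and b["prev_hash"] == chain[i - 1]["hash"]
        for i, b in enumerate(chain) if i > 0
    )

print(verify(chain))                 # True
chain[1]["data"] = "A pays B 1000"   # tamper with a past record
print(verify(chain))                 # False
```

Because each block commits to the hash of its predecessor, tampering with any past record is immediately detectable, which is the property that lets a business trust the ledger without a central bookkeeper.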

On the other hand, machine learning is a type of Artificial Intelligence that allows a system to learn from data rather than through explicit programming. Nonetheless, it is not a simple procedure.

Furthermore, a machine-learning model is the outcome of training a machine-learning algorithm. Once the machine is trained, it produces an output for a given input.

There are various approaches to machine learning, chosen based on the volume and kind of data.

These approaches include supervised learning, unsupervised learning, reinforcement learning, and deep learning.
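As a concrete taste of the first of these, here is a minimal supervised-learning example in Python with scikit-learn (the dataset and model choice are ours, purely for illustration): the algorithm is trained on labeled examples and then predicts labels for data it has not seen.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Training the algorithm on labeled data produces the model...
model = DecisionTreeClassifier().fit(X_train, y_train)

# ...which then maps new inputs to outputs.
print(model.predict(X_test[:5]))
print(model.score(X_test, y_test))   # accuracy on unseen examples
```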

If you are a researcher or student who wants to write a dissertation or thesis on blockchain or artificial intelligence, you can visit Researchprospect.com and find blockchain and artificial intelligence topics for dissertations.

Impact of blockchain on education system

Since the functioning of organizations has been modified by this newfound technology, the education system will be directly impacted in many ways.

Maintaining student records

Academic records are among the most demanding documents to maintain. Labor-intensive tasks such as these consume time, leading to inefficiencies and a greater risk of mistakes. However, blockchain technology ensures accuracy and efficiency.

Moreover, certifying students who are enrolled in a course is another tedious task. It becomes even more challenging at the university level to compare the coursework of students and establish their credibility. Done manually, the information must be stamped and signed for authentication.

However, with blockchain, a person can gain access to the verified record of a student's academic course and achievements.

Issuance of certificates

Imagine how tiring it would be to print gazillions and gazillions of certificates, sign them off and award them. Though this has been happening for years, it is undoubtedly a challenging task.

Therefore, blockchain has brought much ease. A student's certificates, diplomas, and degrees can be stored and issued with just a few clicks.

In this way, employers will only need a link to access the diploma, instead of viewing a paper copy of the certificates.

This is not only eco-friendly, but it will prevent students from submitting fake diplomas and certificates.

Aside from diplomas and degrees, a resume has other elements that an employer might look at. These include foreign languages, special abilities, technical knowledge, and extracurricular activities. However, a person will need verification to prove they learned these skills over time.

This authentication comes from the badges and certificates. Therefore, if you store these on the blockchain, it will verify the existence of your skills conveniently.

Impact of machine learning on education system

Learning analytics

Machine learning can aid teachers in gaining access to data that is complex yet important, and computers can help them perform routine tasks. As a result, teachers can derive conclusions that positively affect the learning process.

Predictive analytics

Furthermore, machine learning can help analyze and derive conclusions about situations that may happen in the future. If a teacher wants to use the data of school students, they can do so within minutes. It can also help the administration know if a student fails to achieve a certain level. Aside from this, predictive analytics can forecast a student's future grade to provide direction to teachers.

Adaptive learning

Adaptive learning is a tech-based education system that evaluates a student's performance and modifies learning methods accordingly.

Therefore, machine learning can aid struggling students or students with different abilities.

Personalized learning

On the other hand, personalized learning is an education system that guides every student according to their capability.

Henceforth, students can pick out their interests through machine learning, and teachers can fit the curriculum accordingly.

Improved efficiency

Machine learning can make the education system more efficient by providing detailed analysis and completing work related to classroom management. The teacher can efficiently manage databases to maintain records and plan out the schedule for the coming weeks.

They can refer to these records whenever they need to. Therefore, machine learning will save not only the teacher's energy but their time as well.

Assessments

Did you imagine artificial intelligence could test students? Machine learning can be used to grade students' assignments and assessments alongside exams.

Though assessing students through machine-learning might require some human effort, it will surely provide extraordinarily valid and reliable results.

Teachers can feel confident of the grades' accuracy, while students can be sure that grades have been awarded fairly and on equal merit.

Conclusively, technological advancement has dramatically revolutionized the educational sector. In the coming years, blockchain and machine learning will continue to impact the education system positively. However, this comes with inevitable repercussions as well. Rapid capital-intensity means that manual workers will no longer be needed to perform various functions, which will cause significant unemployment sooner or later. As a result, governments might face difficulties in maintaining the right economic conditions. Lastly, automation such as blockchain and machine learning involves costly procedures that may not be affordable for every institute.


If you know nothing about deep learning with Python, start here – TechTalks

This article is part of AI education, a series of posts that review and explore educational content on data science and machine learning. (In partnership with Paperspace)

Teaching yourself deep learning is a long and arduous process. You need a strong background in linear algebra and calculus, good Python programming skills, and a solid grasp of data science, machine learning, and data engineering. Even then, it can take more than a year of study and practice before you reach the point where you can start applying deep learning to real-world problems and possibly land a job as a deep learning engineer.

Knowing where to start, however, can help a lot in softening the learning curve. If I had to learn deep learning with Python all over again, I would start with Grokking Deep Learning, written by Andrew Trask. Most books on deep learning require a basic knowledge of machine learning concepts and algorithms. Trask's book teaches you the fundamentals of deep learning without any prerequisites aside from basic math and programming skills.

The book won't make you a deep learning wizard (and it doesn't make such claims), but it will set you on a path that will make it much easier to learn from more advanced books and courses.

Most deep learning books are based on one of several popular Python libraries such as TensorFlow, PyTorch, or Keras. In contrast, Grokking Deep Learning teaches you deep learning by building everything from scratch, line by line.

You start with developing a single artificial neuron, the most basic element of deep learning. Trask takes you through the basics of linear transformations, the main computation done by an artificial neuron. You then implement the artificial neuron in plain Python code, without using any special libraries.

This is not the most efficient way to do deep learning, because Python has many libraries that take advantage of your computer's graphics card and the parallel processing power of your CPU to speed up computations. But writing everything in vanilla Python is excellent for learning the ins and outs of deep learning.

In Grokking Deep Learning, your first artificial neuron will take a single input, multiply it by a random weight, and make a prediction. You'll then measure the prediction error and apply gradient descent to tune the neuron's weight in the right direction. With a single neuron, single input, and single output, understanding and implementing the concept becomes very easy. You'll gradually add more complexity to your models, using multiple input dimensions, predicting multiple outputs, applying batch learning, adjusting learning rates, and more.
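In that spirit, here is a minimal sketch (ours, not code from the book) of a single neuron learning a single weight by gradient descent in plain Python:

```python
input_value, goal = 2.0, 0.8
weight = 0.5    # start from an arbitrary weight
alpha = 0.1     # learning rate

for step in range(20):
    prediction = input_value * weight        # the neuron's entire computation
    delta = prediction - goal                # how far off we are, and in which direction
    error = delta ** 2                       # squared error we want to shrink
    weight -= alpha * delta * input_value    # move the weight against the gradient
    print(step, round(error, 6))
```

Run it and the printed error shrinks toward zero: each update moves the weight a little in the direction that reduces the squared error.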

And you'll implement every new concept by gradually adding and changing bits of Python code you've written in previous chapters, gradually creating a roster of functions for making predictions, calculating errors, applying corrections, and more. As you move from scalar to vector computations, you'll shift from vanilla Python operations to NumPy, a library that is especially good at parallel computing and is very popular among the machine learning and deep learning community.

With the basic building blocks of artificial neurons under your belt, you'll start creating deep neural networks, which is basically what you get when you stack several layers of artificial neurons on top of each other.

As you create deep neural networks, you'll learn about activation functions and apply them to break the linearity of the stacked layers and create classification outputs. Again, you'll implement everything yourself with the help of NumPy functions. You'll also learn to compute gradients and propagate errors through layers to spread corrections across different neurons.
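For a flavor of what that looks like, here is a compact sketch (again ours, with invented toy data) of a two-layer network in NumPy, with a forward pass and the error propagated back through both layers:

```python
import numpy as np

np.random.seed(0)
X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]])
y = np.array([[1.0], [1.0], [0.0], [0.0]])   # XOR-style toy targets

W1 = np.random.randn(2, 4) * 0.5   # first layer weights
W2 = np.random.randn(4, 1) * 0.5   # second layer weights

for epoch in range(2000):
    # Forward pass through the stacked layers (ReLU breaks the linearity).
    hidden = np.maximum(0, X @ W1)
    pred = hidden @ W2
    # Backward pass: propagate the error through each layer.
    delta2 = pred - y
    delta1 = (delta2 @ W2.T) * (hidden > 0)   # gradient through ReLU
    W2 -= 0.01 * hidden.T @ delta2
    W1 -= 0.01 * X.T @ delta1

print(np.round(pred, 2))   # predictions after training
```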

As you get more comfortable with the basics of deep learning, you'll get to learn and implement more advanced concepts. The book features some popular regularization techniques such as early stopping and dropout. You'll also get to craft your own version of convolutional neural networks (CNN) and recurrent neural networks (RNN).

By the end of the book, you'll pack everything into a complete Python deep learning library, creating your own class hierarchy of layers, activation functions, and neural network architectures (you'll need object-oriented programming skills for this part). If you've already worked with other Python libraries such as Keras and PyTorch, you'll find the final architecture to be quite familiar. If you haven't, you'll have a much easier time getting comfortable with those libraries in the future.

And throughout the book, Trask reminds you that practice makes perfect; he encourages you to code your own neural networks by heart without copy-pasting anything.

Not everything about Grokking Deep Learning is perfect. In a previous post, I said that one of the main things that defines a good book is the code repository. And in this area, Trask could have done a much better job.

The GitHub repository of Grokking Deep Learning is rich with Jupyter Notebook files for every chapter. Jupyter Notebook is an excellent tool for learning Python machine learning and deep learning. However, the strength of Jupyter is in breaking down code into several small cells that you can execute and test independently. Some of Grokking Deep Learning's notebooks are composed of very large cells with big chunks of uncommented code.

This becomes especially problematic in the later chapters, where the code becomes longer and more complex, and finding your way in the notebooks becomes very tedious. As a matter of principle, the code for educational material should be broken down into small cells and contain comments in key areas.

Also, Trask has written the code in Python 2.7. While he has made sure that the code also works smoothly in Python 3, it contains old coding techniques that have become deprecated among Python developers (such as using the for i in range(len(array)) paradigm to iterate over an array).
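For example (our illustration, not the book's code), the older pattern and its idiomatic replacement:

```python
array = [10, 20, 30]

# The older style used in the book:
for i in range(len(array)):
    print(i, array[i])

# Idiomatic modern Python:
for i, value in enumerate(array):
    print(i, value)
```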

Trask has done a great job of putting together a book that can serve both newbies and experienced Python deep learning developers who want to fill the gaps in their knowledge.

But as Tywin Lannister says (and every engineer will agree), "there's a tool for every task, and a task for every tool." Deep learning isn't a magic wand that can solve every AI problem. In fact, for many problems, simpler machine learning algorithms such as linear regression and decision trees will perform as well as deep learning, while for others, rule-based techniques such as regular expressions and a couple of if-else clauses will outperform both.

The point is, you'll need a full arsenal of tools and techniques to solve AI problems. Hopefully, Grokking Deep Learning will help get you started on the path to acquiring those tools.

Where do you go from here? I would certainly suggest picking up an in-depth book on Python deep learning such as Deep Learning With PyTorch or Deep Learning With Python. You should also deepen your knowledge of other machine learning algorithms and techniques. Two of my favorite books are Hands-on Machine Learning and Python Machine Learning.

You can also pick up a lot of knowledge browsing machine learning and deep learning forums such as the r/MachineLearning and r/deeplearning subreddits, the AI and deep learning Facebook group, or by following AI researchers on Twitter.

The AI universe is vast and quickly expanding, and there is a lot to learn. If this is your first book on deep learning, then this is the beginning of an amazing journey.



Immunai raises $60M as it expands from improving immune therapies to discovering new ones, too – TechCrunch

Just three years after its founding, biotech startup Immunai has raised $60 million in Series A funding, bringing its total raised to over $80 million. Despite its youth, Immunai has already established the largest database in the world for single-cell immunity characteristics, and it has already used its machine learning-powered immunity analytics platform to enhance the performance of existing immunotherapies. Aided by this new funding, it's now ready to expand into the development of entirely new therapies based on the strength and breadth of its data and ML.

Immunai's approach to developing new insights around the human immune system uses a multiomic approach, essentially layering analysis of different types of biological data, including a cell's genome, microbiome, epigenome (a genome's chemical instruction set) and more. The startup's unique edge is in combining the largest and richest data set of its type available, formed in partnership with world-leading immunological research organizations, with its own machine learning technology to deliver analytics at unprecedented scale.

"I hope it doesn't sound corny, but we don't have the luxury to move more slowly," explained Immunai co-founder and CEO Noam Solomon in an interview. "Because I think that we are in kind of a perfect storm, where a lot of advances in machine learning and computation have led us to the point where we can actually leverage those methods to mine important insights. You have a limit or ceiling to how fast you can go by the number of people that you have, so I think with the vision that we have, and thanks to our very large network, from MIT and Cambridge to Stanford in the Bay Area, and Tel Aviv, we just moved very quickly to harness people to say, let's solve this problem together."

Solomon and his co-founder and CTO Luis Voloch both have extensive computer science and machine learning backgrounds, and they initially connected and identified a need for the application of this kind of technology in immunology. Scientific co-founder and SVP of Strategic Research Danny Wells then helped them refine their approach to focus on improving efficacy of immunotherapies designed to treat cancerous tumors.

Immunai has already demonstrated that its platform can help identify optimal targets for existing therapies, including in a partnership with the Baylor College of Medicine where it assisted with a cell therapy product for use in treating neuroblastoma (a type of cancer that develops from immature nerve cells, often in the adrenal glands). The company is now also moving into new territory, using its machine learning platform and industry-leading cell database for new therapy discovery: not only identifying and validating targets for existing therapies, but helping to create entirely new ones.

"We're moving from just observing cells to actually going and perturbing them, and seeing what the outcome is," explained Voloch. "This, from the computational side, later allows us to move from correlative assessments to actually causal assessments, which makes our models a lot more powerful. Both on the computational side and on the lab side, these are really bleeding-edge technologies that I think we will be the first to really put together at any kind of real scale."

"The next step is to say, okay, now that we understand the human immune profile, can we develop new drugs?" said Solomon. "You can think about it like we've been building a Google Maps for the immune system for a few years, so we are mapping different roads and paths in the immune system. But at some point, we figured out that there are certain roads or bridges that haven't been built yet. And we will be able to support building new roads and new bridges, hopefully leading from current states of disease, or cities of disease, to building cities of health."


Key Performance Metrics that Measure Impact of AIOps on Enterprises – eWeek

Staffing levels within IT operations (ITOps) departments are flat or declining, enterprise IT environments grow more complex by the day, and the transition to the cloud is accelerating. Meanwhile, the volume of data generated by monitoring and alerting systems is skyrocketing, and operations teams are under pressure to respond faster to incidents.

Faced with these challenges, companies are increasingly turning to AIOps, the use of machine learning and artificial intelligence to analyze large volumes of IT operations data, to help automate and optimize IT operations. Yet before investing in a new technology, leaders want confidence that it will indeed bring value to end users, customers and the business at large.

Leaders looking to measure the benefits of AIOps and build key performance indicators (KPIs) for both IT and business audiences should focus on key factors such as uptime, incident response and remediation time, and predictive maintenance, so that potential outages affecting employees and customers can be prevented.

Business KPIs connected to AIOps include employee productivity, customer satisfaction and website metrics such as conversion rate or lead generation. Bottom line: AIOps can help companies cut IT operations costs through automation and rapid analysis, and it can support revenue growth by enabling business processes to run smoothly and with excellent user experiences.

These common KPIs, provided for this eWEEK Data Points article by Ciaran Byrne, VP of Product Management at OpsRamp, can measure the impact of AIOps on business processes.

Mean time to detect (MTTD) refers to how quickly an issue is identified. AIOps can help companies drive down MTTD through the use of machine learning to detect patterns, block out the noise and identify outages. Amid an avalanche of alerts, ITOps can understand the importance and scope of an issue, which leads to faster identification of an incident, reduced downtime and better performance of business processes.

Once an issue has been detected, IT teams need to acknowledge the issue and determine who will address it. AIOps can use machine learning to automate that decision making process and quickly make sure that the right teams are working on the problem.

When a key business process or application goes down, speedy restoration of service is key. ITOps plays an important role in using machine learning to understand if the issue has been seen previously and, based on past experiences, to recommend the most effective way to get the service back up and running.

Service availability is often expressed as a percentage of uptime over a period of time, or as outage minutes per period of time. AIOps can help boost it through the application of predictive maintenance.

Increasingly, organizations are leveraging intelligent automation to resolve issues without manual intervention. Machine learning techniques can be trained to identify patterns, such as previous scripts that had been executed to remedy a problem, and take the place of a human operator.

IT operations should be able to detect and remediate a problem before the end user is even aware of it. For example, if application performance or Web site performance is slowing down by milliseconds, ITOps wants to get an alert and fix the issue before the slowdown worsens and affects users. AIOps enables the use of dynamic thresholds to ensure that alerts are generated automatically and routed to the correct team for investigation or auto-remediated when policies dictate.
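As a rough sketch of what a dynamic threshold means in practice (our illustration; real AIOps products use far more sophisticated models), here is a rolling-statistics alert in Python with invented latency numbers:

```python
import statistics
from collections import deque

window = deque(maxlen=30)   # the most recent latency samples, in ms

def check(sample_ms):
    # Alert when a sample exceeds a threshold derived from recent history,
    # so the threshold adapts to the metric instead of being hard-coded.
    if len(window) >= 10:   # wait for some history before judging
        mean = statistics.fmean(window)
        stdev = statistics.pstdev(window)
        threshold = mean + 3 * stdev
        if sample_ms > threshold:
            print(f"ALERT: {sample_ms:.1f} ms exceeds dynamic threshold {threshold:.1f} ms")
    window.append(sample_ms)

for ms in [52, 50, 49, 53, 51, 50, 52, 48, 51, 50, 49, 50, 95]:
    check(ms)
```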

The use of AIOps, whether to perform automation or to identify and resolve issues more quickly, will result in savings in both operator time and business time to value. These have a direct impact on the bottom line.

These KPIs can be correlated to business KPIs around user experience, application performance, customer satisfaction, improved e-commerce sales, employee productivity, and increased revenue. ITOps teams need the ability to quickly connect the dots between infrastructure and business metrics so that IT is prioritizing spend and effort on real business needs. Hopefully, as machine learning matures, AIOps tools can recommend ways to improve business outcomes or provide insights as to why digital programs succeed or miss the mark.

If you have a suggestion for an eWEEK Data Points article, email cpreimesberger@eweek.com.


Cornami and Inpher deliver Fully Homomorphic Encryption to modern innovators – Help Net Security

Cornami and Inpher announced their partnership to collaborate on delivering commercially viable Fully Homomorphic Encryption (FHE) functionality to the market.

FHE has long been described as transformative for data privacy and cloud security, as it enables computing on encrypted data sets, thereby keeping the underlying data secure. However, existing FHE algorithms are computationally intensive and have often been considered not yet practical for real-world applications.
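The underlying idea is easiest to see with a partially homomorphic scheme. Here is a toy sketch using the python-paillier library (phe), which supports addition on ciphertexts; fully homomorphic schemes extend this to arbitrary computation, at much greater computational cost:

```python
# pip install phe
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Data owners encrypt their values; the processing party never sees plaintext.
a = public_key.encrypt(15)
b = public_key.encrypt(27)

# The processing party computes directly on the ciphertexts.
encrypted_sum = a + b

# Only the private-key holder can decrypt the result.
print(private_key.decrypt(encrypted_sum))   # 42
```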

Cornami's partnership with Inpher overcomes such limitations to deliver real-time FHE computing to a ready and rapidly expanding market.

Inpher is a cryptographic Secret Computing company that powers privacy-preserving AI and analytics. Secret Computing has brought years of academic research in secure Multi-Party Computation (MPC) and FHE into commercially-ready applications.

With Secret Computing technology, data scientists can finally build advanced analytics and Artificial Intelligence (AI) models on distributed data sources without ever exposing or transferring sensitive data across departments, organizations, or jurisdictions.

Secret Computing ensures that data privacy and security are retained under the privacy requirements of organizations during these computations, including being GDPR compliant.

Cornami has developed a breakthrough computing-hardware architecture that massively scales application performance without penalties to deliver real-time computing for critical and complex applications that are out of reach of traditional, legacy processors.

The Cornami technology reduces power and latency while vastly increasing compute performance for today's immense datasets requiring real-time computing.

"Cornami's unique and scalable computational fabric is the first compute platform we've seen that can meet FHE's substantial computing requirements, hitting both performance and cost requirements for a commercially viable product. We look forward to working with Cornami to provide unrestricted FHE solutions to the market," said Dimitar Jetchev, CTO and co-founder of Inpher, Inc.

The two companies are collaborating to enable Inphers advanced cryptographic products, in which data remains protected while being processed, to execute on the Cornami computing platform.

Cornami's unique next-generation post-von Neumann architecture enables massive parallelism and pipelining capabilities at scale, delivering compute-intensive FHE for commercially viable real-time products and services.

Commercially viable FHE provides quantum-secure privacy-preserving computing on encrypted data sets, keeping the underlying data secure. In essence: assume that computing environments are compromised, so secure the data.

The data, including its unrestricted computational derivatives, remains encrypted at rest and throughout its processing life cycle. Data is only decrypted to plaintext in data owner-controlled environments.

FHE is based on notoriously hard mathematical lattice problems that are secure against attacks even by quantum computers. This feature allows provable data security even in untrusted computing environments like public cloud platforms.

Datasets from single or multiple sources can be encrypted using the same public FHE key. The data is then aggregated into an encrypted database, where AI (both training and inference) and analytic algorithms can be applied at commercially viable speeds.

Results can be decrypted by a third party without ever exposing plaintext data. This feature enables the cooperation of multiple, independent organizations such as healthcare companies, governments, financial institutions and more to gain insights into their collective data while preserving the privacy of that data's contents.

In order to achieve value from data analytics, including those driving the AI and ML market, the industry needs to address the protection of the data.

Massive security breaches bring regulatory pressure to restrict data collection and cloud-based information, along with the realization that, regardless of effort, attack points will always exist in computing environments protected only by perimeter-based security measures.

The cryptography experts at Inpher have developed quantum-safe cryptographic primitives and protocols to encrypt data and perform arbitrary computation on that encrypted data that returns an encrypted result.

Combining Cornami and Inpher technologies will enable future-proof and provable data security. These are computationally intensive mathematical algorithms that are highly unsuitable for today's processors.

"We greatly value our partnership with Inpher to enable, for the first time, the performance required to deliver commercially viable FHE products in real-time," said Paul Master, CTO and co-founder of Cornami.

Cornami and Inpher have completed several technical exploratory milestones to understand their harmonious fit. Both companies are engaged in delivering market-ready solutions that address multiple customers' needs to compute on encrypted datasets.


Army Ant Limited Announces the Release of Its Encryption Mixer ANTUSDT – Yahoo Finance


LONDON, Feb. 10, 2021 (GLOBE NEWSWIRE) -- Recently, Army Ant has announced the release of its encryption mixer (coin shuffle), ANTUSDT.

"Weak Anonymity" vs. "Black Money", which one would users prefers? Digital currency is actually not completely anonymous (pseudo-anonymity), so the encryption mixer was born ANTUSDT.

A digital currency address is not by itself linked to a real identity in real life, but if things are not done properly, it can be. People can trace a particular transaction by associating multiple nodes in the blockchain, and then, through the analysis of blockchain data and KYC/AML data, determine who sent the transaction, and even more details, such as the location and reason of the transaction.

The principle of coin shuffle:

What is coin shuffle?

Speaking of digital currency, it is easy to think of its two characteristics: one is decentralization, the other is anonymity. But the anonymity of cryptocurrencies is limited.

Although real-name authentication is not required, and the user's real identity cannot be matched by the address alone, transactions on the blockchain are public. If someone deliberately looks, clues can be found through big data analysis, unless every user is as cautious as Satoshi Nakamoto. However, there is a service that can provide users with sufficiently strong privacy protection: coin shuffle services.

Coin shuffle, as the name implies, is to mix coins from different issuing addresses and then send them out. Through this process, the correspondence between the output address and input address of a transaction is cut off, thereby better protecting the privacy of users.

In fact, the process of coin shuffle is like many people throwing coins into a wishing pool. If everyone throws in 1-yuan coins, people can see who throws in a coin and when, but once the staff collect and sort these coins, no one can tell which coin was thrown in by whom.


Since the Bitcoin blockchain is a public ledger, it records every transaction on the Bitcoin network, and after coin shuffle, it is impossible to know which incoming transaction should correspond to which outgoing transaction. It is difficult to find out where and how much the cryptocurrency of the trader is stored, which protects user privacy to the greatest extent.
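As a purely illustrative toy (ours, with made-up addresses, not ANTUSDT's actual mechanism), the core idea of breaking the input-to-output correspondence can be sketched in a few lines of Python:

```python
import random

# Equal-value deposits into the pool; senders are visible on the public ledger.
deposits = [("alice_addr", 1.0), ("bob_addr", 1.0), ("carol_addr", 1.0)]

# Fresh payout addresses supplied by the same users, shuffled by the mixer.
payout_addresses = ["addr_x", "addr_y", "addr_z"]
random.shuffle(payout_addresses)

# The pool pays equal amounts to shuffled addresses: a ledger observer can see
# every deposit and every payout, but not which deposit funded which payout.
for (sender, amount), out in zip(deposits, payout_addresses):
    print(f"pool received {amount} from {sender}; pool paid {amount} to {out}")
```

Equal denominations matter here: if the amounts differed, an observer could match deposits to payouts by value, which is also why shuffling repeatedly with small amounts works better.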

Under normal circumstances, shuffling multiple times, with a small number of coins each time, works better.

Conclusion:

Coin shuffle is a privacy protection function, and coin shuffle transactions are difficult to track. The funds are mixed with the funds of other users, and a random relationship is created between the existing user's account system and the new account after coin shuffle. This mechanism can realize the anonymity of transactions and the anonymity of all services.

ANTUSDT platform advantages

According to market research, all coin mixers on the market are open to merchants, and ANTUSDT is the only coin mixer that is open to merchants and retail investors.

ANTUSDT's business scope also includes cryptocurrency collection agency, payment agency, crypto cash lending, crypto borrowing, and multi-country, multi-platform acceptance.

How does ANTUSDT coin shuffle make a profit?

Merchants need to use the corresponding currency: retail users supply it, merchants use it, and merchants pay corresponding commissions. To ensure that the fund pool has sufficient spare assets for deposits and withdrawals, every coin shuffle requires the user's active authorization; only with that authorization can assets assist in a coin shuffle.

Cooperative merchants have paid a deposit equivalent to 25 BTC and a full liquidity deposit of more than 5 times that amount to ensure zero risk in the ANTUSDT business.

Introduction of ANTUSDT rules

Ordinary users can earn coin shuffle commissions by participating in two coin shuffle methods through a single authorization.

ANTUSDT cross-chain coin shuffle

ANTUSDT creates a large-scale coin shuffle pool, which gathers the most circulating currencies on the market.

Users who join the cross-chain coin shuffle via authorization have their assets mixed with hundreds of thousands of cryptocurrencies around the world. Through countless asset interactions, the traces of participating users' funds can be completely concealed, achieving the effect that the ocean can bleach ink.

ANTUSDT has a large number of cooperative merchants. The coin shuffle pool can handle a large number of assets with privacy requirements 24 hours a day, and provide encrypted whereabouts and currency exchange services for assets. Every time ordinary users authorize participation in the cross-chain coin shuffle, they obtain a share of the commission earned by the coin shuffle pool.

Reciprocating acceptance coin shuffle

Users who join the reciprocating acceptance coin shuffle via authorization have their assets on standby at any time for major merchants to call, during which the assets may switch between various currencies. After a call is made, the user needs to press the "Re-exchange" button in the order on the coin shuffle details page.

Large merchants will restore their assets to the original digital currency within the specified time and pay the corresponding commission.

In the fiat currency world, this problem can be traced back to a real legal case in the 17th century. The conclusion of the case is that if users receive a banknote that was involved in a theft, and the police later establish that the banknote was stolen and traded several times before reaching their hands, the police have no right to take that banknote from them. The same is true for digital currencies: what does it matter if users receive a coin that was not stolen by them?

Media contact

Company: Army Ant Limited

Contact: Hagimoto Madoka

E-mail: vip@antusdt.com

Address: 29 CLEMENTS ROAD ILFORD LONDON UNITED KINGDOM IG1 1BH

YouTube: https://youtu.be/alffT8t2oOs

Telegram: @antusdt001

Website: https://www.antusdt.com/

SOURCE: Army Ant Limited

A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/164aa13b-3e4b-4565-a795-88d7050efdcd


A natural process may hold the clue to secure encryption of communications – Research Matters

Solid particles suspended in a fluid are bombarded on all sides by particles of the fluid. They change their direction of motion haphazardly, and the distances they travel between one collision and the next are not predictable. Hence, their motion is said to be random. Discovered by the botanist Robert Brown in 1827 and named after him, Brownian motion was theoretically explained by Albert Einstein in 1905. According to the theory, it is impossible to trace the motion of individual particles of the solid, which introduces inherent randomness to the particles' motion.

Generating a truly random sequence of numbers holds the key to secure encryption of data. However, the task remains a challenge. Although computer scientists rely on several numerical techniques to generate random numbers, these methods always introduce a small degree of predictability into the sequence. In a new study, researchers from the Indian Institute of Science Education and Research, Kolkata, have used the naturally occurring Brownian motion to generate random numbers. Funded by the Department of Science and Technology and the Ministry of Human Resource Development, Government of India, the study was published in the journal Frontiers in Physics.

The researchers first suspended plastic particles about 3 microns in diameter in a mixture of water and common salt. They tracked the individual particles using a laser focused on them, at a rate of more than a thousand readings per second. The laser they used does not affect the Brownian motion of the particle it probes; it relies on the invention called optical tweezers, which was awarded the Nobel Prize in 2018. The researchers recorded the positions of the individual particles on a computer and studied their motion.

Any instrument used in the laboratory first needs calibration; that is, the parameters that determine the instrument's behaviour need to be quantified to enable reliable measurements. First, the researchers used the positions of the individual plastic particles to calibrate the optical tweezers. They applied a numerical technique called machine learning to the data they had collected. It first used a section of the data to identify the numerical parameters responsible for driving different kinds of motion of the particles. Then, it used the rest of the data to identify those parameters that accurately described the motion they had observed. The researchers demonstrated that the optical tweezers they used were indeed behaving as expected. When small anomalies arose, the researchers showed that these deviations would not affect the data they recorded for the particles' motion.

The researchers used their machine learning technique to demonstrate that the plastic particles indeed undergo random motion in the salt-water solution. They showed that when they measured the parameters relevant to the motion using their method, the results match those of traditional techniques, such as the Fourier transform, named after its discoverer Joseph Fourier. Their approach, however, requires minimal human interaction.

"Thus, we provide new insights on calibrating optical tweezers," says Raunak Dey, the lead author of the study.

The researchers then used the trajectories of the individual particles to generate a sequence of numbers. They conducted a series of 15 standard tests of the randomness of a sequence of numbers. "The tests tell us whether a string of numbers qualifies as random, and also allow us to judge the extent of randomness," shares Raunak. The researchers concluded that the numbers are indeed random, up to a high degree of confidence. They further showed that the higher the rate at which the laser observes the particles, the more certainty they have about the randomness of the sequence. "We have directly borrowed these random numbers from nature instead of relying on artificial algorithms," adds Raunak.
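To give a flavour of such tests, here is a minimal sketch (ours; the extraction scheme and data are invented, and the real study used 15 far more thorough tests) of turning displacement data into bits and applying a monobit frequency test in Python:

```python
import math
import random

# Stand-in for measured Brownian displacements; the real data would come
# from the tracked particle positions.
displacements = [random.gauss(0, 1) for _ in range(10000)]

# One simple extraction scheme: positive step -> 1, negative step -> 0.
bits = [1 if d > 0 else 0 for d in displacements]

# Monobit frequency test (NIST SP 800-22 style): map bits to +1/-1 and
# check that their normalized sum looks like a standard normal draw.
s = sum(2 * b - 1 for b in bits)
statistic = abs(s) / math.sqrt(len(bits))
p_value = math.erfc(statistic / math.sqrt(2))
print(f"p = {p_value:.3f} -> {'pass' if p_value > 0.01 else 'fail'}")
```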

The numerical technique also enabled them to independently calculate the parameters of the fluid in which the particles moved, such as how much internal friction it offers to flow. They demonstrated that these independent measurements agree with the well-known values for the fluid. This can have interesting implications: measuring fluid parameters is difficult for certain biological fluids, like living cells, or fluids that sometimes behave like a solid, like molten lava.

"It is difficult to use conventional techniques to study these special fluids, but we can now apply our technique on these systems," Raunak signs off.

This article has been run past the researchers, whose work is covered, to ensure accuracy.


What Is End-to-End Encryption, and Why Does It Matter? – How-To Geek


End-to-end encryption (E2EE) ensures that your data is encrypted (kept secret) until it reaches an intended recipient. Whether you're talking about end-to-end encrypted messaging, email, file storage, or anything else, this ensures that no one in the middle can see your private data.

In other words: If a chat app offers end-to-end encryption, for example, only you and the person you're chatting with will be able to read the contents of your messages. In this scenario, not even the company operating the chat app can see what you're saying.

First, let's start with the basics of encryption. Encryption is a way of scrambling (encrypting) data so that it can't be read by just anyone. Only the people who can unscramble (decrypt) the information can see its contents. If someone doesn't have the decryption key, they won't be able to unscramble the data and view the information.

(This is how it's supposed to work, of course. Some encryption systems have security flaws and other weaknesses.)
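As a minimal sketch of the idea in Python, using the cryptography library's Fernet recipe (a symmetric scheme: whoever holds the key can decrypt, and nobody else):

```python
# pip install cryptography
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()   # the secret that unscrambles everything
f = Fernet(key)

token = f.encrypt(b"my banking password")   # scrambled, unreadable without the key
print(f.decrypt(token))                     # b'my banking password'

# With the wrong key, decryption fails outright rather than leaking anything.
try:
    Fernet(Fernet.generate_key()).decrypt(token)
except InvalidToken:
    print("wrong key: cannot unscramble")
```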

Your devices are using various forms of encryption all the time. For example, when you access your online banking website (or any website using HTTPS, which is most websites these days), the communications between you and that website are encrypted so that your network operator, internet service provider, and anyone else snooping on your traffic can't see your banking password and financial details.

Wi-Fi uses encryption, too. That's why your neighbors can't see everything you're doing on your Wi-Fi network (assuming that you use a modern Wi-Fi security standard that hasn't been cracked, anyway).

Encryption is also used to secure your data. Modern devices like iPhones, Android phones, iPads, Macs, Chromebooks, and Linux systems (but not all Windows PCs) store their data on your local devices in encrypted form. It's decrypted after you sign in with your PIN or password.


So encryption is everywhere, and that's great. But when you're talking about communicating privately or storing data securely, the question is: Who holds the keys?

For example, let's think about your Google account. Is your Google data (your Gmail emails, Google Calendar events, Google Drive files, search history, and other data) secured with encryption?

Well, yes. In some ways.

Google uses encryption to secure data in transit. When you access your Gmail account, for example, Google connects via secure HTTPS. This ensures that no one else can snoop on the communication going on between your device and Google's servers. Your internet service provider, network operator, people within range of your Wi-Fi network, and any other devices between you and Google's servers can't see the contents of your emails or intercept your Google account password.

Google also uses encryption to secure data at rest. Before the data is saved to disk on Google's servers, it is encrypted. Even if someone pulls off a heist, sneaking into Google's data center and stealing some hard drives, they wouldn't be able to read the data on those drives.

Both encryption in transit and at rest are important, of course. They're good for security and privacy. It's much better than sending and storing the data unencrypted!

But here's the question: Who holds the key that can decrypt this data? The answer is Google. Google holds the keys.

Since Google holds the keys, this means that Google is capable of seeing your data: emails, documents, files, calendar events, and everything else.

If a rogue Google employee wanted to snoop on your data (and yes, it's happened), encryption wouldn't stop them.

If a hacker somehow compromised Google's systems and private keys (admittedly a tall order), they would be able to read everyone's data.

If Google was required to turn over data to a government, Google would be able to access your data and hand it over.

Other systems may protect your data, of course. Google says that it has implemented better protections against rogue engineers accessing data. Google is clearly very serious about keeping its systems secure from hackers. Google has even been pushing back on data requests in Hong Kong, for example.

So yes, those systems may protect your data. But that's not encryption protecting your data from Google. It's just Google's policies protecting your data.

Don't get the impression that this is all about Google. It's not, not at all. Even Apple, so beloved for its privacy stances, does not end-to-end encrypt iCloud backups. In other words: Apple keeps keys that it can use to decrypt everything you upload in an iCloud backup.

Now, let's talk chat apps. For example: Facebook Messenger. When you contact someone on Facebook Messenger, the messages are encrypted in transit between you and Facebook, and between Facebook and the other person. The stored message log is encrypted at rest by Facebook before it's stored on Facebook's servers.

But Facebook has a key. Facebook itself can see the contents of your messages.

The solution is end-to-end encryption. With end-to-end encryption, the provider in the middle (whoever you replace Google or Facebook with, in these examples) will not be able to see the contents of your messages. They do not hold a key that unlocks your private data. Only you and the person you're communicating with hold the key to access that data.

Your messages are truly private, and only you and the people you're talking to can see them, not the company in the middle.
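Here is a minimal sketch of that property in Python with the PyNaCl library's public-key Box (the key exchange is simplified; real messengers such as Signal layer ratcheting and more on top):

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The service in the middle relays ciphertext only; it holds no key to read it.
# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))   # b'meet at noon'
```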

End-to-end encryption offers much more privacy. For example, when you have a conversation over an end-to-end encrypted chat service like Signal, you know that only you and the person you're talking to can view the contents of your communications.

However, when you have a conversation over a messaging app that isn't end-to-end encrypted (like Facebook Messenger), you know that the company sitting in the middle of the conversation can see the contents of your communications.

It's not just about chat apps. For example, email can be end-to-end encrypted, but it requires configuring PGP encryption or using a service with that built in, like ProtonMail. Very few people use end-to-end encrypted email.

End-to-end encryption gives you confidence when communicating about and storing sensitive information, whether it's financial details, medical conditions, business documents, legal proceedings, or just intimate personal conversations you don't want anyone else having access to.

End-to-end encryption was traditionally a term used to describe secure communications between different people. However, the term is also commonly applied to other services where only you hold the key that can decrypt your data.

For example, password managers like 1Password, BitWarden, LastPass, and Dashlane are end-to-end encrypted. The company can't rummage through your password vault; your passwords are secured with a secret only you know.

In a sense, this is arguably end-to-end encryption, except that you're on both ends. No one else (not even the company that makes the password manager) holds a key that lets them decrypt your private data. You can use the password manager without giving the password manager company's employees access to all your online banking passwords.

Another good example: If a file storage service is end-to-end encrypted, that means that the file storage provider can't see the contents of your files. If you want to store or sync sensitive files with a cloud service (for example, tax returns that have your social security number and other sensitive details), encrypted file storage services are a more secure way to do that than just dumping them in a traditional cloud storage service like Dropbox, Google Drive, or Microsoft OneDrive.

There's one big downside with end-to-end encryption for the average person: If you lose your decryption key, you lose access to your data. Some services may offer recovery keys that you can store, but if you forget your password and lose those recovery keys, you can no longer decrypt your data.

That's one big reason that companies like Apple, for example, might not want to end-to-end encrypt iCloud backups. Since Apple holds the encryption key, it can let you reset your password and give you access to your data again. This is a consequence of the fact that Apple holds the encryption key and can, from a technical perspective, do whatever it likes with your data. If Apple didn't hold the encryption key for you, you wouldn't be able to recover your data.

Imagine if, every time someone forgot a password to one of their accounts, their data in that account were wiped out and became inaccessible. Forget your Gmail password? Google would have to erase all your Gmails to give you your account back. That's what would happen if end-to-end encryption were used everywhere.

Here are some basic communication services that offer end-to-end encryption. This isn't an exhaustive list; it's just a short introduction.

For chat apps, Signal offers end-to-end encryption for everyone by default. Apple iMessage offers end-to-end encryption, but Apple gets a copy of your messages with the default iCloud backup settings. WhatsApp says that every conversation is end-to-end encrypted, but it does share a lot of data with Facebook. Some other apps offer end-to-end encryption as an optional feature that you have to enable manually, including Telegram and Facebook Messenger.

For end-to-end encrypted email, you can use PGP; however, it's complicated to set up. Thunderbird now has integrated PGP support. There are encrypted email services like ProtonMail and Tutanota that store your emails on their servers with encryption and make it possible to more easily send encrypted emails. For example, if one ProtonMail user emails another ProtonMail user, the message is automatically sent encrypted so that no one else can see its contents. However, if a ProtonMail user emails someone using a different service, they'll need to set up PGP to use encryption. (Note that encrypted email doesn't encrypt everything: While the message body is encrypted, for example, subject lines aren't.)


End-to-end encryption is important. If you're going to have a private conversation or send sensitive information, don't you want to make sure that only you and the person you're talking to can see your messages?


The new avatar of the encryption wars – Hindustan Times

The government has proposed a new bill to regulate mathematics. The bill envisages that certain mathematical operations, such as multiplication, division, LCM and GCD, would be banned if they involve prime numbers with more than 309 digits (roughly 1,024 bits, the size of each prime in a 2048-bit RSA key), and envisages a licensing regime that would only allow licensed entities to perform these operations.

If the above reads like a parody, it may soon cease to be and become reality.

Then Australian Prime Minister Malcolm Turnbull declared in 2017 that "the laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia."

In a joint communique issued on October 11, 2020, the Five Eyes nations (United States, United Kingdom, Australia, New Zealand, Canada), along with Japan and India, stated that "particular implementations of encryption technology... pose significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children," and called upon technology companies to "enable law enforcement access to content in a readable and usable format where an authorisation is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight."

The specific implementation of encryption technology that has worried governments the world over is the Signal protocol for end-to-end encryption (E2EE), which guarantees that even the intermediaries who provide these services cannot decrypt messages in transit. It also guarantees plausible deniability: if someone receives an encrypted message from you, they can be absolutely sure you sent it (rather than it having been forged by some third party), but they can't prove to anyone else that it was a message you wrote.

A variation of these anxieties played out in India in the WhatsApp traceability debate, where the government pushed for traceability ("tell me who the sender is") while also saying that it does not want to break end-to-end encryption, an impossible request, as sender deniability is at the heart of end-to-end encryption. After being repeatedly rebuffed by WhatsApp, an attempt was made to resolve the matter through the judicial system by compelling intermediaries (WhatsApp) to stop deploying messaging systems that use E2EE.

Given this background, the use of children in the statement to build a case for banning E2EE is interesting, because it uses a propaganda technique called pedophrasty, in which children are invoked to prop up an argument, make its opponents look like unprincipled savages, and make everyone else suspend all rational and critical thinking and agree to the argument.

But we must not agree to this dangerous set of proposals, as they are a continuation of the encryption wars, which started in the 1970s, when Western governments tried to limit the use of encryption technologies through export controls and ultimately failed.

In the 1990s, the National Security Agency in the US proposed the use of the Clipper Chip in every phone, which implemented encryption but gave the US government backdoor access. After Matt Blaze showed how rogue applications could use the chip's encryption while bypassing the government backdoor, this attempt was abandoned.

In 2010, Google published a blog post detailing how Chinese state-backed hackers attacked Gmail to spy on Chinese human rights advocates via a backdoor that Google had installed in Gmail at the behest of the US government to comply with search warrants on users. When Ericsson put backdoors into Vodafone products deployed in Greece to aid law enforcement, these backdoors were used to spy on the Greek prime minister by unknown perpetrators, who were never found.

All these incidents point to two fundamental realities. The first is that backdoors are always dual-use and can be used by anyone; hence, they don't keep anyone safe. The second is that E2EE is safe and easy enough for anyone to use, and hence has achieved mainstream adoption. This has made the approach traditionally preferred by law enforcement agencies, coercing intermediaries into installing backdoors, irrelevant and obsolete.

Outlawing E2EE deployment and forcing intermediaries to comply with these proposed rules, or leave the country under threat of having their business operations shut down, may hence become the preferred policy response. But these rules, even if they become the law everywhere, are doomed to fail, in the same way the discovery of irrational numbers (the square root of 2) could not be suppressed by drowning its discoverer Hippasus in the sea: it takes only a rented computer at ₹700 a month to run a back-end service implementing E2EE.

If existing intermediaries are forced to abandon it, others like EncroChat (popular among drug cartels) will step in and fill the void. The busting of EncroChat, when law enforcement agencies successfully penetrated the drug cartels by planting a tool in its servers, also indicates that it is possible to work around E2EE in some cases by using offensive technical measures to compromise endpoints. That would also be a far more proportionate measure than attempting to ban mathematical equations.

Anand Venkatanarayanan researches disinformation, cyber weapons and data security and is a privacy advocate

The views expressed are personal
