
Trends and Developments in Data Handling 2022 – LCGC Chromatography Online

A snapshot of key trends and developments in data handling according to selected panellists from the chromatography sector.

Q. What is currently the biggest problem in data management for chromatographers?

Christoph Nickel: Currently one of the major challenges is the increasing number of samples to run, analyses to conduct, and data to review while keeping data quality high and detecting any potential error. A major driver for this is the increasingly complex separation and detection techniques required to analyze biotherapeutics; as a result, the chromatographer increasingly needs to use mass spectrometry (MS). Furthermore, consolidating all this information into an easily viewable and sharable format at a central location is a massive challenge. This is particularly important for information that is required to make an informed final review and approval. A typical example is the weighing results for calibration standards, generated on a balance, which should be connected to the calibration data in the chromatography data system (CDS) to confirm proper calibration and, ultimately, accurate quantitation of the unknown compounds.

Ofrit Pinco: One of the biggest challenges for chromatographers is that data from different vendors cannot be combined and analyzed collectively because there is no unified data format. Chromatographers can only review data from one system at a time and answer specific questions. This makes it harder to access data and conduct secondary analysis across multiple data systems. To address this challenge, several pharmaceutical companies have sponsored the Allotrope Foundation (1), whose initiative is to unify data formats. In addition, some startups are building tools to translate data into a common format. However, both initiatives will take time and collaboration to overcome this challenge.

Anne Marie Smith: Chromatographers use a variety of instruments from various vendors, each with its own proprietary data format. One big problem area is bringing together and managing the data from these different electronic systems. The ability to normalize all that disparate data while retaining the ability to interrogate it, as in the native data processing software, is very beneficial to chromatographers. Since chromatography data are so ubiquitous, effective management in a central, accessible place is essential.

Björn-Thoralf Erxleben: Handling large quantities of data requires a lot of time for data processing and interpretation. Additionally, depending on the local situation, secure data storage and archiving can be time-consuming, and administration of these processes gets more and more complex.

Although there are network-based, multi-instrument-capable CDSs, all vendors support and maintain their own proprietary data formats first; data file formats for photodiode array detectors (PDA) and for MS instruments are closed. Even when drivers are provided for other CDS systems, several requests and wishes remain unsatisfied. Hardware-wise, hybrid configurations may involve different operation workflows, and parameters cannot easily be transferred between vendors. Direct comparison of data between different instruments can be difficult.
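As a rough editorial illustration of the unified-format problem the panellists describe, the Python sketch below pulls peak tables from two hypothetical vendor export layouts into one queryable structure; every column name, unit, and value here is an invented assumption, not any vendor's or Allotrope's actual format.

```python
# Illustrative only: two hypothetical vendor peak-table layouts pulled into
# one common structure. Column names, units, and values are invented for the
# sketch and are not any vendor's (or Allotrope's) real schema.
import pandas as pd

# Hypothetical vendor A export: retention time already in minutes.
vendor_a = pd.DataFrame({
    "Sample": ["STD-1", "STD-1"],
    "RT (min)": [3.21, 5.87],
    "Area": [15234.0, 9822.0],
})

# Hypothetical vendor B export: retention time in seconds.
vendor_b = pd.DataFrame({
    "sample_name": ["STD-1", "STD-1"],
    "rt_sec": [192.4, 351.9],
    "peak_area": [15110.0, 9903.0],
})

def normalize_a(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "sample_id": df["Sample"],
        "rt_min": df["RT (min)"],
        "area": df["Area"],
        "vendor": "A",
    })

def normalize_b(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "sample_id": df["sample_name"],
        "rt_min": df["rt_sec"] / 60.0,  # unify units: seconds -> minutes
        "area": df["peak_area"],
        "vendor": "B",
    })

# One table that can be interrogated across systems, for example comparing
# retention-time agreement between instruments for the same standard.
combined = pd.concat([normalize_a(vendor_a), normalize_b(vendor_b)], ignore_index=True)
print(combined.groupby("sample_id")["rt_min"].agg(["min", "max"]))
```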

Q. What is the future of data handling solutions for chromatographers?

Christoph Nickel: I see three main trends: first, a radically streamlined and simplified user experience with more fit-for-purpose applications; second, an agglomeration of data from different sources in a single central repository in a consolidated format, often referred to as a data lake. This will reduce the time for data review and analysis because it eliminates manual data transfers and manual consolidation of spreadsheets or PDF files. Third, more and more automation of routine tasks using machine learning (ML) for routine reviews and algorithm-assisted data mining to identify patterns, trends, outliers, or deviations.

In addition, data will continue to become available anywhere, anytime, so there will be no further need to be in the laboratory or at the instrument to analyze your data, and no need for installation and maintaining software applications on your device. Everything will be available online.

Ofrit Pinco: The future of data handling goes well beyond acquiring and analyzing data generated by a single chromatography system. As new tools and solutions are being developed, and as researchers are being expected to extract more information from their samples, chromatographers will need to access and analyze data from multiple instruments and data systems simultaneously. Right now, chromatographers have multiple tools to help them focus on multiple areas, but future tools will allow them to review information from the whole workflow in one space. This has potential to enable researchers to answer more questions. This will also be valuable as requirements and regulations for compliance become stricter. New tools will also give research teams insight into historical instrument performance data, leading to increased operational efficiency and even predictive maintenance. Data handling will only continue to become more streamlined and more advanced through the utilization of these types of tools combined with artificial intelligence (AI) and ML. These are the next steps needed to reach the lab of the future and Industry 4.0.

Anne Marie Smith: The cloud is the future of data handling. All systems will connect to the cloud. It's secure, simplifies the infrastructure (thereby reducing costs), provides better performance, and is scalable. Depending on the system you choose, it can also be future-proof. It is important, however, that systems architects take into account scientists' data access requirements. Whether the data need to be accessed immediately upon generation or at a later date should inform how data management solutions are architected to ensure a seamless transition to cloud-based data access.

Björn-Thoralf Erxleben: We already see cloud-based data storage options at several CDS installations, and this trend will continue because it renders data access and sharing far easier. At the same time, this will require a new level of data security and data protection. A positive aspect is that data storage and archiving are outsourced and will not bind IT resources on-site.

AI software will be implemented in standard software, for peak picking, processing, and identification using database packages. Self-learning algorithms will support method optimization and provide an estimation of retention time, based on structural information of the analyte.
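As a purely illustrative aside on the retention-time estimation idea above, the sketch below fits a small self-learning model on structural descriptors of known analytes and asks it to estimate the retention time of a new one; the descriptors, values, and model choice are assumptions made for the example.

```python
# Illustrative only: a self-learning model fit on structural descriptors of
# known analytes, then asked to estimate the retention time of a new one.
# Descriptors, values, and the model choice are assumptions for the sketch.
from sklearn.ensemble import RandomForestRegressor

# Each row: [molecular weight, logP, hydrogen-bond donors] for a known analyte.
X_train = [
    [180.2, 1.2, 2],
    [256.3, 3.4, 1],
    [94.1, 1.5, 1],
    [302.5, 4.1, 0],
]
y_train = [3.2, 7.8, 2.1, 9.4]  # observed retention times in minutes

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Estimate the retention time of an unseen analyte from its descriptors alone.
print(model.predict([[210.0, 2.0, 1]]))
```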

Developing and maintaining special programs and databases for research use is a time- and resource-intensive task. If such a standard is accepted and used in the industry, instrument vendors will have to provide data compatible with these programs. There might also be agreements on new standard data formats, which will be used directly or supported via conversion.

Last, but not least: it would be nice to see workflows and parameter definitions harmonized between vendors, and data processing, at least for two-dimensional (2D) data, become a common piece of software accessible via the web, to be used by all chromatographers after logging on to a dedicated cloud.

Q. What one recent development in Big Data is most important for chromatographers from a practical perspective?

Christoph Nickel: While it might sound boring, the biggest impact on the analytical laboratory is the ability to bring data together from all instruments and all devices working on the same function, the same research, or the same discipline. The availability of data in one location is a mandatory prerequisite for every analysis, insight, or application of algorithms. So, any effort that chromatographers can make to bring their data together brings them a major step closer to fast, efficient, and error-free analysis, moving from reactive review or error handling to proactive problem prevention. This can be realized through the availability of unlimited computing power in the cloud, which is becoming more mainstream for deployment of globally connected systems.

Ofrit Pinco: AI and ML have been growing rapidly in the last few years, and people are realizing more of their advantages on a daily basis. Take search engines, for example: Google has drastically changed the way we search for answers, plan our travels, and consume news. As AI and ML technologies mature, more scientists with this skill set will enter the chromatography field and apply these technologies to the laboratory.

In the current state, chromatographers analyze data based on specific questions, with the aim of confirming predefined hypotheses. Through AI and ML, chromatographers may be able to uncover new trends and patterns from a large set of unstructured data, giving them insights they didn't know existed. This will greatly facilitate the progress of scientific research in the long run.

Anne Marie Smith: AI and ML can help find relationships and insights in an otherwise overwhelming amount of data, providing potential predicted outcomes. While AI and ML can drastically improve processes, they are only as good as the data that go into them. For instance, for chromatographers, where there is a multitude of possible instrument combinations, poor-quality or incomplete data collection may bias the results.

Björn-Thoralf Erxleben: Analytical intelligence features such as auto-recovery, start-up, self-check, and feedback. Apart from additional automation, these enable quick and easy hardware diagnostics and help to decrease downtime of the systems. By applying more process analytical technology (PAT) features and more feedback from the system to the central server, chromatographers can focus on their work and need to worry less about the hardware.

Q. What obstacles do you think stand in the way of chromatographers adopting new data solutions?

Christoph Nickel: One of the greatest challenges is the need to comply with good manufacturing practice (GMP) and data integrity guidelines. The validation guidelines were drafted for on-premises deployment of software, and laboratories now need to transform their validation principles for a more decentralized, globally connected world with often abstracted storage. In simple terms: the demands of proving that your data integrity is maintained now require you to include at least one additional player, the host of your data. This increases the complexity of your validation tasks and requires a change in how validation is thought about and conducted.

Another significant obstacle is the potential delay in data access, driven by the need to transfer the data from the laboratory to the central location or entity and access it there. While internet and cloud performance are fast enough to provide a positive user experience, the in-house infrastructure is often the rate-limiting step. For example, a single low-performance element in your local area network, such as an old 10 Mbit/s switch, can slow down your entire data traffic by a factor of 10. Suitability of the infrastructure is a critical prerequisite for transferring data into the central repositories, and this increases dependency on your IT infrastructure.

Ofrit Pinco: A few factors contribute to this slow adoption. First is the complex laboratory ecosystem. Due to the interconnectedness of systems and solutions, any change must be evaluated for its impact on all components within the ecosystem. Also, downtime needs to be minimized, as many laboratories are in production and operate on a 24/7 schedule. After implementation, regulated labs require validation for the change. Additional training is also required for technicians to adopt new standard operating procedures (SOPs) and avoid errors. As a result, adopting new solutions is difficult and time-consuming.

Anne Marie Smith: Adopting new data solutions is a daunting task. It involves time to set up the system in a useful way, time for validation and implementation (ensuring the system meets data integrity requirements and that the data are secure), and time to learn the new system. These factors often lead to reluctance to change, which can stand in the way of adopting useful solutions.

Björn-Thoralf Erxleben: Changing an established workflow is a critical matter for analytical laboratories, and operators do not always come with a strong analytical background and experience. New user interfaces, new operation workflows, and, in the worst cases, new definitions for known parameters in the software mean a lot of training for users before a new solution is finally adopted. The risk of longer downtime is high. Right now, we are confronted with objections to installing the service packs or patches necessary for compatibility with modern operating systems and virus protection.

New features and functionality need to prove their advantage before new software is rolled out and established. Another aspect is data comparison and transfer: what happens to the old data? Legislation requires that old data and results be kept and provided for inspection if needed; is maintaining a piece of the old software a good solution, especially when it means some knowledge of how to operate it needs to remain available?

Q. What was the biggest accomplishment or news in 2021/2022 for data handling?

Christoph Nickel: The adoption of the cloud with unlimited storage, computing power that enables data agglomeration, and new levels of advanced and super-fast analysis of data.

Ofrit Pinco: In the past two years, more data scientists have entered and brought changes to the analytical industry. Data scientists are skilled at analyzing and extracting insights from structured and unstructured data by using scientific methods, algorithms, and systems. They can be good complementary partners to application scientists, who have backgrounds in chemistry and understand the cases and workflows in the laboratory. Together with application scientists, data scientists can utilize models and algorithms to analyze and visualize historical data and let application scientists relate new findings to workflows and experiments.

In addition to scientific findings, data scientists may also improve laboratory operation efficiency by evaluating instrument performance and data management metrics. Data scientists may provide new perspectives on how laboratories can better store, organize, and manage data.

Anne Marie Smith: Streaming live data as it's acquired locally and storing it in a cloud instance of a CDS has improved IT systems. With the recent developments in Big Data, this simplifies data movement for downstream data analytics.

Reference

(1) Allotrope Foundation, https://www.allotrope.org

Christoph Nickel is the Director of Software Marketing at Thermo Fisher Scientific.

Ofrit Pinco is Senior Product Manager at Agilent Technologies.

Anne Marie Smith is Product Manager, Mass Spectrometry & Chromatography at ACD/Labs.

Björn-Thoralf Erxleben is Senior Manager at Shimadzu Europa in charge of the pharmaceutical and biopharmaceutical market.


Process mining: Digital transformation lynchpin in banking & finance – ERP Today

Technology is now central to the operations of almost all modern businesses, and this is especially true for organisations in the finance sector. But as the sector embraces digital transformation, are business leaders overlooking a key way to ensure they are delivering the experiences that customers need, while also trimming their own costs?

Spurred by consumer demand, businesses in the banking and finance industries are leveraging technologies from cloud to artificial intelligence to improve customer experience, drive efficiencies and win new business. This shift has advanced to an extent few would have predicted, with 85 percent of CEOs having accelerated digital initiatives despite the challenges of the pandemic, according to research by Deloitte.

These businesses, though, first need to understand what to transform, and whether the strides they're making are actually working; this is where process mining offers an opportunity. Process mining helps businesses understand their operations in a way that was previously impossible, giving them insights about how their business processes actually run, rather than how they think they run, so they can truly succeed in digitally transforming.

Failing to make the leap

For some time now, companies in the banking and finance industry have been trying to keep up with the latest technologies, at the risk of being overtaken by digital-native rivals if they fall behind. Faced with start-ups and challenger banks powered by technologies such as machine learning, incumbents cannot afford to delay any longer.

While the pandemic accelerated a shift towards embracing digitisation, hurdles still remain for businesses in 2022 according to research by The Hackett Group. The study finds that business leaders fail to engage with digital projects due to fears over inflation, skills gaps and productivity.

Companies must act quickly to embrace digital transformation. Acting like an X-ray machine for processes within a business, process mining looks for the points where processes become stuck, leading to delays or costly manual interventions.

Process mining delivers the data and insights leaders need to eliminate costly delays and hold-ups within their business operations, and it underpins successful digital transformation. Being able to find the business process gaps that affect customers and their needs is the first step to making businesses efficient and resilient enough to meet today's supply chain and business disruption as well as inflation challenges.

There is still learning to be done though, and many business leaders are unaware of the full capabilities of process mining and execution management and their potential to increase business efficiency. This is one of the reasons that some banking and financial services providers are losing out to competitors.

How process mining can drive success in banking and finance

Process mining can speed up and improve efficiency across the whole value chain in the banking and finance sector. It works by using data generated from business processes, which are recorded in event logs by business software. Event logs are saved in scenarios like when a customer makes a request, a complaint is processed or a new credit application is made. Each stage of the process generates its own event log which the software can analyse. As a result, process mining can be applied to just about any process within banking and insurance businesses.
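As a rough sketch of that idea, the snippet below walks a hypothetical event log (case ID, activity, timestamp) and measures the average wait between consecutive activities; the slowest transitions point to the likely bottlenecks. The log schema and data are invented for illustration, not any product's actual format.

```python
# Illustrative only: a tiny event log (case ID, activity, timestamp) and the
# average wait between consecutive activities per case. The schema and data
# are invented for the sketch, not any product's real log format.
import pandas as pd

events = pd.DataFrame([
    {"case_id": "C1", "activity": "application_received", "timestamp": "2022-10-01 09:00"},
    {"case_id": "C1", "activity": "credit_check",         "timestamp": "2022-10-01 09:05"},
    {"case_id": "C1", "activity": "manual_review",        "timestamp": "2022-10-03 14:00"},
    {"case_id": "C1", "activity": "approved",             "timestamp": "2022-10-03 15:00"},
    {"case_id": "C2", "activity": "application_received", "timestamp": "2022-10-01 10:00"},
    {"case_id": "C2", "activity": "credit_check",         "timestamp": "2022-10-01 10:02"},
    {"case_id": "C2", "activity": "approved",             "timestamp": "2022-10-01 10:30"},
])
events["timestamp"] = pd.to_datetime(events["timestamp"])
events = events.sort_values(["case_id", "timestamp"])

# Hand-off to the next activity within each case, and how long it took.
events["next_activity"] = events.groupby("case_id")["activity"].shift(-1)
events["wait"] = events.groupby("case_id")["timestamp"].shift(-1) - events["timestamp"]

# Average wait per transition: the slowest transitions are the likely bottlenecks.
bottlenecks = (events.dropna(subset=["next_activity"])
               .groupby(["activity", "next_activity"])["wait"]
               .mean()
               .sort_values(ascending=False))
print(bottlenecks)
```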

Mining helps understand how transformation is actually working, by measuring business impact

Process mining works across all stages of each process. This means leaders are not restricted to streamlining one part of the organisation, but are able to take a holistic overview and see possible starting points for a comprehensive digital transformation spanning multiple company divisions.

The technology digs into standard business data, making it transparent and revealing the hidden inefficiencies where manual processes are slowing the organisation down. It is built to iron out the problems that consume time, create inefficiency and waste businesses money. As businesses adopt new digital processes, process mining can pinpoint problems as they happen.

Why execution management matters

Execution management works alongside process mining to drive intelligent fixes that improve business performance, taking the data and intelligence delivered by process mining and translating it into action. Once inefficiencies have been identified, execution management presents next-best-action recommendations to eliminate them. Business leaders can see where these inefficiencies lie and speed up and automate processes where necessary. This translates into satisfied customers, happier employees, and increased profits, plus reduced rework, better conversions, and a more efficient workforce.

Process mining and execution management help banks with everything from regulatory reporting to payment processing, identifying pain points where needless manual interventions are present and enabling managers to automate these.

For example, process mining can highlight where related cases are being directed to multiple agents, rather than being grouped together and directed to a single team or agent. Such interventions can significantly reduce overall case volume, freeing up employee time, and cutting costs. Even in high-volume products such as foreign exchange trading, process mining can automatically and continuously map processes, delivering insights on where failures are occurring.

Banks using process mining and execution management have reported being able to process credit applications up to four times faster and reduce the lead time in the retail process by six months.

As businesses transform, process mining helps leaders understand how their transformation is actually working too, by measuring the impact on the business. This means that at every stage, the technology can weed out the time-consuming manual steps and speed institutions on the path towards seamless automation and greater efficiency.

It can also help to deliver improved consumer-grade customer services that people now expect in the workplace. After two years of experiencing digital-first services, consumers now demand reliable service and rapid problem resolution, according to a McKinsey report from 2020.

Providing good customer experience is a business imperative. Process mining and execution management allow businesses to continuously evaluate in real time how customers respond, weed out inefficiencies, and take action in customer service. For businesses that must digitally transform, data, insight, and actions are invaluable before, during, and after the transformation process.

Delivering a frictionless future

The banking and financial services industries must leverage data and insights generated by processes to reduce friction so that their businesses can perform at maximum efficiency levels.

Process mining and execution management offer key data, insights and action advantages while reducing risks and quickly providing value in days and weeks rather than months.

Process mining empowers business leaders with process visibility and insights, and execution management delivers the actions they must take to deliver truly frictionless digital transformation initiatives. It's the key to making banking and financial services businesses perform at levels they never thought possible.

Nick Mitchell is vice president and country manager UK&I at Celonis


Why cobalt mining has resumed in the U.S. after 30 years – Automotive News

Booming demand for batteries powering the world's shift into electric vehicles is rekindling U.S. cobalt production after at least a 30-year hiatus.

Australia-based Jervois Global Ltd. is starting the first U.S. cobalt mine in Idaho on Friday, according to CEO Bryce Crocker. The mineral "sits at the top of the table in terms of national security," said Crocker.

"There aren't many new sources of supply, particularly in stable jurisdictions, which is why this mine in the US is very important," he said. Cobalt hasn't been produced in the US since at least 1994, according to data from the United States Geological Survey.

Cobalt is a crucial component in EV batteries and is on the U.S. government's critical-minerals list. The U.S. sees widespread adoption of EVs as key to its efforts to combat climate change.

Both California and New York have passed laws that will ban the sale of new gasoline-powered vehicles in the coming decades. As automakers gear up to achieve ambitious electrification goals, they are causing a shortage of materials needed in batteries and sparking a global rush to secure those supplies.

"The level of urgency among manufacturers to secure supplies is profoundly different than it was even two or three years ago," said Crocker in an interview. "It's now very elevated in terms of focus at the director and the board level," he said.

In July, General Motors and Ford Motor Co. stepped up efforts to lock in supplies by signing direct deals with producers of battery metals.

While more than two-thirds of the mined metal comes from the Democratic Republic of Congo, there's been an increasing shift among manufacturers to source cobalt from outside the African nation due to allegations of corruption, human rights abuses and the use of child labor there.

The passage of the Inflation Reduction Act also provides incentives for battery materials sourced in the U.S. EVs can qualify for a $7,500 tax credit under President Joe Biden's climate and tax bill, as long as their batteries contain minerals extracted from or processed in a country with a free trade agreement with the U.S., and provided part of the components are made or assembled in North America.

The Idaho mine is expected to produce 2,000 tons of mined cobalt a year, according to Crocker. The concentrated cobalt will then be exported and converted into refined products outside the U.S. before ultimately being brought back into the U.S. to serve customers, he added.

Jervois owns a nickel and cobalt refinery in Brazil and is talking to third parties in countries such as Canada and Australia to convert the mined material. About 80 percent of global refining is concentrated in China, but capacity is growing elsewhere, including at Finland's giant Kokkola refinery, which is owned by Jervois.

Cobalt demand will grow from 127,500 tons in 2022 to 156,000 tons in 2030, even as major automakers shift to iron-based batteries, according to BloombergNEF.


How Will the Metaverse Affect Cloud Security? – Trend Micro

An immersive digital world enabled by a range of technologies, including the internet of things (IoT), blockchain, and virtual and augmented reality, the metaverse allows us to see and interact with objects and people. This virtual environment is enhanced by photorealistic avatars that can reproduce your real body through wearable sensors that measure your movements and immersive smart glasses that enable virtual and augmented reality. With these technologies, what you do in the real world controls your experience in the virtual world and vice versa.

Supporting a virtual universe requires vast computing and storage resources. These resources are readily available in the cloud. This predicted uptake of cloud services should lead to cloud technologies purpose-built to serve the needs of the metaverse.

As the cloud forms the basis of the metaverse, in what ways will the metaverse affect cloud security?

Top Metaverse security concerns

For a virtual world to operate like the physical world, it must sustain continuous online availability with real-time feedback and continuous operation. High-scale interactions are supported by high-speed information transmission and computing systems. The ideal compute infrastructure for the metaverse supports low latency and big data flows.

Technologies such as cloud computing, 5G, IoT, edge computing, and high-performance computing are ideal for supporting metaverse computing and processing requirements. Adopting these technologies in the metaverse will require more devices connected to the cloud and an increase in cloud infrastructure. Looking at this expansion from a security perspective, an increase in endpoints connected to the cloud will undoubtedly lead to an overall increase in the exposed attack surface.

For example, IoT devices are highly targeted vulnerability points for attackers. This is because they commonly combine weak security controls with portability, a recipe for infiltrating multiple networks. IoT botnets are not uncommon occurrences, and they might be replicated in the metaverse. Attackers use botnets because they allow them to automatically distribute malware, drain compute power by mining for cryptocurrency, compromise data, and crash servers through DDoS attacks.

The metaverse is tied to the blockchain, which is the primary medium for allowing the trading of digital commodities in this virtual world. Non-fungible tokens (NFTs) are unique cryptographic assets representing physical or digital items as a record on a blockchain. These often-collectible digital assets hold value in a similar way to physical possessions.

As blockchain is possibly the most popular form of payment in the metaverse world, its impact on cloud security is an area of concern. NFTs are vulnerable to security breaches that allow users to access tokens and identities as well as conduct illegal transactions. Authentication loopholes may allow an attacker to obtain illegal ownership of an NFT, or an attacker could interfere with NFT media data and metadata to manipulate transactions.

As some favor the decentralized and inexpensive nature of blockchain storage, it's up to cloud providers to take a closer look at how their enterprise infrastructure and services relate to these blockchains. The key to this consideration is enhancing the security of keys and associated blockchains.

In addition, data protection can be enhanced through access control and authentication mechanisms that promote user data privacy. In the metaverse, hash functions and asymmetric-key encryption help ensure data security. AR and VR systems in the metaverse share a large portion of data, which means cloud providers must ensure secure and seamless data sharing. Finally, blockchain features data encoding capabilities that cloud providers can leverage.
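As a minimal sketch of those two primitives, assuming Python with the third-party cryptography package and an arbitrary message and key size, a hash function produces a tamper-evident digest of a payload, while asymmetric-key encryption lets anyone holding the public key encrypt data that only the private-key holder can decrypt.

```python
# Illustrative only: the two primitives mentioned above, using the third-party
# "cryptography" package. The message and key size are arbitrary assumptions.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

message = b"avatar position update"

# Hash function: a fixed-size digest that makes tampering with the payload detectable.
digest = hashes.Hash(hashes.SHA256())
digest.update(message)
fingerprint = digest.finalize()  # 32-byte SHA-256 digest

# Asymmetric-key encryption: anyone with the public key can encrypt,
# but only the private-key holder can decrypt.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(message, oaep)
plaintext = private_key.decrypt(ciphertext, oaep)

assert plaintext == message and len(fingerprint) == 32
```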

Although the concept of compromised identities is not new to IT security, it has been largely overlooked in virtual worlds and other online environments. The rise of the internet age has made identity theft easier for attackers to execute, but digital identity theft can have far more impact when applied to the metaverse.

For instance, digital identity theft can allow bad actors to access valuable data and take control of assets stored in the metaverse. A thief could spoof your identity, hack your accounts, and take over your avatar. The impact is not only financial: cybercriminals using your digital persona have the power to either purposefully or incidentally ruin your reputation. The anonymity of the metaverse may lead attackers to feel protected and give them the confidence to expand upon such actions.

Fortunately, the metaverse will require an identity and authentication mechanism to secure digital identities. To mitigate cyber risk amongst users, identity verification systems must evolve to match the changing cyber landscape and prevent account takeover in Web3. That said, the metaverse itself offers many promising solutions for addressing digital identity theft challenges.

For instance, using virtual reality (VR) or augmented reality (AR) glasses or headsets in the metaverse opens up an opportunity to develop new authentication tools and mechanisms. VR sensors could be configured to provide cutting-edge, uniquely identifying biometric systems such as body motions, hand motions, and gestures.

The metaverse will store a wealth of user data. This includes information on how these consumers interact with the metaverse as well as personal information derived from AR, VR, and IoT devices. As these new devices become available, so do the opportunities for attackers to gain access to valuable information on individuals who may not be aware they're being tracked by their phone or an IoT device.

New types of sensors in the metaverse allow devices to collect more information than before. This includes biometric data such as fingerprints, retina scans, or voice patterns and audio recordings based on conversations, as well as data from smartwatches, which can track blood pressure, heart rate, and body temperature. This surge in the amount of sensitive personal data transmitted to and from the cloud will require metaverse vendors leveraging the cloud to look closely at how they manage, share, and store information.

Regardless of how regulators govern the handling of personal data in the metaverse, the techniques used to protect personally identifiable information in conventional cloud environments cannot be ignored. Consumers and organizations hosting metaverse entities should consider a hybrid cloud environment to improve privacy, reliability, scalability, and security.

A hybrid cloud solution adopts a separate yet connected architecture comprised of on-premises, private, and public environments. You're given the option to store sensitive data or run sensitive workloads on private servers. Encrypted APIs facilitate the security of workloads and data in transit between data centers and cloud environments. To minimize data exposure, it is recommended to host sensitive workloads in the private cloud and less sensitive workloads in the public cloud.

More connections, more challenges

The metaverse will prove to be an essential technology for a number of sectors and industries. However, much like any new technological advance, security challenges will undoubtedly arise, namely the impact of the metaverse on cloud platforms.

The metaverse will lead to a connection of more devices in the cloud, expanding the digital attack surface for organizations and individuals. The use of blockchain as the inherent medium of exchange in the virtual world also raises questions about security in the cloud. Identity theft and theft of personally identifiable information in the metaverse remain critical areas of concern.


Newly launched Cloud-shaped Internet Hosting by cdmon, the future of Hosting – Digital Journal

cdmon wants to present its new infrastructure, the most innovative Cloud in Europe, using the newest technology to provide an excellent service to its customers.

cdmon wants to provide quality innovation in a reasonable, transparent, and cordial way to its customers, and it has therefore created the fastest Cloud Hosting in Europe. cdmon has developed this new project with a changed infrastructure thanks to a platform based entirely on Intel Optane SSD and NVMe (Non-Volatile Memory express) SSD disks. This means that its Cloud is 10x faster than those based on normal SSD disks, making it the fastest and most secure Cloud in all of Europe. Only the best for its customers.

This newly developed technology gives cdmon's customers the best shot at being successful in their projects. Its team of experts, obsessed with innovation and up-and-coming technologies, is continually learning so it can consistently provide the latest technologies to customers and offer services of the highest quality. This also applies to cdmon's customer care service, the best rated in Spain, available 24/7 to help customers resolve all their doubts and get the perfect product for their project.

But even though having the fastest Cloud Hosting platform in Europe is very important, it is only a small portion of what cdmon can offer. cdmon's focus is on its customers, making it possible for them to change their lives, carry out their projects, and expand. For this reason, it wants to give its customers the best service and exceptional performance so they can make their projects soar.

cdmon invites technology enthusiasts who are interested in transforming their lives and look forward to changing the world. For this reason, it offers the best quality and security on all its products. And all its products offer so much more

All its hosting plans include benefits that you won't find anywhere else: from wildcard and multidomain SSL certificates to daily backups that any customer can restore from the Control Panel. cdmon believes in its products so much that there is no fixed-term contract: once customers see all that cdmon's hosting provides, they won't want to leave.

But who is cdmon?

It is a Spain-based company, formed in 2002, that has become the leading hosting and domain provider. Its headquarters are located in Malgrat de Mar, but the team is spread throughout the Peninsula and Europe. cdmon wants to create an open, quality Internet where everyone can fit, and it wants to do that by focusing on its customers' projects and giving them the best services. Discover all you can do with the best and fastest Cloud hosting and join the more than 200,000 projects that have relied on cdmon over the last 20 years.

Media Contact
Company Name: Cdmon
Country: Spain
Website: https://www.cdmon.com/en/


Telecom Cloud Market to Hit $103.6 Billion by 2030: Grand View Research, Inc. – Benzinga

SAN FRANCISCO, Oct. 6, 2022 /PRNewswire/ -- The global telecom cloud market size is expected to reach USD 103.6 billion by 2030, according to a new report by Grand View Research, Inc. The market is anticipated to expand at a CAGR of 19.9% from 2022 to 2030. A telecom cloud is a next-generation network architecture that integrates cloud-native technologies, network function virtualization, and software-defined networking into a distributed computing network. Orchestration and automation are essential since the computing and network resources are scattered across clouds and locations.

Key Industry Insights & Findings from the report:

Read 120-page full market research report, "Telecom Cloud Market Size, Share & Trends Analysis Report By Component (Solution, Services), By Deployment Type, By Service Model, By Application, By Enterprise Size, By Region, And Segment Forecasts, 2022 - 2030", published by Grand View Research.

Telecom Cloud Market Growth & Trends

Telco cloud refers to shifting communications service providers (CSPs) from vertically integrated, proprietary hardware-based infrastructure networks to cloud-based technologies. In the telecom business, the term is mainly used to refer to multi-cloud computing. The propelling drivers in the telecom industry are increased customer satisfaction, corporate agility, cost savings, and others. Also, the use of standard computing hardware and automation reduces CapEx and OpEx, resulting in increased adoption of the telco cloud in the telecommunication industry.

It also delivers innovative bespoke B2B solutions; for example, telcos can bring highly customized corporate products to market rapidly and affordably. The telco cloud makes it simple to collaborate with business service partners by providing access to public cloud services from any device, at any time. Additionally, it protects your consumers and profits from competitors; for instance, the telco cloud enables operators to swiftly alter business models to test new goods, services, and pricing schemes.

It also makes setting up new consumer experiences and communication channels easier. Furthermore, the lower CapEX and OPEX needs of telco cloud, better service resilience, and capacity to respond swiftly to faults and demand changes allow operators to maintain service levels and competitive pricing. These advantages result in lower client attrition.

The top trends in the telecom cloud industry are hybrid cloud hosting, Cloud Native Network Functions (CNNF), and telecom cloud collaboration. A hybrid cloud merges private and public clouds where the software and data are interoperable and portable. It allows telcos to optimize the operations with various patterns to manage workload. It improves resource allocation, optimizes infrastructure spending, provides enhanced organizational agility, and offers the ability to scale using the public cloud and controls available in the private cloud deployment.

Also, in the case of CNNF, Software-defined networking is replaced by NFV (Network Functions Virtualization), which provides more independence from proprietary servers and hardware. It provides a cloud-native architecture that combines VNFs and CNFs while adopting 5G features. This will provide maximum market coverage to telecom businesses looking to expand their services. Moreover, telecom cloud collaboration includes partnerships between hyperscalers and telcos which constitute a major cloud computing trend transforming the business.

Cloud service providers and telecom enterprises join forces to expand edge computing collaboration and 5G. Telecom cloud service providers are increasing their connectivity with the help of technology advancement to gain a competitive edge over their peers and capture a significant market share.

Telecom Cloud Market Segmentation

Grand View Research has segmented the global telecom cloud market based on component, deployment type, service model, application, enterprise size, and region:

Telecom Cloud Market - Component Outlook (Revenue, USD Billion, 2017 - 2030)

Telecom Cloud Market - Deployment Type Outlook (Revenue, USD Billion, 2017 - 2030)

Telecom Cloud Market - Service Model Outlook (Revenue, USD Billion, 2017 - 2030)

Telecom Cloud Market - Application Outlook (Revenue, USD Billion, 2017 - 2030)

Telecom Cloud Market - Enterprise Size Outlook (Revenue, USD Billion, 2017 - 2030)

Telecom Cloud Market - Regional Outlook (Revenue, USD Billion, 2017 - 2030)

List of Key Players of Telecom Cloud Market

Check out more related studies published by Grand View Research:

Browse through Grand View Research's Communications Infrastructure Industry Research Reports.

About Grand View Research

Grand View Research, a U.S.-based market research and consulting company, provides syndicated as well as customized research reports and consulting services. Registered in California and headquartered in San Francisco, the company comprises over 425 analysts and consultants, adding more than 1,200 market research reports to its vast database each year. These reports offer in-depth analysis of 46 industries across 25 major countries worldwide. With the help of an interactive market intelligence platform, Grand View Research helps Fortune 500 companies and renowned academic institutes understand the global and regional business environment and gauge the opportunities that lie ahead.

Contact:

Sherry James
Corporate Sales Specialist, USA
Grand View Research, Inc.
Phone: 1-415-349-0058
Toll Free: 1-888-202-9519
Email: sales@grandviewresearch.com
Web: https://www.grandviewresearch.com
Grand View Compass | Astra ESG Solutions
Follow Us: LinkedIn | Twitter


SOURCE Grand View Research, Inc


IoT harmony? What Matter and Thread really mean for your smart home – Ars Technica

Matter promises to make smart home devices work with any control system you want to use, securely. This marketing image also seems to promise an intriguing future involving smart mid-century modern chairs and smart statement globes.

CSA

The specification for Matter 1.0 was released on Tuesday, all 899 pages of it. More importantly, smart home manufacturers and software makers can now apply for this cross-compatibility standard, have their products certified for it, and release them. What does that mean for you, the person who actually buys and deals with this stuff?

At the moment, not much. If you have smart home devices set up, some of them might start working with Matter soon, either through firmware upgrades to devices or hubs. If you're deciding whether to buy something now, you might want to wait to see if it's slated to work with Matter. The first devices with a Matter logo on the box could appear in as little as a month. Amazon, Google, Apple, and Samsung's SmartThings division have all said they're ready to update their core products with Matter compatibility when they can.

That's how Matter will arrive, but what does Matter do? You have questions, and we've got... well, not definitive answers, but information and scenarios. This is a gigantic standards working group trying to keep things moving across both the world's largest multinational companies and esoteric manufacturers of tiny circuit boards. It's a whole thing. But we'll try to answer some self-directed questions to provide some clarity.


What is Matter? Where did it come from?

Matter is maintained by the Connectivity Standards Alliance (CSA), which was previously known as the ZigBee Alliance. ZigBee is an IEEE 802.15.4 specification for a low-power, low-data-rate mesh network that is already in use by Philips' Hue bulbs and hubs, Amazon's Echo and Eero devices, Samsung's SmartThings, Yale smart locks, and many smaller devices. It had pretty good buy-in from manufacturers, and it proved the value of mesh networking.

Starting with that foundation, the CSA somehow built up momentum to push for something people want more than an iterative networking standard: a guarantee that if they buy, or develop, a smart home device, they won't have to figure out which corporate allegiances that device can work with. The mission was to "simplify development for manufacturers and increase compatibility for consumers," the ZigBee Alliance said, and the new standard was called CHIP, or "Connected Home over IP."

That standard was renamed Matter, then delayed, more than once. Stacey Higginbotham, a reporter focused on IoT, cited the COVID-19 pandemic and the group's rapidly scaling size for its earliest delays. This week, with 550 members of the CSA involved in Matter standards development and a "fall 2022" release target arriving, Higginbotham heard from insiders that the Matter group felt pressured to release something, even if it was scaled back from its original promises. And as you might imagine, a lot of bugs and questions come up when more than 250 previously siloed companies start working together on something.

So Matter is just a new ZigBee with more corporate buy-in?

No, Matter is an interoperability standard, with many connection options available to devices. Under Matter, devices can talk to each other over standard Wi-Fi, Ethernet, Bluetooth Low-Energy, or Thread, another IEEE 802.15.4 standard (we'll get to Thread a bit later).

If you have an extensive network already set up with ZigBee or Z-Wave, it might still fit into a Matter network. Hub makers are gradually announcing firmware updates to allow for Matter compatibility, allowing them to serve as a bridge between their mesh and Matter-ready controllers and devices. Before it rebranded as the CSA, the ZigBee Alliance announced that it would work with the Thread Group to create compatible application layers.


The Biden administration issues sweeping new rules on chip-tech exports to China – Protocol

"I would say down the road, we will be known for more than just security. And we're starting to see that today," Kurtz said.

CrowdStrike brings plenty of credibility from its work in cybersecurity to its effort to penetrate the broader IT space, according to equity research analysts who spoke with Protocol. The company recently disclosed surpassing $2 billion in annual recurring revenue, just 18 months after reaching $1 billion. And even with CrowdStrike's scale, it's continued to generate revenue growth in the vicinity of 60% year-over-year in recent quarters.

In a highly fragmented market like cybersecurity, this type of traction for a vendor is unique, said Joshua Tilton, senior vice president for equity research at Wolfe Research. "They're sustaining [rapid] growth and profitability, which is very rare in this space."

At the root of CrowdStrike's surge in adoption is its cloud-native software platform, which allows security teams to easily introduce new capabilities without needing to install another piece of software on user devices or operate an additional product with a separate interface. Instead, CrowdStrike provides a single interface for all of its services and requires just one software agent to be installed on end-user devices.

As a result, CrowdStrike can tell existing customers who are considering a new capability, "You already have our agent: turn it on, try it out," Kurtz said. "And if you like it, keep it on. It's that easy."

For years, Kurtz has touted the potential for CrowdStrike to serve as the "Salesforce of security" thanks to this cloud-based platform strategy. But at a time when cybersecurity teams are looking to consolidate on fewer vendors and are short on the staff needed to operate tools, CrowdStrike's approach is increasingly resonating with customers, analysts told Protocol.

The company has now expanded well beyond endpoint detection and response, a category it pioneered to improve detection of malicious activity and attacks (such as ransomware and other malware) on devices such as PCs. Along with endpoint protection, CrowdStrike now offers security across cloud workloads, identity credentials, and security and IT operations.

The cloud-native platform concept is still early on for cybersecurity, but if CrowdStrike's momentum continues, it's poised to potentially become the first "fully integrated, software-based platform" in the security industry, Tilton said. That's in contrast to other platform security vendors that are hampered by architectures that predated the cloud, or that rely on hardware for some of their functionality.

"CrowdStrike's DNA is that they've come as a cloud-native company with a focus on security from day one," said Shaul Eyal, managing director at Cowen. "It does provide them with an edge."

Even with CrowdStrike's advantages, there are no guarantees it will maintain a leading position in a market as large and competitive as endpoint security. There, the company faces a fierce challenge from Microsoft and its Defender product. It's a topic Kurtz is as outspoken as ever about.

In regards to Microsoft, "if you are coming out with zero-day vulnerabilities on a weekly basis, which are being exploited, that doesn't build trust with customers," Kurtz said.

"I'm not saying they're not going to win deals. Because they're Microsoft, sure, they're going to win some deals," he said. "But we do see deals boomerang back our way when someone has an issue. Many of the breaches that we actually respond to [are for customers with] Microsoft endpoint technologies in use."

Even so, Microsoft brings plenty of advantages of its own in terms of its security approach, analysts told Protocol. Much of the business world counts itself as part of the Microsoft customer base already, and the company has seen major success in bundling its Defender security product into its higher-tier Office 365 productivity suite, known as E5. As of Microsoft's quarter that ended June 30, seats in Office 365 E5 climbed 60% year-over-year, the company reported.

And for every CISO who thinks it doesn't make sense to trust Microsoft on security due to vulnerabilities in its software products, there is another CISO who thinks Microsoft's ubiquity in IT is exactly why the tech giant is worth leveraging for security, Tilton said.

Beyond the successful bundling strategy, Microsoft has overall done "an exceptional job of elevating security within their product portfolio," said Gregg Moskowitz, managing director and senior enterprise software analyst at Mizuho Securities USA.

Still, "we do typically hear that Microsoft has limitations when it comes to what an enterprise's requirements are across some of these cybersecurity areas," including on endpoint, Moskowitz said. At the same time, "we do believe Microsoft's going to get a lot stronger over time," he said.

IDC figures have shown CrowdStrike in the lead on endpoint security market share, with 12.6% of the market in 2021, compared to 11.2% for Microsoft. CrowdStrike's growth of 68% in the market last year, however, was surpassed by Microsoft's growth of nearly 82%, according to the IDC figures.

Still, Kurtz argued that CrowdStrike has the leg up in endpoint for plenty of other reasons beyond the lack of the same security baggage via vulnerability issues at Microsoft.

The chief advantage goes back to CrowdStrike's single-agent architecture, which he said requires fewer staff to operate and has a lower impact on user devices. That translates to better performance and less use of memory because the product does not rely on analyzing digital patterns, known as signatures, for signs of an attack.

I would say down the road, we will be known for more than just security. And we're starting to see that today.

All of these factors need to be considered when doing the math around how much it will cost to implement an endpoint security product into an operation, Kurtz said. Based on that math, "we are significantly cheaper to operationalize than Microsoft," he said.

CrowdStrike has particularly stood out with customers when it comes to the lower performance impact from its Falcon product line, said John Aplin, an executive security adviser at IT services provider World Wide Technology.

The company recently worked with one of the largest U.S. banks to select a new endpoint security product, and the choice came down to CrowdStrike or Microsoft Defender, he said. While the bank was initially tempted to utilize its E5 licensing and go with Defender, Aplin said, extensive testing revealed Falcon's comparatively lighter-weight impact on devices, prompting the customer to pick CrowdStrike.

Performance impact is not a trivial thing when customers are often running 40 to 70 different security tools, he said. So while being able to provide reliable security is obviously important, the "operational effectiveness" in areas such as performance impact on devices is "where CrowdStrike always wins," he said.

The reputation for trustworthy security that CrowdStrike has built since its founding in 2011 shouldn't be minimized as a factor either, according to Wolfe Research's Tilton.

By and large, CISOs make purchasing decisions "based on the amount of minutes of sleep at night" they expect to get from a product, he said. CrowdStrike's "first-mover" advantage in endpoint detection and response is a huge one, and its brand awareness is virtually unmatched in security, probably on par only with that of Palo Alto Networks, Tilton said.

While some smaller challengers, chiefly SentinelOne, have made headway in the endpoint security space, they have an uphill battle, he said. In endpoint security, "the CISO has to have a good reason to not buy CrowdStrike."

In categories outside of endpoint security, CrowdStrike doesn't yet enjoy the same stature. But in some areas, such as identity security, it's on track to get there quickly.

Misuse of credentials has emerged as the biggest source of breaches by far as workers have moved outside of the protections of the office firewall, according to Verizon. While CrowdStrike isn't trying to compete with identity management vendors such as Okta or Ping Identity, the company does believe it's found a sweet spot in helping customers to counter identity-based threats, Kurtz said.

Following its fall 2020 acquisition of identity security vendor Preempt Security, CrowdStrike has added identity protection and detection capabilities to its platform, and customer adoption has been "like a rocket ship," Kurtz said. During CrowdStrike's fiscal second quarter, ended July 31, customer subscriptions to the company's identity protection module doubled from the previous quarter.

That's a "stunning level of adoption from customers," Mizuho's Moskowitz said. Given that CrowdStrike paid $96 million for Preempt, "that's clearly one of the best small to midsize acquisitions that we've seen in software in recent years," he said.

CrowdStrike refers to its various add-on security capabilities as modules, and currently has 22 in total, up from 11 in late 2019. A forthcoming module based on the company's planned acquisition of startup Reposify will be aimed at spotting exposed internet assets for customers, bringing CrowdStrike into the very buzzy market for external attack surface management.

Besides identity protection, the company's other fastest-growing module at the moment is data observability, based on its early 2021 acquisition of Humio, which was recently rebranded to Falcon LogScale. And while highly applicable to security, observability focuses on tracking and assessing many types of IT data. Observability enables customers to "do things that are not just security-related," Kurtz said, such as deploying software patches and taking other actions to improve IT hygiene.

George Kurtz, CEO of CrowdStrike. Photo: Michael Short/Bloomberg via Getty Images

In total, CrowdStrike reported that it was generating $2.14 billion in annual recurring revenue as of its latest quarter, with its "emerging products" category contributing $219 million. ARR for those emerging products, which include identity protection and observability but not more-established areas for CrowdStrike such as workload protection, surged 129% from the same period a year before.

Looking ahead, "we'll continue to solve problems that are outside of core endpoint protection and workload protection, but are related, in the IT world," Kurtz said.

Even within cybersecurity itself, CrowdStrike's emphasis on observability "shows that the industry is starting to recognize that cybersecurity is a data problem," said Deepak Jeevankumar, a managing director at Dell Technologies Capital, who had led an investment by the firm into Humio.

CrowdStrike has no ambitions to get into areas such as network or email security, Kurtz noted. But if a certain business challenge involves collecting and evaluating data from endpoints or workloads, whether that's IT or security data, "we can do that," he said.

Application security is another future area of interest, Kurtz said. Given the criticality of many business applications, "understanding their security, who's using them, how they're being used: that's important for organizations of many sizes to have that level of visibility and protection."

Within security, CrowdStrike is also notably embracing an approach that's come to be known as extended detection and response, or XDR, for correlating data feeds from a variety of different security tools. CrowdStrike's XDR approach taps into data both from its own products and from third-party tools, including vendors in its CrowdXDR Alliance that have technical integrations with CrowdStrike.

While XDR is no doubt an industry buzzword, it's the most effective way yet to put the pieces together and understand how a cyberattack occurred, Kurtz said. "Before XDR, we were sort of blind to how [an attacker] got to the endpoint," he said. "Now we're able to tell the whole story."
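At its core, XDR is a correlation exercise: normalized events from endpoint, identity, email, and other tools are joined on shared attributes such as host and time so an analyst can reconstruct the chain of an attack. Below is a minimal Python sketch of that idea; the event fields, sources, and 30-minute window are hypothetical illustrations, not CrowdStrike's actual schema, API, or detection logic.

```python
from datetime import datetime, timedelta

# Hypothetical, already-normalized events from different security tools.
# Field names and values are illustrative only, not any vendor's real schema.
events = [
    {"source": "email_gateway", "host": "wks-042", "time": datetime(2022, 10, 12, 9, 3), "detail": "phishing link clicked"},
    {"source": "identity", "host": "wks-042", "time": datetime(2022, 10, 12, 9, 7), "detail": "anomalous login for user jdoe"},
    {"source": "endpoint", "host": "wks-042", "time": datetime(2022, 10, 12, 9, 12), "detail": "suspicious child process spawned"},
    {"source": "endpoint", "host": "srv-007", "time": datetime(2022, 10, 12, 2, 40), "detail": "routine scheduled scan"},
]

def correlate(events, window=timedelta(minutes=30)):
    """Group events by host, then flag hosts where events from more than one
    source fall within a single time window - a crude cross-tool timeline."""
    by_host = {}
    for ev in sorted(events, key=lambda e: e["time"]):
        by_host.setdefault(ev["host"], []).append(ev)

    incidents = {}
    for host, evs in by_host.items():
        sources = {e["source"] for e in evs}
        span = evs[-1]["time"] - evs[0]["time"]
        if len(sources) > 1 and span <= window:
            incidents[host] = evs
    return incidents

for host, timeline in correlate(events).items():
    print(f"Possible incident on {host}:")
    for ev in timeline:
        print(f"  {ev['time']:%H:%M} [{ev['source']}] {ev['detail']}")
```

A real pipeline would of course join far richer telemetry (process trees, user sessions, network flows) at much larger scale, but the underlying idea of stitching multi-source data into a single timeline is the same.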

CrowdStrike offers a number of managed security services as well, which the vendor was quick to recognize as an important option amid the cybersecurity talent shortage, according to Peter Firstbrook, vice president and analyst at Gartner.

"CrowdStrike actually perfected this," Firstbrook said. "They ran into this roadblock early. Customers said, 'Look, this [technology] is really cool. But we don't have anybody that can manage it.'"

Ultimately, CrowdStrike is well positioned at a time when CISOs are fed up with going to dozens of different vendors to meet their security needs, Cowen's Eyal said. The current refrain from CISOs is, "'We want to deal with the Costco or the Walmart, the big supermarket, for all of our security needs,'" he said. In that respect, "the platform approach is absolutely going to be benefiting [vendors] like CrowdStrike."

Over the years, Kurtz said he hasn't backed away from comparing CrowdStrike with Salesforce for a good reason: It's a meaningful comparison, which has only gotten more so as time has gone on.

"I've said this since I started the company, that we wanted to be that 'Salesforce of security' to have a true cloud platform that would allow customers to do more things with a single-agent architecture," he said. "We haven't really deviated from that."


Read More..

BOB Recruitment 2022 for IT Professionals: Check Vacancies, Apply Online Till Oct 24 – StudyCafe

BOB Recruitment 2022: Bank of Baroda (BOB) is looking for qualified candidates for Cloud Engineer, Application Architect, Enterprise Architect, Infrastructure Architect, Integration Expert and Technology Architect positions. The total number of vacancies for these posts is 12. Interested candidates should review the job description and apply using the link provided in the official notification. Candidates who have a B.E./B.Tech. in Computer Science or Information Technology will be given preference. The last date for application submission is 24.10.2022 (23:59 hours).

Candidates are requested to apply for the job post before the deadline. No application will be entertained after the stipulated time and date, and incomplete applications or applications received after the specified time and date will be rejected. All details regarding this job post, such as the official BOB Recruitment 2022 notification, age limit, and eligibility criteria, are given in this article.

1. Cloud Engineer: Candidates who have a B.E./B.Tech. in Computer Science or Information Technology will be given preference. Minimum 10 years of technical and IT experience, of which at least 5 years should be in the field of cloud computing.

2. Application Architect: B.E./B.Tech. in Computer Science or Information Technology. The candidate should have a minimum of 10 years of technical and IT experience, of which at least 5 years should be as an Application Architect. Experience as an Application Architect in alternate delivery channels (e.g., CBS, LOS, LMS, etc.) will be preferred, as will experience in Agile methodology, Core Java, and Linux/Unix servers.

3. Enterprise Architect: B.E./B.Tech. in Computer Science or Information Technology. Minimum 10 years of technical and IT experience, of which at least 5 years should be in architecting, designing and managing banking platforms.

4. Infrastructure Architect: B.E./B.Tech. in Computer Science or Information Technology. Minimum 10 years of technical and IT experience, of which at least 5 years should be in architecting, designing and managing banking platforms.

5. Integration Expert: B.E./B.Tech. in Computer Science or Information Technology. Candidates with professional certifications in OS (Unix/Linux), middleware, storage, and load balancers will be preferred. Minimum 10 years of technical and IT experience, of which at least 5 years should be in designing and building large IT infrastructure projects.

6. Technology Architect: B.E./B.Tech. in Computer Science or Information Technology. Minimum 10 years of technical and IT experience, of which at least 5 years should be in the integration process of banking platforms.

The minimum age limit to apply for this recruitment is 32 years and the maximum age limit is 45 years.

Remuneration will be offered based on the candidate's qualifications, experience, overall suitability, last drawn salary, and market benchmarks for the respective posts, and shall not be a limiting factor for suitable candidates.

1. Cloud Engineer:

a. Design, implement and manage secure, scalable, and reliable cloud infrastructure environments.

b. Propose and implement cloud infrastructure transformation to modern technologies and methods used to run microservices application architectures.

c. Build, troubleshoot, and optimize container-based cloud infrastructure.

d. Ensure operational readiness for launching secure and scalable workloads into public and hybrid cloud environments.

e. Validate existing infrastructure security, performance and availability and make recommendations for improvements and optimization.

f. Ensure backups, resilience, and business continuity.

g. Implement infrastructure best practices.

2. Application Architect:

a. Design and validate application architecture design and other technology architecture.

b. Estimate design efforts, define detailed schedules, evaluate technologies, develop prototypes, and architect design.

c. Change application architecture as per business needs and technology changes.

d. Understand and apply architectural principles, processes, standards and guidelines.

e. Understand, document, and monitor application layering dependencies (User-Interface, Deployment, Public Interface, Application Domain, Application Infrastructure, Technical Frameworks, and Platforms) and application component dependencies.

f. Document and maintain context diagrams, functional architectures, data architecture, and messaging architecture diagrams and descriptions.

g. Understand and monitor impacts to, and dependencies between, existing technical and network environments.

3. Enterprise Architect:

a. Set up technical standards and governance structure for the enterprise.

b. Assist business strategy and accordingly drive technology strategy from an architecture perspective.

c. Provide technology architecture expertise and guidance across multiple business divisions and technology domains.

d. Setting up technical standards and formulating an Enterprise Architecture (EA) Governance Framework.

f. Driving technology strategy from an architecture perspective, across a portfolio of applications in the Bank, for resource optimization and risk mitigation.

g. Translating business requirements into specific system, application, or process designs, including working with business personnel and executives to identify functional requirements.

h. Define/ maintain Target Architectures in Roadmaps.

i. Lead and/or assist efforts to scope and architect major change programs, leading strategic options analysis & proposing end-to-end solutions & highlighting trade-offs.

j. Review ongoing designs of major programs to identify strategic opportunities and resolve design issues during delivery.

4. Infrastructure Architect:

a. Assist in the development of the overall technology strategy with a critical focus on enterprise and platform architecture.

b. Responsible for the design of systems and interfaces both internal and external.

c. Identifying and integrating overall integration points in the context of a project as well as other applications in the environment.

d. Defining guidelines and benchmarks for non-functional requirement considerations during project implementation.

e. Review architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, non-functional requirements, etc., against a predefined checklist and ensure that all relevant best practices are followed.

f. Providing a solution to any issue that is raised during code/design review and justifying the decision taken.

5. Integration Expert:

a. Designing, articulating and implementing architectural scalability.

b. Work in close collaboration with the application architect to ensure optimal infrastructure design.

c. Draw a long-term enterprise-level IT Infrastructure Plan.

d. Ensure that availability requirements are met in the design.

e. Validate all Infrastructure Changes and obtain necessary approvals from the competent authority.

f. Interact with IT Partners, Consultants and internal stakeholders.

g. Evaluate infra technology and industry trends, and identify the prospective impact on business.

h. Participate in developing and managing the ongoing enterprise architecture governance structure on the basis of business and IT strategies.

i. Promote the organization's architecture process and results to business and IT departments.

j. Lead and direct the preparation of governing principles to guide decision-making related to infrastructure architecture.

6. Technology Architect:

a. Collaborate on the successful integration of hardware, software and Internet resources.

b. Strong experience in Middleware and Infrastructure management.

c. Assist in planning and implementing a variety of technological opportunities.

d. Assist in the creation, maintenance, and integration of technology plans.

e. Ability to lead teams to successful end results

f. Strategic planning and continuous improvement mindset relevant to technology processes and systems. Assess technology skill levels of co-workers and customers.

1. Cloud Engineer:

a. Strong consulting experience with large-scale migrations to Cloud Providers such as Azure, AWS, Google, and IBM.

b. Knowledge of infrastructure solutions, platform migration, system security, and enterprise directories.

2. Application Architect:

a. Experience as Application Architect in Alternate Delivery Channels (eg: CBS, LOS, LMS etc.).

b. Experience in AGILE Methodology/Core JAVA/LINUX/UNIX Server preferred.

c. Deep understanding of cloud computing in one or more of the following Core Platform domains: Compute (IaaS & PaaS), Storage, and Networking.

3. Enterprise Architect:

a. Strong knowledge of enterprise architecture and design, including architecture frameworks such as TOGAF (TOGAF certification preferred).

b. Strong knowledge of technologies such as APIs, SOA, programming languages, cloud hosting practices and big data technologies.

4. Infrastructure Architect:

a. Overall understanding of banking technology systems and processes, with a track record of having successfully built innovations.

b. Experience implementing core banking, delivery channels, payment systems and other digital banking solutions.

5. Integration Expert:

a. Experience in designing and building large IT infrastructure projects encompassing hardware, virtualization, and middleware layers.

b. Candidates with Professional certifications on OS (Unix/Linux), Middleware, Storage, and Load Balancer are preferred.

6. Technology Architect:

a. Understanding of IT architecture like SOA and integration methodologies like ESB and APIs.

b. Strong knowledge of development environments, middleware components, databases and open-source technologies.

c. Understanding of solutions for platform and application layers.

Interested candidates are advised to visit the Bank's website http://www.bankofbaroda.co.in (Career Page, Current Opportunities section) for further details, or to apply for the said post using the link given in the official notification. The last date for submission of the application is 24.10.2022 (23:59 hours).

Step 1: Go to the BOB official website.

Step 2: Search for the BOB Recruitment 2022 Notification here.

Step 3: Read all of the information in the notification.

Step 4: Apply and submit the application form in accordance with the mode of application specified in the official notification.

To Read Official Notification Click Here

Disclaimer: The recruitment information provided above is for informational purposes only and has been taken from the official site of the organisation. We do not provide any recruitment guarantee. Recruitment is to be done as per the official recruitment process of the company or organization that posted the vacancy. We don't charge any fee for providing this job information. Neither the author nor Studycafe and its affiliates accept any liability for any loss or damage of any kind arising out of any information in this article, nor for any actions taken in reliance thereon.

Here is the original post:
BOB Recruitment 2022 for IT Professionals: Check Vacancies, Apply Online Till Oct 24 - StudyCafe

Read More..

DoorDash Hacker Incident Illustrates Third-Party Vendor Risks and Potential Vulnerabilities – JD Supra

Hackers have increasingly focused on third-party vendors as avenues to data held by associated businesses. On August 25, 2022, DoorDash announced that it had experienced a data breach which impacted the personal information of certain customers and drivers. After detecting unusual activity originating from one of its third-party vendors, an investigation by DoorDash revealed that the vendor was the target of a phishing campaign. This comes just a few years after DoorDash customer data was breached in a similar hack in 2019, which was also linked to a third-party vendor. Unfortunately, DoorDash is not alone in experiencing the security risks linked to many third-party vendors.

Several companies have been exposed to data breaches by their third-party vendors in recent years. These hacks have resulted in lawsuits from consumers as well as government investigations. Failing to secure consumer data and monitor the cybersecurity practices of third-party vendors may open businesses up to state and federal enforcement actions.

Third-party vendors have significant access to the systems and data used by the companies that they work with. Many enterprises also contract with more than one third-party vendor, increasing the number of ways that information could be leaked. Hackers have learned to exploit this access by targeting the third-party vendors, who may have less stringent cybersecurity measures than associated businesses. Third-party vendors may be more vulnerable to phishing attacks, like the one used to breach DoorDash, in which hackers use compromised emails to gain access to sensitive data. They have also been the targets of increased ransomware efforts and attacks against outdated hosting services that leave information open for unauthorized use.

Many companies may not discuss data security policies with their third-party vendors, which means they could inadvertently be entrusting their customers' information to others who are not prepared to prevent breaches. While companies are focused on the security of their own networks, they should be aware that the vulnerabilities of their third-party vendors may pose an even greater risk to their customer data. Failing to assess and guard against these risks leaves businesses vulnerable to lawsuits from their consumers as well as government enforcement actions.

To minimize some of these risks, companies should prioritize cyber and data security when working with third-party vendors. Companies should ensure that any third-party vendor they contract with has a cybersecurity plan that includes regular testing of protocols, documented efforts to fix any vulnerabilities, and communication of best practices to employees. Before agreeing to work with a vendor, businesses should ask how the vendor identifies data incidents and what its plan is to address any incident that may arise. Companies should also monitor what internal data each vendor has access to and consider whether the third-party vendor's security policies are sufficient compared to their own. Access controls should be implemented to monitor third-party data usage and alert on any unauthorized access that might originate with a third-party vendor.
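That access-control recommendation lends itself to a simple illustration: if each vendor's approved data scopes are recorded in an allowlist, audit-log entries can be checked automatically and any out-of-scope access flagged for review. The Python sketch below is a hypothetical example only; the vendor names, scope labels, and log format are invented, and in practice the entries would come from a SIEM or cloud-provider audit log rather than a hard-coded list.

```python
# Hypothetical allowlist: the data scopes each third-party vendor is approved to access.
VENDOR_SCOPES = {
    "delivery-analytics-co": {"order_metadata"},
    "support-desk-vendor": {"customer_contact", "ticket_history"},
}

# Hypothetical audit-log entries; in practice these would be pulled from a SIEM
# or cloud-provider audit log, not hard-coded.
access_log = [
    {"vendor": "delivery-analytics-co", "scope": "order_metadata", "user": "svc-analytics"},
    {"vendor": "support-desk-vendor", "scope": "payment_details", "user": "svc-support"},
    {"vendor": "unknown-partner", "scope": "customer_contact", "user": "svc-temp"},
]

def find_violations(log, allowlist):
    """Return every access entry whose scope is outside the vendor's allowlist
    (unknown vendors have an empty allowlist, so all of their access is flagged)."""
    return [entry for entry in log
            if entry["scope"] not in allowlist.get(entry["vendor"], set())]

for v in find_violations(access_log, VENDOR_SCOPES):
    # In production this would raise an alert (ticket, pager, SIEM rule) rather than print.
    print(f"ALERT: {v['vendor']} ({v['user']}) accessed '{v['scope']}' outside its approved scope")
```

The specific mechanism matters less than the principle: vendor access should be scoped, logged, and continuously compared against what was actually agreed.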

Contract language should also be drafted with data security in mind. To ensure fast and effective responses to cyber threats, third-party vendors should be obligated to report data breach incidents that they discover within a designated timeframe. Specific security requirements may also be established within a vendor contract. In the event that a data breach does occur, companies should consider adding an indemnity clause that would hold third-party vendors liable for any breach caused within their organization.

Bottom Line

Businesses should be aware of the cybersecurity risks associated with third-party vendors. When working with third-party vendors, companies should consider and assess the vendor's security protocols. Businesses and third-party vendors alike should invest in cyber insurance, and businesses should include strong indemnification language in their contracts with third-party vendors.

See the article here:
DoorDash Hacker Incident Illustrates Third-Party Vendor Risks and Potential Vulnerabilities - JD Supra

Read More..