
CCPA/CPRA Data Mapping: The Why, What, and How – JD Supra

How often does the word right show up in the text of the CCPA/CPRA?

Over 100 times.

Out of all those references to rights, the rights of businesses don't seem to be discussed often. In the CPRA, consumers get all the rights, while the word most associated with businesses is responsibility.

Businesses that are subject to the CPRA have responsibilities to their consumers: responsibilities to manage the proliferation of personal data across their organization, responsibilities to respond to consumer requests, responsibilities to protect consumer data, and more.

The only way to attend to those responsibilities is to know where you collect personal data, where you process it, where it's sent, whether it's adequately protected, and whether it's being handled compliantly.

In essence, if your business is subject to the CPRA, then it is imperative that you map your data and data processing activities. We'll explain why and how in this article.

Like most data privacy regulations, the CPRA does not directly require you to map your organization's data. However, if you knowingly refuse to map where, how, and why your organization processes personal information, then any violations involving unmapped (and therefore unknown) personal information under your control could be construed as negligence.

If you don't map your organization's personal data processing activities, how will you respond to consumer requests, apply the required protections, or demonstrate compliance?

Moreover, the CPRA not only requires you to manage the personal information you collect, but it also creates the concept of sensitive personal information.

Sensitive personal information includes data with the potential to cause harm to the associated consumer if it should be left unprotected, such as their medical information, Social Security number, sexual identity, and more. In order to apply the higher level of protection required by the CPRA to this information, you'll need to engage in sensitive data discovery to identify where it lives and flows in your organization.
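As an illustration of what the very first pass of sensitive data discovery can look like, here is a minimal Python sketch that scans free-text records for values shaped like Social Security numbers or email addresses. The record names and patterns are hypothetical, and real discovery tooling goes far beyond simple regular expressions.

```python
import re

# Hypothetical patterns for a first-pass scan; real discovery tools use far
# broader detection (data types, context, validation) than simple regexes.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_records(records):
    """Flag which records contain values that look like sensitive fields."""
    findings = []
    for record_id, text in records.items():
        for label, pattern in PATTERNS.items():
            if pattern.search(text):
                findings.append((record_id, label))
    return findings

# Example: two free-text records pulled from a hypothetical support system
sample = {
    "ticket-1001": "Customer called about claim, SSN 123-45-6789 on file.",
    "ticket-1002": "Follow-up scheduled for next week.",
}
print(scan_records(sample))  # [('ticket-1001', 'ssn')]
```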

How do you actually approach mapping your organizations data in the context of the CPRA? There are a few different strategies, each of which will suit different kinds of organizations.

For very small organizations, or organizations that know they have only a handful of essential systems to map, the manual approach can work.

Under this approach, you'll develop spreadsheets that log all relevant compliance information associated with a given store of personal information, such as who owns or controls the systems, where the data is sourced from, where it is sent to, and so on.

Once your spreadsheet library is complete, you can simply contact the system owner to carry out any requisite tasks, such as fulfilling DSARs and auditing contracts for data processing addenda.
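To make the spreadsheet approach concrete, here is a rough sketch of the kind of record each row of such a data map might capture. The field names are illustrative only, not a prescribed CPRA schema.

```python
from dataclasses import dataclass, field

@dataclass
class DataMapEntry:
    """One row of a hypothetical CPRA data map: a single system holding personal data."""
    system_name: str                 # e.g. "CRM"
    owner: str                       # who to contact for DSARs and audits
    data_categories: list = field(default_factory=list)   # e.g. ["email", "purchase history"]
    sources: list = field(default_factory=list)           # where the data comes from
    destinations: list = field(default_factory=list)      # vendors or systems it is sent to
    contains_sensitive: bool = False  # triggers the CPRA's higher protection bar
    dpa_in_place: bool = False        # data processing addendum signed?

crm = DataMapEntry(
    system_name="CRM",
    owner="sales-ops@example.com",
    data_categories=["name", "email", "purchase history"],
    sources=["web signup form"],
    destinations=["email marketing vendor"],
    dpa_in_place=True,
)
print(crm.system_name, crm.owner)
```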

It doesn't take much to see the flaws in this approach, however; if you have more than a handful of systems that process personal data, then the task of creating and maintaining a spreadsheet-based data map quickly becomes untenable. In fact, the average company uses 130 different SaaS applications; many, if not most, of those systems will be handling consumer data in some fashion.

That's treating each system as equal, too. In reality, some systems will contain more or less personal information, sensitive personal information, subsystems, connected vendors, and so on.

Some organizations may have data science resources in place, whether that's a team of experts, a homegrown solution, or an off-the-shelf business intelligence tool. These businesses are in a better position to map their organization's data for CPRA compliance than those relying on the manual approach, but there are still issues to overcome.

For one, multipurpose data science resources will be in high demand. After all, data science falls under the broader umbrella of business intelligence, and compliance isn't typically thought of as a business intelligence activity. Although a data science asset will technically be faster at CPRA data mapping than a manual approach, you may have to wait a long time before it's your turn.

Then, there is also the likelihood that a homegrown approach to CPRA data mapping will still require a great deal of manual effort. Data science experts aren't data privacy and compliance experts, after all; they're data science experts. A privacy professional will need to review the output and fill in the metadata necessary to make your data map actionable from a compliance perspective.

Given how essential data mapping is to an effective privacy program, there are data mapping solutions designed specifically for data privacy and compliance professionals. Osano Data Mapping is one such example.

Rather than rely on manual discovery or require data science expertise, Osano Data Mapping quickly uncovers systems that contain personal information by integrating with your Single Sign On (SSO) provider.

Based on criteria like the number and types of data fields, vendor flows, and identities managed, Osano Data Mapping assigns systems a risk score that enables privacy professionals to prioritize by risk and effort. Any systems that live outside of your SSO can be easily mapped using an automated workflow that keeps external stakeholders alert to any outstanding tasks.
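Osano does not publish its scoring formula, so the weights below are purely illustrative, but a simple sketch shows how criteria like field counts, sensitive data, vendor flows, and managed identities can be combined into a sortable risk score.

```python
def risk_score(num_fields, has_sensitive_data, num_vendor_flows, identities_managed):
    """Illustrative risk score: more personal data fields, sensitive data,
    outbound vendor flows, and managed identities all push the score up."""
    score = 0
    score += min(num_fields, 50)            # cap so one factor can't dominate
    score += 40 if has_sensitive_data else 0
    score += 10 * num_vendor_flows
    score += identities_managed // 1000     # one point per thousand identities
    return score

systems = {
    "CRM": risk_score(num_fields=30, has_sensitive_data=True,
                      num_vendor_flows=4, identities_managed=250_000),
    "Wiki": risk_score(num_fields=2, has_sensitive_data=False,
                       num_vendor_flows=0, identities_managed=500),
}

# Highest-risk systems first, so privacy teams can prioritize effort
for name, score in sorted(systems.items(), key=lambda kv: kv[1], reverse=True):
    print(name, score)
```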

The benefit of using a privacy-focused solution like Osano for CPRA data mapping is twofold: discovery is automated rather than dependent on manual effort or data science resources, and the resulting map carries the compliance context privacy professionals need to prioritize and act.

Continue reading here:

CCPA/CPRA Data Mapping: The Why, What, and How - JD Supra


Optimizing Resilience When Migrating Legacy Apps to the Cloud – ITPro Today

One of the principal advantages of platforms like SAP resides in their integrity, reliability, and resilience. These attributes make such platforms appealing hosts for mission-critical data and workloads in enterprise environments. Conversely, the cloud is lauded for its exceptional flexibility and scalability, with its cost-effectiveness rendering it a desirable alternative to traditional hosting platforms.

Consequently, enterprises find themselves confronted with a pertinent question: How can they combine the resilience inherent in legacy application platforms with the multitude of benefits offered by the cloud?


The answer lies in meticulously migrating legacy workloads to the cloud, ensuring resilience is prioritized at every point of the process. As this article explains, resilience does not carry over automatically during migration. By implementing astute measures before, during, and after the cloud migration, enterprises can attain a balance of resilience and flexibility. Such an equilibrium gives organizations far more opportunity to leverage the advantages of the cloud while preserving the resilience of legacy platforms.

Before delving into best practices for building resilience during the migration of legacy applications, it is important to understand what is meant by resilience.


At its core, resilience is the ability of workloads to remain available and perform reliably during strenuous operations, even when confronted with unexpected disruptions like server failures or application crashes.

Legacy platforms are endowed with a myriad of intrinsic resilience features. These include the capacity to integrate redundancy into the infrastructure, thereby preventing the failure of one component from slowing the workload, and the aptitude to isolate applications so that one application's issues do not affect others. Such features are integral to why enterprises have historically gravitated toward legacy application platforms for managing critical workloads.

To be sure, modern public clouds possess resilience capabilities, too, like the option of mirroring workloads across multiple availability zones to augment their reliability. However, resilience features in the public cloud are usually not enabled by default, and many come with added costs and trade-offs.

Therefore, when you migrate legacy apps to the cloud, it is essential to undertake calculated measures to ensure resilience. Simply moving to the cloud and choosing the default configurations is no guarantee that your apps will prove resilient against various disruptions, such as VM (virtual machine) crashes or insufficient resource allocations during times of peak demand.

Building resilience into legacy applications when they're migrated to the cloud requires action during each of the three stages of migration: during planning, during migration, and after the migration is complete.

In many ways, the pre-migration stage is paramount for purposes of resilience, given that the decisions made about how to migrate the application to the cloud and how to operate it once it is there bear significant implications for its resilience.

Start the planning process by assessing your application's current needs, including those related to resilience. For example, what extent of downtime can you tolerate for the app? Which types of processing power, storage, and other resources are required to operate it reliably? To what degree does the app's load vary, and how will your migration strategy accommodate that?
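The downtime question, at least, is easy to quantify. A quick sketch shows how an availability target translates into allowed downtime per year; the targets themselves are just examples.

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours, ignoring leap years

def allowed_downtime_hours(availability_pct):
    """Hours of downtime per year permitted by a given availability target."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for target in (99.0, 99.9, 99.99):
    print(f"{target}% availability -> {allowed_downtime_hours(target):.2f} hours/year of downtime")
# 99.0%  -> 87.60 hours/year
# 99.9%  -> 8.76 hours/year
# 99.99% -> 0.88 hours/year
```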

Contingent on the responses to the queries mentioned above, choose an appropriate cloud architecture for hosting the app. Is the strategy to lift-and-shift the application into a VM instance on a public cloud service like Amazon EC2? Or will you refactor it to take advantage of microservices architectures and container-based cloud hosting services, such as Amazon EKS? Will you leverage options like deploying the app across multiple availability zones, acknowledging that while this enhances resilience, it typically incurs added costs?

Finally, upon determining how you'll run your app in the cloud and understanding the requirements for the migration, identify the necessary tools and processes that will enable the migration process. All major public cloud providers offer tools for lifting and shifting on-prem apps into their cloud environments. However, should there be plans to refactor your app or make major changes to its architecture, a more hands-on approach may be necessary.

Throughout the actual migration, your goal should be to ensure resilience by monitoring the migration process continuously and maintaining preparedness to address any arising issues. If data that you transfer to the cloud fails to upload, for example, you will need immediate notification so that you can restart the transfer or shift to an alternative migration strategy (such as converting the data to a different format to work around transmission errors) and minimize unnecessary downtime.

Likewise, be sure to have playbooks in place to facilitate quick responses in the event of complications. Successful migrations also need experienced staff who understand both legacy applications and cloud environments and who can anticipate and minimize any unexpected issues that arise during migration.

Successful migration to the cloud doesn't mean the resilience mission has been successfully completed. On the contrary, even if the cloud architecture you chose is inherently resilient, there are opportunities to enhance resilience further after the migration.

This is where practices such as ongoing workload monitoring and auditing become pivotal. These processes provide data that can be scrutinized to check whether resilience matches anticipated levels. For example, monitoring might reveal a higher incidence of errors during peak loads than initially anticipated. Taking steps to mitigate those errors would likely improve application resiliency. Similarly, an audit might surface insecure configuration settings that potentially expose the application to attacks.
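As a rough illustration of how that monitoring data can be checked, here is a minimal sketch that compares hourly error rates against an expected ceiling. The samples and the 1 percent threshold are invented for the example.

```python
def error_rate(errors, requests):
    """Fraction of requests that failed in a monitoring window."""
    return errors / requests if requests else 0.0

# Hypothetical per-hour monitoring samples: (requests, errors)
peak_hours = [(12_000, 180), (15_000, 310), (14_500, 290)]
off_peak_hours = [(2_000, 4), (1_800, 3), (2_200, 5)]

EXPECTED_MAX_ERROR_RATE = 0.01  # 1%, assumed to come from pre-migration baselines

for label, hours in (("peak", peak_hours), ("off-peak", off_peak_hours)):
    worst = max(error_rate(e, r) for r, e in hours)
    flag = "investigate" if worst > EXPECTED_MAX_ERROR_RATE else "ok"
    print(f"{label}: worst hourly error rate {worst:.2%} -> {flag}")
```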

Equally important is ongoing application management and maintenance. Be sure that the application and the hosting cloud infrastructure are regularly patched, using tools such as AWS (Amazon Web Services) Systems Manager, Azure Update Management, and Google Cloud OS Config.

Finally, be sure to monitor cloud hosting costs to verify that the investment in resilience capabilities such as redundant VM instances is aligned with budgetary expectations.

Shifting applications to the cloud doesn't automatically guarantee strong levels of resilience. The opposite can happen if you migrate an app from a legacy platform to the cloud without factoring in resilience requirements.

By putting resilience at the forefront during each phase of cloud migration (before, during, and after), you can achieve an app that maintains its robustness, all while reaping the benefits of the cloud's flexibility, scalability, and cost-efficiency. This balance is precisely what successful application modernization aims to accomplish.

About the author: Kausik Chaudhuri is Chief Innovation Officer at Lemongrass.

See original here:
Optimizing Resilience When Migrating Legacy Apps to the Cloud - ITPro Today


Eufy is introducing cross-camera people tracking on its new security … – The Verge

Anker's smart home brand Eufy is revamping its camera lineup with a new SoloCam battery-powered outdoor camera ($199.99), Floodlight Cam E340 ($219.99), Video Doorbell E340 ($179.99), and Indoor Cam S350 ($129.99).

The signature hardware feature of the new line is its dual lenses, with one camera that's wide angle and one telephoto for zooming in to better identify faces and license plates. But there's also a new standout ability: cross-camera tracking and video splicing.

Multiple cameras tracked this Amazon delivery worker from the vehicle to the gate, and Eufy outputs a spliced video for you to watch. Source: Eufy, GIF by Umar Shakir / The Verge

"This technology is taken from the banking and traffic industry," Eufy spokesperson Brett White told The Verge in a briefing.

Each camera can pick up motion and follow it, so you now get one spliced video showing every event across each camera. The new tracking feature works with Eufy's HomeBase 3, a hub for its cameras. The HomeBase uses on-device AI to identify a person on each camera connected to it, splice all the videos together, and then send just one notification and one video.

The Eufy HomeBase 3 ($149.99) also adds AI-powered smart alerts to the cameras for free, including facial recognition and person, pet, and vehicle detection. The HomeBase 3 now also comes with a 1TB hard drive. Locally stored recordings are accessible for free with no subscription, and paid cloud storage is also available.

On-device AI and locally stored recordings are features that should appeal to privacy-conscious consumers, not just those looking to dodge cloud fees. But note that Anker's Eufy brand was the one we caught in a privacy scandal.

The new cross-camera tracking feature will be part of a free beta trial at launch, and White says a final version of the feature will be released in the fourth quarter of 2023. Pricing for the feature will arrive later this year, but it looks like it will be part of a subscription.

Older cameras that work with the Eufy HomeBase 3 will also add this cross-tracking capability, so you don't have to buy new cameras to get the feature. White says that almost every Eufy camera device, except for some wired cameras and wall light cams, now has full compatibility with HomeBase 3.

Here's a look at the new cameras:

Eufy SoloCam S340 $199.99

The SoloCam S340 can indefinitely power itself from the sun. Image: Eufy

A dual-lens battery-powered outdoor camera with a built-in solar panel, the S340 features a 3K resolution wide-angle lens and a 2K telephoto lens with an 8x hybrid zoom. It also has on-device AI, and the ability to track and zoom on a subject with a 360-degree pan and 70-degree tilt. Color night vision, two-way audio, and a 100-lumen spotlight plus four months of battery life and 8GB onboard storage round out the specs. It is compatible with the HomeBase 3.

Eufy Floodlight Cam E340 $219.99

Eufy's Floodlight Cam can also pan and tilt its cameras. Image: Eufy

The first Eufy camera with the ability to record 24/7, the new floodlight camera comes with an onboard microSD slot for local storage up to 128GB. A wired camera with two light panels for up to 2,000 lumens of light, it's designed to be mounted vertically on a wall. It has 360-degree coverage and can track a person in its field of view. It works with dual-band Wi-Fi 6 and has two-way audio, a built-in alarm, lighting schedules, and adjustable brightness.

Eufy Indoor Cam S350 $129.99

Eufy's Indoor Cam S350 looks like a baby robot, and can pan and tilt to follow family members throughout the home. Image: Eufy

The new indoor camera has 4K UHD resolution, 8x zoom, and 350-degree pan and tilt. It can be set to patrol points of interest, and AI tracking can keep a subject in view. It also features a privacy shutter and is powered by a USB cable.

Eufy Video Doorbell E340 $179.99

The Video Doorbell E340 looks at the ground so you can see packages / block surprise uppercuts. Image: Eufy

The successor to the Eufy Dual Video Doorbell, the E340 features two cameras: a 2K camera with color night vision up top for person detection and a second 1080p camera below for package detection. The doorbell can be installed with wired power or can operate on battery, and like the other cameras announced, it handles up to 60 days of event-based recordings locally (8GB storage built in), accessible without a subscription.

All four cameras are available starting today on Amazon and Eufy's website.

Eufy cameras work with Amazon Alexa and Google Home, but only the Eufy Cam 2 line is compatible with Apple Home. White says the company has been in discussions regarding Matter compatibility, for whenever the new smart home standard supports cameras. "It's a possibility, something we're looking at, but there's nothing confirmed," he says.

Continue reading here:
Eufy is introducing cross-camera people tracking on its new security ... - The Verge


Supermicro Introduces New All-in-One Open RAN System … – PR Newswire

Expanded Edge Server Portfolio Delivers Improved Performance and Power Efficiency for Open RAN and Intelligent Edge Workloads

SAN JOSE, Calif. and LAS VEGAS, Sept. 25, 2023 /PRNewswire/ -- Supermicro, Inc. (NASDAQ: SMCI), a Total IT Solution Manufacturer for AI, Cloud, Storage, and 5G/Edge, announces the expansion of its portfolio of purpose-built servers for Edge AI and Telco workloads. The new Supermicro X13 Edge server, the SYS-211E-FRN13P, delivers a scalable, integrated Distributed Unit (DU) Commercial Off The Shelf (COTS) server. As virtualized Open RAN technology has matured to the point where it's become proven, companies are looking for solutions that enable them to optimize deployments and reduce costs. This solution means shifting the emphasis to attributes such as cost, power consumption, size and weight, and scalability.

"We are very excited to deliver all-in-one servers for the next generation of telco and edge deployments for vRAN and private 5G environments," said Charles Liang, president and CEO of Supermicro. "Our range of telco offerings allows for a more streamlined deployment at scale, which will expand the use of these new technologies to deliver more effective and reliable communication networks at scale."

Explore Supermicro's Edge Servers

Supermicro's latest edge platform is specifically designed to meet those requirements. Based on 4th Gen Intel Xeon Scalable processors with Intel vRAN Boost, it features fully integrated vRAN acceleration that eliminates the need for an external acceleration card, thereby substantially reducing system power requirements and complexity. The system also features an onboard network interface and 12 SFP25G ports, eliminating the need for add-on cards and breakout cables, fully integrated timing support with eight hours of holding time, and a compact, long-life design. The Supermicro SYS-211E systems deliver a fully integrated server optimized for cost, size, and power usage, handling large volumes of traffic at the edge across multiple cell site configurations, including massive MIMO streams.

Learn more about Supermicro's new X13 Edge Server

Learn more about Supermicro 5G Products and Solutions

Additionally, Supermicro is launching a 4-node version of the SuperEdge, a versatile edge server designed to handle a range of demanding workloads at remote network locations. Each of the four nodes in this 2U rackmount server features a single-socket 4th Gen Intel Xeon Scalable processor and runs independently of the other nodes. This enables the system to run multiple workloads in parallel, each with dedicated resources. The Supermicro SYS-211TP offers 2 PCIe 5.0 x16 FHHL slots per node, allowing each individual node to be optimized with add-on cards to match its designated workloads, including running as a DU or Centralized Unit (CU) in RAN networks, MEC, and enterprise edge workloads.

"Supermicro continues to deliver the latest technology to market in their solutions for virtualized RAN and intelligent workloads across the edge," said Cristina Rodriguez, vice president and general manager, Wireless Access Network Division at Intel. "By using our broad portfolio of technology, including the newest 4th Gen Intel Xeon Scalable processors and Data Center GPUs, Supermicro can offer innovative server designs that provide the industry with powerful, highly optimized platforms for a range of use cases at the edge."

Supermicro is bringing new compact edge systems to remote deployments outside the data center, using the latest generation of Intel processors. Among these are the SYS-521AD-TN2 mini-tower, the E102-13R, and the E302-12A systems. The SYS-521AD and E102-13R are both based on 13th Gen Intel Core processors. The SYS-521AD mini-tower is optimized for video processing, streaming, and storage and can be used as an edge server for small and medium businesses. The E102 packs up to 16 cores, 64GB memory, and a range of ports and expansion slots in a mini 1U embedded form factor, ideal for AI inferencing, retail, and signage workloads. The E302, featuring the latest Intel Atom C5000 processor in a fanless compact design, delivers cost-efficient performance to remote locations in a durable, low-noise form factor.

A common feature of Supermicro's new systems for edge workloads is the emphasis on support for GPU accelerators and AI inferencing. An increasing number of these systems are compatible with accelerators, including the NVIDIA A100, L40, L40S, L4, A2, and T1000, the Intel Data Center GPU Flex 140 and the Intel Data Center GPU Flex 170, and even specialized accelerators such as the Hailo-8 AI processor. This flexibility enables customers to use application optimized Supermicro systems at the intelligent edge to match the specific requirements of their workloads, leading to better results and minimizing latency.

Visit Supermicro's booth #814 at MWC Las Vegas, September 26-28, to explore many of its new systems and experience their performance in real-world applications at the intelligent edge.

About Super Micro Computer, Inc.

Supermicro (NASDAQ: SMCI) is a global leader in Application-Optimized Total IT Solutions. Founded and operating in San Jose, California, Supermicro is committed to delivering first to market innovation for Enterprise, Cloud, AI, and 5G Telco/Edge IT Infrastructure. We are transforming into a Total IT Solutions provider with server, AI, storage, IoT, and switch systems, software, and services while delivering advanced high-volume motherboard, power, and chassis products. The products are designed and manufactured in-house (in the US, Taiwan, and the Netherlands), leveraging global operations for scale and efficiency and optimized to improve TCO and reduce environmental impact (Green Computing). The award-winning portfolio of Server Building Block Solutions allows customers to optimize for their exact workload and application by selecting from a broad family of systems built from our flexible and reusable building blocks that support a comprehensive set of form factors, processors, memory, GPUs, storage, networking, power, and cooling solutions (air-conditioned, free air cooling or liquid cooling).

Supermicro, Server Building Block Solutions, and We Keep IT Green are trademarks and/or registered trademarks of Super Micro Computer, Inc.

Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries.

All other brands, names, and trademarks are the property of their respective owners.

Photo - https://mma.prnewswire.com/media/2219244/Picture1.jpg

Logo - https://mma.prnewswire.com/media/1443241/4291994/Supermicro_Logo.jpg

SOURCE Super Micro Computer, Inc.

Go here to read the rest:
Supermicro Introduces New All-in-One Open RAN System ... - PR Newswire


Add Windows 11 features to Windows 10 with these helpful tools – PCWorld

Microsoft releases a new version of its two operating systems, Windows 10 and 11, every autumn. However, the software company has only delivered new features for Windows 11, limiting Windows 10 to basic upgrades alone.

Windows 10 also received an update a short time later, but you won't find any real innovations. It's limited to improvements that affect the quality, productivity, and security of Windows 10, per Microsoft.

That's disappointing for Windows 10 users, but you're not fully left behind. Here's how you can implement the new Windows 11 features into Windows 10 as well.

After years of stagnation, the Windows File Explorer is increasingly the focus of Microsoft developers. A feature that has been planned for years has made it into Windows Explorer with Windows 11 22H2: displaying multiple folders in different tabs. A new tab can be opened with the key combination Ctrl+T or the + button.

This is practical if you want to switch quickly between folders, for example to copy files. With the impending Windows 11 2023 Update, Microsoft will allow Windows Explorer to unpack Rar and 7-Zip files as well as the tar.gz and tar.bz2 archives commonly used under Linux. So far, as with Windows 10, only Zip archives are supported, but unpacking should be faster in the future.

Windows Explorer replica: The Files app hardly differs visually from Windows Explorer. The tool also offers tabs for Windows 10 and additionally a two-panel view.

IDG

For those who are used to Windows Explorer, the Files app is probably the best alternative at the moment. When installed via the Microsoft Store in Windows 10, Files costs $8.99, which you use to support the project. You can find a free download on GitHub. Files can also be installed free of charge via the Microsoft Store for Windows 11.

Files looks almost like Windows Explorer and can be operated in the same way. However, there are considerably more options. In the context menu of the tabs, for example, you will find Duplicate Tab and Open Tab in New Window. A new tab can be created with Ctrl+T or the + button. If you prefer to work with a split view, you can go to New panel in the three-dot menu on the far right of the toolbar. This allows two folders to be displayed side by side.

Configure files: Useful options can be activated in the settings, which can be reached via the cogwheel symbol at the top right. Under General > Start settings you can select Continue where you left off. Files then remembers the open folders and restores the view at the next start. In the Appearance section you can activate the Dark mode and also set other colors for the window.

Dealing with archive files: Via the context menu item Compress, files or folders can be packed into archives in the Zip and 7-Zip formats. Compress > Create archive leads to a dialogue in which you can specify the name and format as well as a password for the encryption. There are also settings for the compression level, and large 7-Zip archives can be saved in smaller individual files (split size). Zip, 7-Zip, and Rar archives can be opened by double-clicking, the contents viewed or files extracted.

The Files App

Microsoft

Other features: For the organization of files and folders, you can assign a tag such as Home or Work via the context menu item Edit tags, which Files displays in the Tags column. In the settings, activate the tags option under General > Widgets. The start page then shows Tags in addition to Quick Access and Drives, and the folders and files can be opened with a mouse click.

It is also possible to search for items with tags if you prefix the search term in the field at the top right with tag:. In the settings under Tags you can change the designations and add new tags.

The disadvantages of Files: In our tests, the tool mostly worked reliably, but occasionally it crashed. Compared to Windows Explorer, Files often opens folders and drives containing numerous items with a slight delay.

Two new Explorer functions of Windows 11 also in version 10: on top the practical tabs, below that the used capacity of OneDrive and other cloud storage.

IDG

It took a long time for Microsoft to add tabs to Windows Explorer. The practical feature, which you know from web browsers, enables the file manager to switch quickly between different folders and to copy or move data conveniently.

With Clover and QT Tab Bar, you have two tools that also equip Explorer with tabs under Windows 10. If you decide to use QT Tab Bar, install the tool, restart the PC, and open Windows Explorer. Now click on the down arrow in the View tab on the right under the Options symbol and activate the Qttabbar list entry. The new tab bar now appears in Windows Explorer.

The tabs are more closely aligned with the new Explorer design of Windows 11 with Clover. With this tool, however, you have to make an effort because the setup wizard shows Chinese characters. This is not a problem, however, because you only need to click the central button at the beginning and end of the installation. After that, the Windows 10 Explorer has tabs.

Microsoft has been planning a function that allows you to group several windows for several years. So far, however, nothing has been seen of this idea in Windows. Groupy offers more or less the functions that Microsoft may be planning for the future, and then some. You can try out the program free of charge for 30 days. A license for five PCs costs $9.99.

The first time you start Groupy, click on Start 30-day trial; then you have to enter your email address. In the confirmation email, click on the activation link.

Groupy adds a second tab bar above the title bar of the windows. In Windows Explorer the bar is always visible, in other windows it is only visible when you move the mouse to the area above the title bar. In Windows Explorer, a click on the + button opens the currently displayed folder again in a new tab. You can now navigate to different folders in both tabs. If you drag a file with the mouse from the Windows Explorer window to the other tab, the tab is activated and the file can be moved to the folder. To copy the file, press the Ctrl key.

Tabs with Groupy: The tool builds a bar with tabs above the title bar. These can display not only folders, but almost any application.

Groupy

Proceed in the same way for any other program: Open Windows Explorer and, for example, the Windows editor Notepad. Click on the Notepad window in the Groupy bar with the mouse and, while holding down the left mouse button, drag it to the upper area of the Windows Explorer window until Insert into group here appears. As soon as you release the mouse button, a new tab appears. Repeat this with all windows that you want to dock as tabs. You can undock a tab by simply dragging it away from the window and onto the desktop.

Alternatively, use the small button with the down arrow in the Groupy bar. In the menu you will see a list of open windows. Select the window for which you want to create a tab. Use All unordered windows on this monitor to create tabs for all windows.

You want to continue where you left off? In the context menu of a Groupy tab, select Group > Save group as, give it a meaningful name and click on Save. To load the group again later, go to Group > Saved Groups in the context menu and select the desired one.

In addition, the Explorer in Windows 11 now shows the data fill level of OneDrive. This useful feature can also be retrofitted and extended to other cloud storage right away.

This is how it works: Install Raidrive including the Runtime and Visual C++ modules that may be additionally required. After starting the program, click on + Add at the top, activate the entry OneDrive in the Personal tab and select a free drive letter for Drive. Clicking on Connect takes you to the OneDrive login, where you log in with your Microsoft account and allow Raidrive access to your cloud. To see the used storage space in Windows, click on View > Tiles or View > Contents in Explorer. In the same way, you can integrate Dropbox, Google Drive, et al into Explorer via Raidrive.

The Windowgrid tool makes it possible to arrange open program windows on the computer monitor as desired via the desktop grid that appears.

IDG

With Windows 11 version 22H2, Microsoft has expanded the Snap Layouts for quickly arranging program windows on the screen with the new Snap Bar. Instead of clicking precisely on a function icon, one simply drags an open window to the top of the desktop and selects the desired predefined placement.

Windowgrid also makes it possible to arrange windows under Windows 10. To do this, click the left mouse button at the top of the window, move it minimally, and now also press the right mouse button. This places a grid over the desktop. As soon as you release the right mouse button, you can place the window in the grid as desired by dragging it with the left button. Sounds complicated, but trial and error makes it immediately clear.

Another alternative is the Microsoft PowerToys (in the Microsoft Store) with the FancyZones feature. Because FancyZones offers many possibilities for placing and docking windows, the tool requires some training. Fortunately, we've got a FancyZones primer to help you wrap your head around it.

Also not limited to Windows 11 is the new video editor app Clipchamp, which is available under Windows 10 in the Microsoft Store. If the Store and Store apps are up to date, Windows 10 automatically offers the new app when the previous Video Editor app is called up.

Microsoft highlights the new efficiency mode in the task manager of Windows 11 version 22H2. This is not a unique feature, however, because it is already included in Insider builds of Windows 10 as Eco Mode.

It has not yet made it into the regular autumn update, but even without it you can prioritize processes without terminating them completely. To do this, right-click on a process entry in the Details tab of the task manager and continue with Set priority: In the test, the Low option reduced the CPU load of a demo application from almost 100 to 65 percent.
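If you prefer to script that change rather than click through the task manager, the third-party psutil package exposes the same Windows priority classes. This is a minimal sketch, assuming psutil is installed and using a hypothetical process name; it is not an official Microsoft tool.

```python
import psutil  # third-party: pip install psutil

def lower_priority(process_name):
    """Set all processes with the given name to below-normal priority on Windows."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == process_name.lower():
            try:
                proc.nice(psutil.BELOW_NORMAL_PRIORITY_CLASS)  # Windows-only priority class
                print(f"Lowered priority of PID {proc.pid}")
            except psutil.AccessDenied:
                print(f"Access denied for PID {proc.pid}; run as administrator")

lower_priority("demo_app.exe")  # hypothetical process name
```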

Sandboxie offers a safe environment to try out unknown software without risk. Virtual systems with VirtualBox or VMware Player require more effort.

Foundry

Microsoft already introduced the sandbox for safely trying out software and system settings in 2019, in Windows 10 Pro and Enterprise. It has remained limited to the two professional versions, and since the autumn update, the virtual system no longer loses all changes when rebooting as it did before, but only under Windows 11. Sandboxie Plus does the same and runs under Windows 10, including the Home Edition.

This is how it works: After installing and starting Sandboxie, right-click on Sandbox DefaultBox > Start in Sandbox > Start Program > Search in the interface. In the Programs or Programs (x86) folder, select the executable file of the software that is to start in the sandbox. In the next step, activate the option Start as UAC administrator, click on OK and, depending on the setting of the user account control (UAC), on Yes.

If you want to set up Windows as a complete virtual system, use VirtualBox or VMware Workstation Player. The ISO files for installing Windows 10 or 11 are available from Microsoft at http://www.microsoft.com/software-download.

The instant recovery tools Reboot Restore Rx and Time Freeze are convenient alternatives. They discard all changes made in the meantime and thus also remove potential malware. The tools also work with virtualization, but the virtual system does not have to be restarted.

With Smart App Control (SAC), Microsoft has introduced a function in Windows 11 version 22H2 that was previously reserved for Windows in S mode: namely, to rigorously block everything that the system does not explicitly classify as harmless. This is a significant difference from conventional virus protection, which only warns about or blocks what probably contains malicious code.

The new function uses artificial intelligence and certificate-based signatures. How it will prove itself in practice remains to be seen. SAC does not currently provide for manually set exceptions to be able to run programs that are classified as unsafe. Microsoft is also aware that Smart App Control is a radical step: SAC can only be used on PCs with freshly installed Windows 11. Those who updated their PC to version 22H2 in autumn must first reset it.

Initially, the feature runs in evaluation mode. This analyzes individual user behavior and then decides whether the protection makes sense in concrete use. Protection then switches on automatically after some time; it can also be enabled manually at any time. Once a protection mode has been activated, it can easily be deactivated again, but it cannot be reactivated without resetting Windows 11. Since installed software is lost when resetting, this step needs to be carefully considered.

Windows 10, on the other hand, does not support Smart App Control, but it offers a number of similar protection mechanisms. Microsoft summarizes these in the Settings app under Windows Security.

Some, such as virus, real-time, tamper, and cloud-based protection, as well as the firewall, are turned on by default. But the other protection functions that are not automatically activated are also useful. These include controlled folder access against ransomware (under Virus and Threat Protection), blocking potentially unwanted apps and downloads, and, only in the Pro version, Microsoft Defender Application Guard (both under App & Browser Control > Reputation-Based Protection). Depending on the CPU, you will also find Core Isolation under Device Security.

This juxtaposition of different security functions is partly responsible for the fact that Windows sometimes issues false warnings. For example, if the system warns you when installing an established program, download it again from the manufacturer or from a secure source. You can also check the file with an online multiscanner such as VirusTotal or Jotti's Malware Scan before setting it up. If it gives the green light, install the software via More information > Run anyway at your own risk. Many security suites also offer real-time and system protection.
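One low-effort complement to an online multiscanner is verifying the installer's checksum against the value the vendor publishes, where one is available. A minimal sketch using Python's standard hashlib follows; the file name and expected hash are placeholders.

```python
import hashlib

def sha256_of(path, chunk_size=1024 * 1024):
    """Compute the SHA-256 hash of a file without loading it all into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Both values are hypothetical; compare against the hash the vendor publishes.
expected = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
actual = sha256_of("installer.exe")
print("hashes match" if actual == expected else "hash mismatch - do not install")
```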

Screenshots can be created with the Snip & Sketch app (Snipping Tool), which can be called up most easily with the key combination Win+Shift+S. In the bar that appears, you can choose whether you want to cut out a window or an area. The image is placed on the clipboard, and a click on the notification opens it in the app, which offers some editing functions.

Whatever Microsoft plans for the Snip & Sketch app, the free tool Greenshot can probably already do it. After installation, it appears as an icon in the notification area next to the clock and registers the key combinations Print (area), Alt+Print (window), and Ctrl+Print (desktop) for itself. The key combinations can be changed in the settings. After Ctrl+Print, for example, a menu appears that can be used to save the screenshot, copy it to the clipboard, or open it in the Greenshot editor. The editor offers numerous editing functions. You can draw lines or rectangles in the image and add speech bubbles and text boxes.

The Bluestacks App Player offers access to Google's official Play Store as well as the possibility to install Android apps as APK files on Windows PCs.

IDG

Finally, Android apps can also be installed under Windows 10. This is one of the core functions of Windows 11, which Microsoft had to postpone at the launch of the operating system in autumn 2021 for performance reasons and which was only released a year later.

The Windows subsystem for Android and thus the apps themselves are not available with Windows 10. But the Android emulator Bluestacks App Player brings the Android apps into the predecessor system as well.

This is how it works: Run the Bluestacks installer and wait until all the necessary program files have been downloaded and the emulator has been installed. You can start the actual player with all its functions by clicking on the symbol at the bottom left of the Bluestacks interface. The function bar on the right offers so many possibilities that you need some time to get to know and set up the new system. This includes the fact that after logging into the integrated Google Play Store, almost all Android apps are available in principle. In addition, Bluestacks offers cross-system functions, such as file and media access via Windows Explorer.

Since the release of ChatGPT, everyone has the option of getting help from an AI. Microsoft, which has a stake in OpenAI, has meanwhile integrated the AI chat into the Edge web browser. It can be accessed via the Discover icon in the sidebar. The function is available to users of Windows 10 and 11.

Insider build 23493 of Windows 11 already shows what Microsoft plans to do with AI in the future. The key combination Win+C now no longer leads to Cortana, but to Windows Copilot. The window looks like Edge, and you can type a question into the input line to communicate with the AI. However, commands that are started on the PC are also possible. "Take a screenshot," for example, searches for an action and then opens the Snipping Tool. "Switch to dark mode" activates the dark mode for Windows and apps after a query.

So far, Windows Copilot is still a pre-release version; the features are limited. Microsoft will probably not offer Copilot for Windows 10. To our knowledge, there are no alternative programs. With the rapid development in the field of artificial intelligence, other providers will certainly also work on tools for PC control via AI.

This article was translated from German to English and originally appeared on pcwelt.de.

Read the original post:
Add Windows 11 features to Windows 10 with these helpful tools - PCWorld


Security Think Tank: To encrypt or not to encrypt, that is the question – ComputerWeekly.com

To encrypt or not to encrypt, that is the question.

Well, you should of course be encrypting data in motion, particularly where it's to be carried over a third-party network such as the internet or the networks operated by data warehouses where cloud service providers (CSPs) locate their equipment.

Given remote working and the move of IT into the cloud, where your services will be operated in a multi-tenanted environment, you should be seriously looking at encrypting data flowing between systems as well as between users and systems.

One example is to encrypt the data flow between an email app such as Outlook and the email server such as Microsoft Exchange. This is easily done via set-up menus and should be done even if the main communications channel itself is encrypted.

Encrypting between systems is recommended, though you will need to review individual applications to see what is available. Here, however, Microsoft servers can encrypt end to end using the Server Message Block (SMB) encryption feature.

Websites, both internally and externally facing, should be using HTTPS by default, and service and maintenance access to systems should also be encrypted (SSH, HTTPS, proprietary). Remember, though, that data must be in the clear (decrypted) before it can be processed; we are a few years away from applications being able to process encrypted data directly in an economic way.
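On the client side, encrypting data in motion mostly comes down to insisting on a modern TLS version and verifying certificates. The following is a minimal sketch using Python's standard ssl module against a hypothetical internal host, not a recommendation of any particular stack.

```python
import socket
import ssl

context = ssl.create_default_context()            # verifies certificates and hostnames by default
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions

host = "internal-app.example.com"  # hypothetical internal service
with socket.create_connection((host, 443), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        print("negotiated:", tls.version())       # e.g. 'TLSv1.3'
        print("peer cert subject:", tls.getpeercert().get("subject"))
```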

Meanwhile, data at rest should also be encrypted, and most database systems offer encryption selectable at the field or record level. Some file systems (both hardware and software based) can offer encryption, examples being Microsoft BitLocker on Windows 10 and later (and Server 2008 and later), although for Windows 11 there is a requirement for TPM 2.0 (Trusted Platform Module) support.

For Microsoft Server 2016, TPM is not a requirement but is recommended; however, if the Host Guardian Service is required, then TPM 2.0 is a definite requirement.

Other encrypting file systems include APFS on macOS 10.3 and later, Ext4 on Linux kernel 4.1 and later, and Novell Storage Services. See Wikipedia for more examples.
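Where a database engine cannot encrypt individual fields natively, the application can encrypt sensitive values before they are written. Here is a minimal sketch using the third-party cryptography package; key management, which matters far more than the call itself, is deliberately out of scope.

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# In practice the key comes from a key management service, never from source code.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a sensitive field before it is stored in the database row
national_id_plain = "123-45-6789"
national_id_stored = fernet.encrypt(national_id_plain.encode())

# Decrypt only at the point of use
print(fernet.decrypt(national_id_stored).decode())  # "123-45-6789"
```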

Encryption of data at rest is not a panacea or silver bullet as far as protecting data is concerned. It won't, for instance, protect you if a ransomware gang accesses your data store; it will quite happily encrypt your encrypted data! As such, a well-designed and engineered infrastructure with careful attention to security configurations throughout is a must. Put least privilege into action, and CISOs, if your managers or board complain, get them to sign a formal affidavit that they understand the risks of not limiting file access according to an absolute business need and that they are therefore fully liable should things go wrong.

We've talked about the need for encryption and where, but what standards should we be aspiring to? My recommendations are:

Remember that data at rest includes not just the data on file and database servers; it includes email systems and people's PCs, laptops, smartphones, and USB devices. And data in motion doesn't only travel between systems or between systems and user devices, but is also sent to printers and backup systems. Don't forget them, whatever you do. They could be the back door you forgot!

Read more from the original source:
Security Think Tank: To encrypt or not to encrypt, that is the question - ComputerWeekly.com


AWS And Anthropic: 5 Key AI Chip Supply Chain Plans, Intel Hires … – CRN

Cloud News Mark Haranas September 26, 2023, 12:05 PM EDT

AWS CEO Adam Selipsky said his $85 billion cloud company is ramping up production to provide a very robust AWS-controlled supply chain for AI chips, thanks to Amazon's $4 billion investment in Anthropic.

AWS CEO Adam Selipsky (pictured) has massive plans in store for Amazon's blockbuster investment of up to $4 billion in AI startup star Anthropic, from shaking up the chip industry to hiring new talent from the likes of Intel.

"It's absolutely true that there is a huge demand for all of the different chips with which people do generative AI workloads on," Selipsky said in an interview with Bloomberg. "So we absolutely have already been ramping up our Trainium and Inferentia supply chain and ramping up the supply that we can create as quickly as possible."

CRN breaks down the biggest plans and initiatives Amazon has for Anthropic, CEO Selipsky's boldest statements on his AI chip vision, and new AWS executive hires from Intel.

[Related: Adam Selipsky: AWS Is Building The Ultimate GenAI Tool Chest]

Before jumping into AWS and Amazon's plans for the AI startup, it's key to note the technologies and strategy at play.

The Seattle-based $85 billion cloud giant is looking to become a global AI chip maker via its custom silicon offerings, Trainium and Inferentia. Anthropic is part of AWS' plan to succeed, as the popular AI startup will use AWS Trainium and Inferentia chips to build, train, and deploy its future foundation models. The two companies will also collaborate in the development of future Trainium and Inferentia technology.

"There's still a whole lot of storage and compute and database workloads ramping up on AWS. So we have many sources of growth, I anticipate, but there's absolutely no doubt that generative AI looks like it's going to be an explosive additional source of growth in the years ahead," said Selipsky.

Here are five key things to know about AWS' custom silicon chip plans and the boldest remarks from Selipsky and Anthropic CEO Dario Amodei, a former vice president at OpenAI, that investors, channel partners, and customers need to know.

Mark Haranas is an assistant news editor and longtime journalist now covering cloud, multicloud, software, SaaS, and channel partners at CRN. He speaks with world-renowned CEOs and IT experts as well as covering breaking news and live events, while also managing several CRN reporters. He can be reached at mharanas@thechannelcompany.com.

See the original post:
AWS And Anthropic: 5 Key AI Chip Supply Chain Plans, Intel Hires ... - CRN


Data Mining Tools Market is Expected to Gain USD 2045.79 Million … – Benzinga

The significant Data Mining Tools Market report focuses on specific stocks, currencies, commodities, and geographic regions or countries. This report is a great source of information on the major happenings and industry insights, which is very valuable for thriving in this competitive age. Market research analysis and data lend a hand to businesses in planning strategies related to investment, revenue generation, production, product launches, costing, inventory, purchasing, and marketing. The market report deeply analyses the potential of the market with respect to the current scenario and future prospects by considering several industry aspects. Thorough and transparent research studies conducted by a team of experts in their own domains accomplish an international Data Mining Tools Market research report.

Data Bridge Market Research analyses that the data mining tools market is expected to reach USD 2045.79 million by 2030, up from USD 832.19 million in 2022, at a CAGR of 11.90% during the forecast period. In addition to market insights such as market value, growth rate, market segments, geographical coverage, market players, and market scenario, the market report curated by the Data Bridge Market Research team includes in-depth expert analysis, import/export analysis, pricing analysis, production consumption analysis, and PESTLE analysis.

Get a Sample PDF of Data Mining Tools Market Research Report: https://www.databridgemarketresearch.com/request-a-sample/?dbmr=global-data-mining-tools-market

The growth in data generation, data mining, and data storage across different sectors, such as banking, process manufacturing, and marketing, along with ongoing digital transformation, will enhance market growth during the forecast period. Data mining helps marketing companies create models based on past data to predict the results of new marketing operations such as direct mail and online campaigns, giving marketers a precise method to provide personalized products to targeted customers.

Top Leading Key Players of Data Mining Tools Market:

Key Opportunities:

Data mining tools bring many benefits to retail companies through data analysis. They support production and sales processes that highlight products consumers purchase repeatedly, and they help retail corporations provide discounts on products that will attract more consumers. Also, a rise in demand for artificial intelligence and machine learning technologies creates numerous opportunities for the growth of the market. Moreover, the surge in the need for embedded intelligence and the increasing requirement for generating insights from raw data to gain a competitive benefit are also anticipated to create major advantages in the market.

To Gain More Insights into the Market Analysis, Browse Summary of the Data Mining Tools Market Report@ https://www.databridgemarketresearch.com/reports/global-data-mining-tools-market

Global Data Mining Tools Market Segmentations:

Component

Service Managed Service

Business Function

Industry Vertical

Deployment Type

Organization Size

Data Mining Tools Market Country Level Analysis

The countries covered in the data mining tools market report are U.S., Canada and Mexico in North America, Germany, France, U.K., Netherlands, Switzerland, Belgium, Russia, Italy, Spain, Turkey, Rest of Europe in Europe, China, Japan, India, South Korea, Singapore, Malaysia, Australia, Thailand, Indonesia, Philippines, Rest of Asia-Pacific (APAC) in the Asia-Pacific (APAC), Saudi Arabia, U.A.E, Israel, Egypt, South Africa, Rest of Middle East and Africa (MEA) as a part of Middle East and Africa (MEA), Brazil, Argentina and Rest of South America as part of South America.

The country section of the report also provides individual market-impacting factors and changes in market regulation that impact the current and future trends of the market. Data points like downstream and upstream value chain analysis, technical trends, Porter's Five Forces analysis, and case studies are some of the pointers used to forecast the market scenario for individual countries. Also, the presence and availability of global brands and the challenges they face due to large or scarce competition from local and domestic brands, the impact of domestic tariffs, and trade routes are considered while providing forecast analysis of the country data.

New Business Strategies, Challenges & Policies are mentioned in Table of Content, Request TOC: https://www.databridgemarketresearch.com/toc/?dbmr=global-data-mining-tools-market

Browse More DBMR Reports:

https://www.databridgemarketresearch.com/reports/global-digital-based-radiography-market

https://www.databridgemarketresearch.com/reports/global-digital-experience-platform-market

https://www.databridgemarketresearch.com/reports/global-digital-twin-financial-services-and-insurance-market

https://www.databridgemarketresearch.com/reports/global-discrete-semiconductor-market

https://www.databridgemarketresearch.com/reports/global-disk-encryption-market

About Data Bridge Market Research, Private Ltd

Data Bridge Market Research Pvt Ltd is a multinational management consulting firm with offices in India and Canada, and an innovative and neoteric market analysis and advisory company with an unmatched level of durability and advanced approaches. We are committed to uncovering the best consumer prospects and fostering useful knowledge for your company to succeed in the market.

Data Bridge Market Research is a result of sheer wisdom and practice that was conceived and built in Pune in the year 2015. The company came into existence from the healthcare department with far fewer employees, intending to cover the whole market while providing best-in-class analysis. Later, the company widened its departments and expanded its reach by opening a new office in the Gurugram location in the year 2018, where a team of highly qualified personnel joined hands for the growth of the company. "Even in the tough times of COVID-19, where the virus slowed down everything around the world, the dedicated team of Data Bridge Market Research worked round the clock to provide quality and support to our client base, which also tells about the excellence in our sleeve."

Data Bridge Market Research has over 500 analysts working in different industries. We have catered to more than 40% of the Fortune 500 companies globally and have a network of more than 5,000 clients around the globe.

Contact Us

US: +1 888 387 2818 | UK: +44 208 089 1725 | Hong Kong: +852 8192 7475 | Email: corporatesales@databridgemarketresearch.com


See original here:

Data Mining Tools Market is Expected to Gain USD 2045.79 Million ... - Benzinga


What Is Diagnostic Analytics? (Definition, Examples) – Built In

Diagnostic analytics is a branch of analytics concerned with using data analysis techniques to understand the root causes behind certain data points. We use diagnostic analysis techniques to answer the "Why did this happen?" question when looking at historical data from a business, practice, or process.

Diagnostic analytics is a form of root cause analysis that explores outliers in our data set and helps us understand why something happened. Organizations use diagnostic analysis techniques for a wide variety of applications including process improvement and equipment maintenance. If our sales dropped 15 percent between February and March, we can use diagnostic analysis methods to help us understand the cause behind the steep decline.


There are multiple ways a company or analyst can conduct an effective diagnostic analytics workflow. Here's an overview of the main methods we associate with diagnostic analytics.

Data drilling consists of performing deeper dives into specific data sets to explore and discover trends that are not immediately visible when looking at aggregated data.

For example, a business looking to understand how many hours its employees spend on manual tasks may start by obtaining a global table of all its people. They might then drill down by region, line of business or type of role to get a more granular (or a drilled down) sense of how manual work is allocated across the employee base.

There are several techniques and modern software available to do this effectively, from simple spreadsheets to more advanced data processing and visualization tools.
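To make the drill-down idea concrete, here is a minimal sketch using pandas; the employee table, column names, and figures are hypothetical, and pandas is just one possible tool for this.

```python
import pandas as pd

# Hypothetical data: hours each employee spends on manual tasks per week.
df = pd.DataFrame({
    "region":       ["EMEA", "EMEA", "APAC", "APAC", "AMER", "AMER"],
    "role":         ["Analyst", "Manager", "Analyst", "Manager", "Analyst", "Manager"],
    "manual_hours": [12, 5, 15, 7, 9, 4],
})

# Aggregated view: one number for the whole company.
print("Company-wide average:", df["manual_hours"].mean())

# Drill down: the same metric split by region, then by region and role.
print(df.groupby("region")["manual_hours"].mean())
print(df.groupby(["region", "role"])["manual_hours"].mean())
```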

Mining data requires a deeper level of processing than data drilling, but its goal is the same: to understand key patterns and trends. We typically associate data mining with six common groups of tasks through which we can reveal patterns.

Anomaly detection involves tasks targeting the identification of outliers or extreme data points in a vast set of data.
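As a rough illustration, the sketch below flags outliers with a simple z-score rule; the order counts and the two-standard-deviation cutoff are assumptions, not part of the original article.

```python
import numpy as np

# Hypothetical daily order counts; one day is clearly unusual.
orders = np.array([102, 98, 110, 95, 105, 12, 101, 99])

# Flag points more than 2 standard deviations from the mean (a simple z-score rule).
z_scores = (orders - orders.mean()) / orders.std()
outliers = np.where(np.abs(z_scores) > 2)[0]
print("Outlier days:", outliers, "values:", orders[outliers])
```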

Dependency modeling targets the identification of specific associations between data points that may otherwise go undetected. For example, an electronics company may discover that customer reviews often mention Product A and Product B together and act on that information by placing those products together in a display.
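A minimal sketch of the idea, assuming hypothetical review data: counting how often pairs of products are mentioned together surfaces the kind of association described above.

```python
from collections import Counter
from itertools import combinations

# Hypothetical reviews, each listing the products it mentions.
reviews = [
    {"Product A", "Product B"},
    {"Product A", "Product B", "Product C"},
    {"Product B", "Product C"},
    {"Product A", "Product B"},
]

# Count how often each pair of products appears in the same review.
pair_counts = Counter()
for products in reviews:
    for pair in combinations(sorted(products), 2):
        pair_counts[pair] += 1

print(pair_counts.most_common(3))
```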

Clustering tasks segment data into similar clusters based on the degree of similarity across data points. Clustering could allow a beauty shop to identify similar groups of customers and advertise to them accordingly.
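For illustration only, here is a small clustering sketch using scikit-learn's KMeans; the customer features and the choice of two segments are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical customers described by visits per month and average basket size.
customers = np.array([
    [1, 15], [2, 20], [1, 18],    # occasional, small-basket shoppers
    [8, 60], [9, 75], [10, 70],   # frequent, high-spend shoppers
])

# Group customers into two segments based on similarity.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(customers)
print(kmeans.labels_)           # which segment each customer falls into
print(kmeans.cluster_centers_)  # the "typical" customer of each segment
```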

Classification tasks target the categorization of data points to recognize and classify future data points into specific groups. Classification allows cybersecurity software companies to analyze email data and separate phishing emails from harmless email content.
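A hedged sketch of the phishing example, assuming a handful of hand-labelled emails and a basic bag-of-words Naive Bayes model; this is one possible approach, not the method any particular vendor uses.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical labelled emails: 1 = phishing, 0 = harmless.
emails = [
    "verify your account password immediately",
    "urgent: confirm your bank details now",
    "team lunch moved to noon on Friday",
    "quarterly report attached for review",
]
labels = [1, 1, 0, 0]

# Bag-of-words features feeding a Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

print(model.predict(["please confirm your password details"]))  # likely flagged as phishing
```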

Regression tasks extract a function that models the relationship between data points, capturing how the different variables at play relate to one another.
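As a minimal example, fitting a straight line to hypothetical spend-versus-sales data captures the kind of relationship a regression task extracts.

```python
import numpy as np

# Hypothetical data: advertising spend (in $1,000s) vs. units sold.
spend = np.array([1, 2, 3, 4, 5])
units = np.array([12, 19, 31, 42, 48])

# Fit a straight line: units ~ slope * spend + intercept.
slope, intercept = np.polyfit(spend, units, 1)
print(f"units = {slope:.1f} * spend + {intercept:.1f}")
```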

Summarization tasks condense data for easier reporting and consumption while also avoiding the loss of more valuable, granular information we can use for clearer decision making.
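A small sketch of summarization with pandas, using a made-up transaction log; the detail table remains available for drill-downs, while the summary condenses it for reporting.

```python
import pandas as pd

# Hypothetical transaction log kept at full granularity.
sales = pd.DataFrame({
    "region": ["AMER", "AMER", "EMEA", "EMEA", "APAC"],
    "amount": [120, 80, 200, 150, 90],
})

# Condensed view: one row per region with a few summary statistics.
summary = sales.groupby("region")["amount"].agg(["count", "sum", "mean"])
print(summary)
```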


Correlation analysis is concerned with understanding and quantifying the strength of the relationship among different data variables in a given set of data points. Correlation is helpful in diagnostic analytics processes concerned with understanding to what degree different trends in the data are usually linked.

Correlation analysis is helpful as a preliminary step in causal analysis, which is a branch of statistics concerned with not only determining the relationship between variables but also the causal process between them.

For example, data may show that sales of pet food are strongly correlated with weather patterns, but that does not mean changes in the weather cause changes in pet food sales. We'd use causal analysis to answer that second question.
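To illustrate, the sketch below computes a correlation coefficient on made-up weekly data; even a strong coefficient would not, on its own, establish that weather drives pet food sales.

```python
import pandas as pd

# Hypothetical weekly data: average temperature and pet food sales.
df = pd.DataFrame({
    "avg_temp_c":     [2, 5, 9, 14, 18, 22, 25],
    "pet_food_sales": [510, 495, 470, 455, 430, 420, 410],
})

# Pearson correlation: close to -1 or +1 means a strong linear link,
# but says nothing by itself about which variable (if either) drives the other.
print(df["avg_temp_c"].corr(df["pet_food_sales"]))
```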


Understanding specific processes and leveraging diagnostic analytics techniques to identify root causes is a key use case for this methodology across industries. Lets say were wondering why a particular step in a workflow or manufacturing process is taking longer than average. If we use some of the techniques laid out above, we can map the process from start to finish and gather enough data to answer the question. Diagnostic analytics can help us correct course and improve overall process performance.

The marketing funnel is the sequence of marketing activities that funnel customers, or potential customers, all the way from initial awareness down to product conversion. Understanding the marketing funnel and its data is of critical importance to help companies effectively allocate advertising budgets.

Diagnostic analytics around marketing initiatives are especially important at the early stages of a companys growth. These workflows support frequent iteration and feedback to direct the organizations next best action.


Most heavy industrial machinery generates data that informs its functioning and maintenance lifecycle. In this context, diagnostic analytics can help raise alerts regarding the health status of capital-expensive equipment before it's too late, thus avoiding costly replacement orders or halted production lines.
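One possible sketch of such an alert, assuming hypothetical hourly vibration readings and an arbitrary safety threshold; real maintenance systems would use richer sensor data and vendor-specific limits.

```python
import pandas as pd

# Hypothetical hourly vibration readings from a machine sensor.
vibration = pd.Series([0.31, 0.30, 0.33, 0.32, 0.45, 0.52, 0.61, 0.70])

# Smooth the signal, then flag readings drifting above an assumed safe limit
# so maintenance can investigate the root cause before a failure occurs.
rolling_avg = vibration.rolling(window=3).mean()
ALERT_THRESHOLD = 0.5  # assumed limit for this hypothetical sensor
alerts = rolling_avg[rolling_avg > ALERT_THRESHOLD]
print(alerts)
```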

We can use diagnostic analytics to study a company's internal communication flows and understand whether certain departments are collaborating enough, which communication channels are used most (email, internal chat, video calls), and which employee roles contribute the bulk of the communication flow. We can perform these analyses on anonymized, aggregated data so individuals are not identifiable, while the company still derives insights it can put to use to improve internal communication practices.

Descriptive analytics workflows are concerned with providing a historical view or summary of the data. Examples include sales reports and quarterly financial results released periodically by publicly traded companies.

Prescriptive analytics workflows are concerned with providing recommendations and suggesting the next best action to take in a given context. For example, Netflix movie recommendations delivered to the user are derived from prescriptive analytics techniques.

Predictive analytics is concerned with providing insights and forecasts into the future so the organization or data consumer can prepare for the most probable scenario. Time series forecasting and weather predictions are based on predictive analytics techniques.

With the above in mind, it's easier to appreciate how diagnostic analytics techniques fit into the bigger picture of how we use data to achieve a variety of goals. Where other branches of analytics target "what" questions, diagnostic analytics addresses "why" questions.

Continued here:

What Is Diagnostic Analytics? (Definition, Examples) - Built In

Read More..

How To Enable an Effective Business Intelligence Strategy – Software Advice


In the modern digital age, businesses have access to more data than ever before. However, that data can be overwhelming if not managed properly. To prevent this, every business needs an effective business intelligence (BI) strategy designed to capture, process, and visualize all of the data that's available to them.

In the past, a majority of this analytical data capture and processing was solely the domain of IT departments. However, new tools and platforms are allowing this work to be done throughout an organization.

A Gartner report showed that 67% of the CEOs they polled want this type of technology work to happen within business functions and not just within IT. [1] That means a modern BI strategy must be accessible across departments and by various management and team members at all levels of operations.

Below, we'll outline the steps needed to create and implement a modern BI strategy that works for businesses of any size.

An effective business intelligence strategy is a set of methods and protocols for capturing critical data and processing it to reveal key trends and opportunities. This is done through data mining and data visualization, which give executives and managers access to the data and let them create their own queries.

One misconception is that BI is a tool mostly for large corporations due to its perceived complexity. In reality, small businesses often have the most to gain from a BI strategy.

Small businesses are more sensitive to, and more at risk of, damage caused by poor decisions, inefficiency, or quickly changing market conditions. Larger corporations can often weather sustained losses from these events, but a small business has fewer resources to do so. Being nimble and mitigating risk is therefore something a small business needs to prioritize, and a sound BI strategy allows it to do that.

Before implementing a business intelligence strategy, it's important to consider the key areas where you need to be successful.

Not all organizations will require the same strategy. Some may be more focused on internal analytics while others may be more interested in competitor analysis to help them find areas where they can match or exceed others in the marketplace.

Setting the scope and goals for your strategy is critical to prevent data collection and analysis from becoming overwhelming and causing more harm than good.

A clearly defined scope and goal lets you target data collection, which helps provide much more accurate forecasting and actionable information.

It's also important for determining the ROI of your BI strategy and spending. Without knowing which areas of your business you want to improve through BI, it's impossible to quantify the benefits for the purpose of ROI analysis.

You'll first need to choose an executive sponsor. This will be the person tasked with overseeing the strategy and ensuring that its various components stay on track during implementation. This executive can also be the Chief Data Officer, or they can appoint one if different departments will be involved in reporting.

You'll also want to bring in managers and other department members who will be accessing the data. Implementation itself will likely be the responsibility of your IT department, with roles assigned to key points of contact for tasks such as platform selection, security, and deployment.

During this process, you want to find the tools and platform that best fit your strategy and goals. Start by comparing BI tools so you can understand any overlapping features, such as data visualization and other common modules, and any budget constraints.

Finally, you'll want to determine whether to implement a traditional BI strategy or a self-service strategy. This will mostly depend on your internal business structure, such as whether you have an IT department capable of conducting a complex implementation, as well as your budget and goals for the BI strategy. Keep in mind that simpler goals and a smaller scope may only require a self-service approach.

By now, you should have everything you need to start mapping out the implementation of both your strategy and your BI tools of choice. This involves tasks such as mapping out your data structure and preparing it for your strategy.

Each step in your roadmap should have a reasonable date or timeframe for completion and should be accepted by the other stakeholders. Make sure every step has a clearly defined milestone attached to it that signifies its completion and overall progress.

Launching your strategy will involve two key phases to ensure all data is being processed correctly and is accurate.

User acceptance testing (UAT): This first phase tests any data processing or transformations to make sure they are accurate and that the proper reports are being produced.

Training end users: This can be the individual managers or executives who will be interacting with the system. The software vendor you chose for your BI tools should offer training to assist with this step. You can also create your own training materials based on your specific environment.

Review your process to ensure your strategy is achieving the goals set out earlier. This can be a measure of ROI on your total BI spending compared against those goals. For example, if the goal was to reduce waste during shipping by 5%, that saving can be calculated and compared against your BI spending, as in the sketch below.
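A minimal sketch of that calculation, with entirely made-up figures; the variable names and amounts are illustrative only.

```python
# Hypothetical inputs for the shipping-waste example above.
annual_shipping_waste = 400_000   # assumed cost of waste before BI, per year
waste_reduction_rate = 0.05       # the 5% reduction goal
annual_bi_spend = 15_000          # assumed annual BI tooling and staff cost

# Savings delivered by hitting the goal, and the resulting ROI on BI spend.
savings = annual_shipping_waste * waste_reduction_rate
roi = (savings - annual_bi_spend) / annual_bi_spend
print(f"Savings: ${savings:,.0f}  ROI: {roi:.0%}")  # Savings: $20,000  ROI: 33%
```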

This final step is vital, especially for small businesses. Additional software expenditures are often met with skepticism, since they can add complexity and cost to your business without any clear indication that results have been achieved.

By always measuring your spending and comparing it to the goals that you originally identified, it becomes easy to determine the ROI. This is also why setting your initial goals for your BI strategy is important, so make sure to be as specific as possible.

For most organizations, much of this implementation will be handled by their IT department. It's important that the IT staff understand who they report to, which is usually a member of the BI team.

Since the implementation will require new cooperation across the business, it should be clear how each department and the BI team fit into the overall organizational structure.

Security also needs to be a priority during this process. BI involves capturing and processing large amounts of data, some of which may be sensitive. Security rules and permissions need to be carefully configured so that data flows securely and is available only to the individuals and applications that need it.

Implementing a BI strategy may seem difficult, but as BI becomes a core part of business operations, it's vital that companies adopt a strategy that allows them to make informed decisions leading to positive results with reduced risk.

By segmenting the BI implementation process and carefully outlining the goals and scope of your strategy, the entire process becomes easier.

During your BI strategy research, make sure to leverage online resources such as those offered by Software Advice to learn more about BI platforms. These resources help you compare BI software tools as well as look into reviews by other businesses that have had experience with them.

Go here to see the original:

How To Enable an Effective Business Intelligence Strategy - Software Advice

Read More..