
Unlocking the promise of a connected world through edge cloud … – ITProPortal

The Internet of Things has started to impact every aspect of our daily lives. Our appliances, cars, gadgets, communication devices, tools, and even some of our clothing have become nodes on the internet. By 2020, as many as 50 billion devices will be connected, and the skyrocketing growth of traffic generated by devices at the edge of the network will pose a monumental challenge to our networks and to central cloud computing. Thankfully, we can take advantage of the ever-increasing computing capabilities of edge devices to turn them into cloud servers and extend central-cloud capabilities to the edge. Edge cloud is to central cloud what WiFi is to cellular communication: just as WiFi carries most of the global wireless traffic today, edge devices will soon shoulder most of the cloud computing burden.

According to Gartner's 2017 Hype Cycle for Emerging Technologies, edge cloud computing is on the brink of becoming an innovation trigger. Microchips and sensors continue to be embedded in everyday objects, making edge cloud computing an immense opportunity. There are billions of potential edge servers today, and there will be tens of billions more tomorrow.

An illustrative example: self-driving cars

Self-driving cars have many potential benefits: optimized traffic flow, improved fuel efficiency, reduced accidents, and drivers with more free time. However, a big challenge for self-driving cars is to find a cost-effective way to process the vast amounts of data that they generate. On average, every self-driving car generates approximately one GByte/sec of data, orders of magnitude more than the capacity of a 4G base station and an order of magnitude larger than that of a 5G base station. No network in the foreseeable future can carry all this data back to the central cloud for processing. Most of it needs to be processed locally, and only a minimal set of processed data should be sent back to the central cloud for global coordination. Moreover, to meet the latency requirements of agile decision-making, self-driving cars should communicate in the fastest way possible. This demands instantaneous processing of information and, where possible, peer-to-peer communication. This is where distributed edge cloud computing comes into play, transforming cars into data centres on wheels where most of the communication and processing is performed as close as possible to the edge.
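
The bandwidth mismatch described above can be checked with some back-of-the-envelope arithmetic; the link throughputs below are rough illustrative assumptions, not measured figures.

```python
# Rough comparison of a self-driving car's data rate against cellular
# capacity. All link figures are illustrative assumptions.

CAR_GBIT_PER_SEC = 1 * 8        # ~1 GByte/sec per car, expressed in Gbit/s
LTE_GBIT_PER_SEC = 0.1          # assumed usable throughput of a 4G cell
FIVE_G_GBIT_PER_SEC = 1.0       # assumed usable throughput of a 5G cell

def overload_factor(source_gbps, link_gbps):
    """How many times over the link's capacity the source would run."""
    return source_gbps / link_gbps

print(overload_factor(CAR_GBIT_PER_SEC, LTE_GBIT_PER_SEC))     # 80.0
print(overload_factor(CAR_GBIT_PER_SEC, FIVE_G_GBIT_PER_SEC))  # 8.0
```

Under these assumed figures, a single car would saturate a 4G cell many times over, which is why most of the data has to stay local.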

In a distributed edge cloud environment, every car can be a set of microservices that can sense other cars, not only through its sensors but also because its microservices can communicate with the microservices of other cars. To make this work, microservices (within a car and between cars) need to form ad-hoc clusters based on scopes such as proximity, network, or account affinity. This way, cars can process terabytes of data as quickly and efficiently as possible, leveraging not only the processing power of the central cloud but also their own collective computing, storage, and memory resources in collaboration with other cars on the road.
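
As a hedged sketch of what proximity-scoped clustering might look like, the snippet below groups cars into ad-hoc clusters whenever one is within range of any existing member; the car IDs, coordinates, and 100-metre radius are all hypothetical.

```python
import math

def cluster_by_proximity(positions, radius):
    """Greedy single-link clustering: a car joins (and may merge)
    any clusters that have a member within `radius` of it."""
    clusters = []
    for car, pos in positions.items():
        near = [c for c in clusters
                if any(math.dist(pos, positions[o]) <= radius for o in c)]
        merged = {car}
        for c in near:              # absorb every nearby cluster
            merged |= c
            clusters.remove(c)
        clusters.append(merged)
    return clusters

# Hypothetical cars on a straight road, positions in metres.
cars = {"a": (0, 0), "b": (60, 0), "c": (500, 0)}
print(cluster_by_proximity(cars, 100.0))  # two clusters: a+b together, c alone
```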

Edge cloud computing is the next major computing revolution

Throughout the last few decades, computing has cycled between centralized and distributed models. In the early days, computing was centralized on mainframes. In the mid-80s, with the advent of personal computers, the industry shifted to distributed computing. In the last decade, we witnessed the move to centralized cloud computing. Many falsely predicted that this was the holy grail of computing and that we would move to an era of thin clients, where devices would be dumb screens and input devices and all processing would be performed in data centres in the cloud. This made sense for some applications, such as music or video streaming or hosting some software applications. In all these cases, edge devices do not generate much data and are mostly passive receivers of information.

In the last few years, two major socio-technical trends have contributed to a fundamental change in production and consumption of data.

First, thanks to the mobile internet and social media, ordinary people generate massive amounts of data, turning them from mere consumers into both consumers and producers. For example, today close to 500 million photos are uploaded to Facebook and Instagram and roughly 500 thousand hours of video are uploaded to YouTube daily; this is more content than the three major US networks generate in two years! Nor is this only a consumer phenomenon; it also applies to enterprises: for instance, more than 80% of businesses have started to leverage user-generated content in their marketing efforts.

Second, we have the rapid growth of IoT, where many new edge devices produce valuable data. There are already 20 billion connected devices, 2.5 billion of which are B2B IoT devices. Over 300 million wearable devices were sold in 2017 alone. Many of these devices generate small amounts of data, but many generate massive amounts, for example where video meets IoT: machine-learning algorithms applied to video feeds allow cameras to recognize people, objects, and situations automatically. There will be phenomenal growth of AR/VR in the gaming industry and even in the enterprise, starting with creative applications and quickly moving to brick-and-mortar industries and manufacturing. Robots, too, will produce massive amounts of data at the edge.

Clearly, we are amid an explosion of data generated at the edge, and the tsunami is yet to come. The question is: can our communication networks scale to cope with the data generated at the edge? To try to answer this, we can look at two predictors: Moore's law in computing and its equivalent in network bandwidth. History has shown that computing power roughly doubles every 18 months (about a hundredfold every decade), whereas network bandwidth grows only about 50 times every decade. In other words, even if the number of devices does not grow (and it clearly will), the communication network will be the bottleneck for the growth of IoT.
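
Those two rules of thumb can be turned into a quick calculation; the figures below simply restate the growth rates quoted above, nothing more.

```python
# Moore's-law-style compute growth vs. network bandwidth growth,
# using the rule-of-thumb rates quoted in the text.

def growth_over_decades(factor_per_decade, decades):
    """Total multiplier after `decades` of compound growth."""
    return factor_per_decade ** decades

# Doubling every 18 months really is roughly 100x per decade:
print(2 ** (120 / 18))                     # ~101.6

compute = growth_over_decades(100, 2)      # 10,000x over twenty years
bandwidth = growth_over_decades(50, 2)     # 2,500x over twenty years
print(compute / bandwidth)                 # the gap widens 4x every two decades
```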

Setting bandwidth aside, many IoT applications such as self-driving cars or tactile control communications in various industries require low latency response. In this case, even if the network capacity is miraculously increased to cope with the data, laws of physics inhibit remote processing of data in the central cloud due to large latencies in the long-haul transmission of data.

So, what is the solution? How can we cope with the explosion of data at the edge and strict latency requirements of some IoT applications? The answer is distributed edge cloud computing. Edge cloud computing means that any device (or node) becomes a cloud server. As much as possible, the data is processed at the edge of a network, as close to the originating source as possible, instead of processing everything in the central cloud. This approach is faster, more efficient, and scalable: data can be immediately analysed and put into action overcoming bandwidth limitations and latency constraints on the network. Edge cloud computing is essential to meet stringent requirements on bandwidth and latency and at the same time minimizes power consumption and infrastructure costs.
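
The "process at the edge when bandwidth or latency rules out the cloud" idea can be sketched as a toy placement rule; the threshold values, names, and the fixed 80 ms cloud round trip are illustrative assumptions, not a real scheduler.

```python
# Toy edge-vs-central placement rule for the trade-off described above.

def place_task(latency_budget_ms, payload_mb, uplink_mbps, cloud_rtt_ms=80):
    """Use the central cloud only if uploading the payload AND the
    round trip both fit inside the latency budget; otherwise stay local."""
    transfer_ms = payload_mb * 8 / uplink_mbps * 1000  # upload time in ms
    if cloud_rtt_ms + transfer_ms <= latency_budget_ms:
        return "central-cloud"
    return "edge"

print(place_task(latency_budget_ms=50, payload_mb=100, uplink_mbps=100))   # edge
print(place_task(latency_budget_ms=5000, payload_mb=1, uplink_mbps=100))   # central-cloud
```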

Edge cloud computing is a paradigm shift that enables every device, appliance, or gadget to communicate and share resources making them part of the solution for scaling of IoT. It allows drones and robots to harness their collective resources in industries such as manufacturing, oil and gas, agriculture or mining, delivering real-time data and improving business efficiency. This new computing model will revolutionize the world in ways that we may not be able to predict at this moment.

The great news is that the technology is ready for developers today. mimik has developed a fully distributed edge cloud platform that extends central cloud to the edge: mimik arms developers with a platform to unleash the power of edge devices. mimik SDK solves many of the current challenges that centralized cloud computing alone cannot address. We extend the power of the central cloud to the edge and boost its reach so that bandwidth, latency constraints, and infrastructure cost do not become the bottleneck for the healthy and scalable growth of IoT.

We need a paradigm shift that transforms tens of billions of devices from a challenge to an opportunity. IoT requires a revolution in computing that unlocks the power of connected devices. Distributed edge cloud is the ideal solution to harness computing resources at the edge, unlocking the promise of a smart connected world that will bring massive efficiencies to enterprises and digital freedom to consumers.

Siavash Alamouti, CEO at mimik


So you’re already in the cloud but need to come back down to Earth – The Register

We generally think of a transformation to a hybrid infrastructure as one where you’re going from a completely private setup to one that spans the public cloud and your private installation. But what if you started life as a small company with your systems entirely in the cloud? It’s not an unusual approach, as running up your initial services in the cloud is straightforward and avoids a big capital outlay. As a company grows it’s understandable that it might want to take on a private data centre, build an in-house support team and evolve to a two-site setup.

Step one is to consider why you're bothering with an on-premises setup instead of a second cloud instance. The answer will generally be that you want something that's closer to your office, with a potential performance improvement gained from such proximity. And that's fine; what matters is that you've considered the options before deciding which way to go.

The next step is to think about where you'll host your private data centre. As you're already in the cloud, you have the opportunity to pick a data centre that's close (electronically speaking) to the cloud centre you're in. For example, you're probably aware that AWS provides a Direct Connect facility that lets you hook straight into their infrastructure rather than accessing your cloud world over the internet. Check out the locations and you'll see that the connectivity's hosted at 51 well-known locations: Equinix in London, for example, or TierPoint in Seattle. Connectivity between your public and private components with a latency of just a few milliseconds is an attractive concept if you're looking for high availability with seamless failover.

Next, you’ll need to think about the platform you’re using. Most of the time you’ll have used one or more of your cloud provider’s standard operating system templates, so it makes sense to run your local stuff on the same operating system flavour if you can. And of course you should use the same CPU architecture where you can too, so you can be assured that your apps will be portable.

So you’ve sorted the platform. Now you need to decide whether the on-premises setup is to be your primary or secondary installation. If it’s to be a secondary setup you should have a relatively straightforward job of adding new system and application-level components in as secondaries to your cloud-based apps.

If you decide to flip things around, you'll have the more involved task of shifting the primary apps over and redeploying the cloud setup as the secondary installation. Either way, the happy news is that you've already gone through the non-trivial task of providing your office users with connectivity to the cloud installation, so hooking things up so they can get to the private data centre, regardless of whether it's the primary or the secondary, should be easier.

One further consideration with the choice of primary and secondary installations is the cost of data transfer. Shifting data out of a commercial cloud setup has a cost associated with it. Not a vast cost, I’ll grant you, but one that you do need to keep an eye on. Using Amazon as an example, moving a terabyte per month over the internet from the cloud setup to your private installation will cost you $90. That’s $900 for 10TB, or $7,800 for 100TB; even though the per-gigabyte cost tapers down, it doesn’t ever tail off at zero. What does this mean? Easy: if the cloud setup is the primary and it’s replicating application data to the private secondary, you’re paying a chunk of cash for it to do so.
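
The three figures above line up with the tiered per-gigabyte egress rates AWS published at the time; the sketch below reproduces them (rates will have changed since, so treat this as a snapshot, not current pricing).

```python
# Tiered data-transfer-out pricing: (tier size in TB, $ per GB).
# These are the historical AWS internet-egress rates implied by the
# figures in the text; current pricing differs.
TIERS = [(10, 0.09), (40, 0.085), (100, 0.07)]

def egress_cost_usd(tb):
    """Monthly cost of moving `tb` terabytes out to the internet."""
    cost, remaining = 0.0, tb
    for size_tb, per_gb in TIERS:
        used = min(remaining, size_tb)
        cost += used * 1000 * per_gb    # billed at 1 TB = 1,000 GB here
        remaining -= used
        if remaining <= 0:
            break
    return cost

print(round(egress_cost_usd(1), 2))     # 90.0   -> the $90/month in the text
print(round(egress_cost_usd(10), 2))    # 900.0
print(round(egress_cost_usd(100), 2))   # 7800.0 -> the taper shows in the last 90 TB
```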

While we’re on the subject of data transfer, you also need to figure out how you’re going to do it. In these modern times, it’s a relative doddle to set up the major cloud providers’ storage instances so you can access them externally via standard protocols such as NFS. Alternatively you can look to the major storage vendors, who will sell you a funky gateway to install in your private data centre and handle the cloud magic for you.

The next consideration is licensing, and there are two aspects here. First is the basic fact that you'll need to buy operating system and/or application licences for your private setup. That sounds obvious, but you may never have had to consider it if you were using a pay-as-you-go model with pre-configured cloud app servers. Second is that if you want to go for a clustered or active/passive application setup, you may need to revisit the versions you use on the cloud servers as well as buying licences for your private setup. Take SQL Server, for example: if you're running Standard Edition you can implement basic two-node high availability, but if you want something more advanced you'll need to upgrade to Enterprise Edition. Same with Oracle: if you want to enable Data Guard between sites, that'll need Enterprise Edition too.

Lastly, but by no means least, is your internal support team. They’ve probably spent a number of years fettling your cloud installation and fixing stuff when it broke, but their skillset will be at worst lacking and at best out of date when it comes to hosting, networking, hardware and hypervisor support.

Be prepared to invest in training so that you can be confident that the new kit you're acquiring for your private data centre is properly supportable and hence properly supported. Yes, your typical infrastructure is easier to put together than it was a few years ago, but that doesn't mean it's trivial. And if you're virtualising your private data centre (which you should), getting the hypervisor layer running and optimised will take time, effort and skill.

Going from a cloud-centric setup to a hybrid infrastructure isn't rocket science, then, which is no great surprise, as any problem's tractable if you design, plan and implement the solution properly. But going from cloud to hybrid has some differences from going from private to hybrid.

So you just need to think a bit before you do it.


Chinese smartphone maker Xiaomi open to moving servers to India – Economic Times

NEW DELHI: Xiaomi said it was open to moving its servers to India subject to its cloud service provider partner setting up base in the country, amid increased government efforts to protect user data on mobile phones.

"All our servers are sitting on AWS (Amazon Web Services) in Singapore and the US. If AWS moves to India, we would be happy to work with them," Manu Kumar Jain, managing director of India operations, told ET, as Xiaomi became the first overseas company to openly offer to move its servers to India.

Handset companies typically don't store data on their own servers but instead lease space from third-party cloud service providers such as AWS, Microsoft and Google. While AWS and Microsoft have already set up their centres in India, Google has also announced plans to do the same in the country to cater to a larger number of customers, especially those in the government or financial services industry, since regulations in those sectors don't permit data to be transmitted outside the country.

"We last evaluated this about 2-3 years ago when we were moving our servers (from China). At that time there was no significant presence (of AWS) and it was much more difficult to have it here," Jain said, when asked whether the company would move or add servers in India, which appears to be the larger aim of a government intent on securing data. Jain did not say whether the company was already in talks with Amazon to move its servers to India.

He added, though, that from an internet speed perspective, the connectivity between India and Singapore was one of the best. "We moved and thought it was pretty good. But if someone, AWS or equivalent, were to set up servers here (in India), we would be happy to work with them," he added.

The company, which sells about 4-5 million smartphones a quarter, said its devices were super-secure: no data is taken without user consent, and the data that is taken is encrypted to such a degree that it cannot be decrypted even if it is stolen.

Xiaomi's views come at a time when the government is taking up the security of phones with all companies, including Apple and Samsung, and scrutinising the protection levels that all handset makers, a large majority of which are Chinese, provide in India.

Another Chinese brand, OnePlus, has also said that it is prepared to respond to India's data security and privacy concerns, since it sells the same smartphones in the US and Europe, where these concerns are already addressed.

"Currently, we have not received a direct request or requirement to set up servers or cloud storage in India. We are trying to get more clarity on that," OnePlus CEO Peter Lau told ET.

Amid the recent India-China standoff at Dokalam, which has since been resolved, the IT and electronics ministry has asked over 30 smartphone companies for the protocols they use to ensure the security of mobile phones in the country.

While the government is evaluating responses on security preparedness, it may well ask all those selling devices in India to have locally based servers.

Officials in the ministry said while the issue of apps sweeping up excessive user data was worrying, the broader issue remained that of the security of information that could be going to third parties outside the country, especially to China.


Nasa: Our demands for repeat presidential election – Daily Nation

By PATRICK LANG'AT

The Raila Odinga-led National Super Alliance (Nasa) has written to the electoral agency with 25 new demands that they say should be fulfilled in the October 17 fresh poll.

In a letter signed by Mr Odinga's chief agent and Nasa co-principal Musalia Mudavadi, the opposition has demanded a full audit of the elections technology, full access to the servers, a change of ballot printing firm, and the gazettement of 290 new constituency returning officers.

"It is therefore inappropriate and foolhardy for the IEBC to embark on the planning of the fresh election without full compliance with the Orders of the Supreme Court in the redeployment of technology in the fresh presidential election. We therefore demand a full audit of technology in use in full compliance with the law," Mr Mudavadi said in the four-page letter.

The Nasa team has not only opposed the Independent Electoral and Boundaries Commission's (IEBC) date for the repeat poll, it has also questioned why the commission narrowed the number of candidates to only Mr Odinga and President Uhuru Kenyatta.

"Your interpretation that the election scheduled on the 17th October 2017 shall be in the style of a run-off contest of only two candidates is erroneous and unconstitutional. We also take reservation that you have not consulted the parties involved before making a determination on the date of the said election," Mr Mudavadi told IEBC Chairman Wafula Chebukati.

In the audit, Nasa has demanded scrutiny that it says should cover full information on the ICT infrastructure; a list of support partners and their respective Service Level Agreements; details of the firewall configuration, including port configuration; and disclosure of all database transaction logs.

The team also wants a physical view and inspection of the IEBC servers, portal access to the cloud servers and IP addresses of all 20 servers; full access and copy of all servers and databases used by the IEBC; GPS coordinates of KIEMS; and Telkom and network structure with all service providers.

The opposition has also demanded the removal of senior personnel at the IEBC secretariat, including Chief Executive Ezra Chiloba; his deputy, Betty Nyabuto; James Muhati, the ICT director; Ms Immaculate Kassait, director of voter registration; and Ms Praxedes Tororey, the head of the legal team; with Mr Moses Kipkosgey added to the earlier list.

"Independence of the IEBC is not negotiable. The Nasa coalition demands that the fresh election be administered by professional and non-partisan officials," Mr Mudavadi said.

“We demand that officials who are partisan or perceived as such should step aside and or be suspended during the planning and execution of the fresh election.”

Further, the coalition demanded a full audit of the Sh3.8 billion worth of 45,000 voter identification and results transmission kits provided by France-based Safran Morpho.

"We demand that Safran and Oracle provide full implementation information in relation to their involvement in the General Election held on 8th August 2017," said Mr Mudavadi.

The team has also demanded a review of the voter register, and the 40,883 polling stations.

To enhance transparency, Nasa said the following specific demands must be met:

Appoint and gazette returning officers not among the Constituency Election Coordinators in consultations with political parties and candidates.

Establishment of a technical monitoring committee with representatives of the main political parties, coalitions or candidates to oversee implementation of the technology in use.

Stop use of Al-Ghurair to print ballot papers and results declaration forms

All Forms 34Bs should be pre-printed indicating the names of polling stations in the constituency and names of candidates

Elections results to be announced at the Constituency level. Results sent electronically must be accompanied by corresponding statutory result declaration forms

Candidates agents should be part of receiving teams at the constituency and national tallying centers, and be allowed to confirm entries before transmission

Establish defined roles of the security agencies and eliminate undue influence by the provincial administration and other public officials

Jubilee Party chose the October 17 date, claim Nasa leaders.


Sequoia, IDG to Invest in China Bitcoin Mining Giant – Bloomberg

Sequoia Capital and IDG Capital are investing in Beijing-based Bitmain Technologies Ltd., the world's largest bitcoin mining organization, according to people familiar with the matter.

Bitmain is raising $50 million from several venture firms to boost its profile among mainstream investors, said one of the people, who asked not to be named because the matter is private. Sequoia and the other firms also plan to provide the company with more guidance on management, the people said.

Bitmain, which produces chips and machines for mining bitcoin and operates its own mining facilities, has benefited from the rise in the currency's market value, now about $75 billion. The startup told Bloomberg TV in August that its own valuation is in the billions and that it is weighing a possible initial public offering. Bitmain has said that it is planning to produce chips for artificial intelligence and invest in mining facilities in the U.S.

Bitmain, Sequoia and IDG didn't respond to email queries about the investment.

Inside Bitmain's bitcoin mining facility in Ordos, Inner Mongolia.


The company, led by founders Wu Jihan and Micree Zhan, has been at the center of disputes over how to expand use of the cryptocurrency. Operating the largest mining collective (a network of computers that verify transactions made on the bitcoin distributed ledger), Wu has championed the idea of increasing the block size of the network, previously capped at 1 megabyte, to enable faster transactions. Opponents have criticized the proposals for giving miners too much power and have come up with alternatives.

A split occurred within the community in August, causing bitcoin to become two currencies: the original bitcoin and an offshoot called bitcoin cash.

As Bitcoin Risks Big Split, Along Comes Minor One: QuickTake Q&A

With assistance by Yuji Nakamura


Bitcoin Price Drops By Over $250 as Crypto Markets Lose Billions – CoinDesk

Just two days after achieving a historic high of over $5,000 on September 2, bitcoin’s price has plummeted to below $4,400.

The notable sell-off, the biggest in the crypto markets since July 15, began immediately after the record high of $5,013.91 had been reached on Saturday, and has continued today, according to data from CoinDesk's Bitcoin Price Index.

Starting the session at $4,631, the digital asset traded sideways for a time (with a high of $4,636) until around 07:00 UTC, when a sharp drop took bitcoin to a session low of $4,345.

At press time, the price had recovered a tad to $4,367, a drop of 5.7 percent ($263) for the day so far.
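
As a quick sanity check on those figures (the raw subtraction differs from the quoted $263 by a dollar, presumably from intraday quote rounding):

```python
# Sanity-checking the day's move from the session figures quoted above.
session_open = 4631.0   # session start
press_time = 4367.0     # price at press time

drop_usd = session_open - press_time
drop_pct = drop_usd / session_open * 100
print(f"${drop_usd:.0f} ({drop_pct:.1f}%)")  # $264 (5.7%)
```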

The downwards movement reflects a general drop in the cryptocurrency markets.

A glance at CoinMarketCap data reveals that most digital assets are down today, with only a couple of cryptocurrencies showing in the green.

Amid losses across all of the top 10 cryptocurrencies, ethereum is notably down 14.53 percent, litecoin is down 15.37 percent, and monero has dropped 12 percent.

Looking at the markets as a whole: since reaching a record high of around $180 billion, the combined market cap of all cryptocurrencies has fallen by $28 billion to $152 billion.


Scientists Just Found A Use For The Hashtag In Quantum Computing – Gizmodo Australia

Twitter’s hashtag just turned 10, and wouldn’t you know it, scientists just worked out a far better use for it – at the nanoscale.

See, it turns out that a criss-cross pattern of semiconducting nanowires is the perfect structure to help manipulate a particular type of quasiparticle into quantum bits.

This nano-hashtag structure, at scales of a billionth of a metre, should help the quasiparticles, known as Majorana fermions, be more easily formed into qubits: the building blocks of quantum computers.

Majorana fermions have been shown to be far more robust than existing qubit technology, and the scientists developing this technology reckon it will lead to a new generation of quantum architecture, culminating in a scalable, fault-tolerant universal quantum computer.

This kind of architecture is what Microsoft's Station Q is looking for; Microsoft recently announced a multi-year partnership with the University of Sydney. One of the researchers involved with this new research is Dr Maja Cassidy, a senior researcher at Station Q Sydney, which is based at the University of Sydney Nanoscience Hub.

“Networks of nanowires are crucial to demonstrate how Majorana fermions interact through braiding,” said Dr Cassidy. “These will be a fundamental building block for topological quantum computation.”

Dr Cassidy worked on the research team while she was at TU Delft in the Netherlands.

[Nature]


The Future of AI: From Quantum Computing to the Internet of Things – Outer Places


Over the course of the hour, the discussion ranged from quantum computing and robotics to hacking and the ethics of creating sentient AI. Here are some of the highlights from the talk!

The Beginnings of AI: Early AI and Symbolic Reasoning

Joe Haldeman started off the discussion by talking about his experience with AI and symbolic reasoning courses during his college education, which covered philosophy, mathematics, and computer science. “I was studying AI before you guys were even born,” he joked.

He described writing out truth tables and learning quasi-algebraic logic, which allowed him to represent the "thought processes" of early computers. "I have faith in symbolic logic that I don't have in natural language," he said. "I won't say it doesn't lie, but when it lies, you can piece the truth out of it." There's something very elemental for Haldeman about writing out basic true-false equations: "I do know how to sit with a quasi-algebraic system and tease the truth out of it. It's a feeling of power."

Sentient AI and Quantum Computing

What gives a quantum computer its incredible, limit-breaking power is the qubit, which is analogous to the usual bit found in all computers, except that instead of being in either a 1 or a 0 state, a qubit can exist in a superposition that is simultaneously 1 and 0.
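
A minimal amplitude sketch of that superposition, using nothing but complex arithmetic (this illustrates the maths, not how hardware qubits are actually programmed):

```python
import math

def qubit(alpha, beta):
    """Normalise two amplitudes into a valid qubit state, so that
    |alpha|^2 + |beta|^2 equals 1."""
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

def probabilities(state):
    """Born rule: squared magnitudes give measurement probabilities."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

plus = qubit(1, 1)                 # equal superposition of |0> and |1>
p0, p1 = probabilities(plus)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```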

This extra dimension allows for computational power that transcends current limits, opening the possibility for an artificial intelligence to grasp higher functions like self-awareness.

Hacking the Internet (of Things)

Paired with the introduction of hundreds of new, networked “smart” devices, from refrigerators to wearables to personal robots, the potential for hackers to take over a given device has grown exponentially in recent years.

Kelly cites the recent shutdown of a Facebook chatbot AI as an example of what happens when we lose control of AI: after trying to learn to communicate in English, the AI behind Facebook's chat program decided to create its own, more efficient language, which was unintelligible to humans. When researchers realized what it was doing, they quickly shut it down.

The Promise of Sci-Fi and AI

As for the question of how close we are to realizing the kind of AI found in sci-fi, and how safe we are from our darkest fears of robotic domination, Haldeman summed it up nicely: "This whole question shimmers between the uncomputable and the fictional. It's a great place to start stories. These are existential stories: what is man? What are his computational limits?"

Stay tuned for more stories from Escape Velocity 2017!


Property Crowd Funding – The Magic of Modern Day Investing – HuffPost

As an entrepreneur and global communications specialist, I've come across a diverse spectrum of investment avenues that are both lucrative and proactive.

However, the world of real estate has always intrigued me. Real estate has been booming around the world, particularly in the UK, with new housing, apartment and condo complexes being built at a phenomenal pace.

This provides a great opportunity for investment when it comes to making your money work for you.

During a recent visit to London, England, I came across an entrepreneur that opened my eyes to the world of Property Crowdfunding.

Abdullah Iqbal, Co-Founder of the Knightsbridge-based start-up PropTech Crowd, reveals: “Property crowdfunding is a relatively new form of investment which allows multiple investors to come together to invest in specific properties. As each has a small share, the cost of entry is significantly lower than it would be if they decided to invest alone.”

I learned that Abdullah, who comes from a Muslim background, joined his father, who has been involved in the family’s property business for well over three decades.

However, it was during a trip to his native Pakistan that his father thought of introducing an interest-free property crowdfunding model that would also be attractive to the thriving global Muslim community, specifically in the UK.

Abdullah Iqbal, Co-Founder at PropTech Crowd.

While property crowdfunding companies already existed, Abdullah and his dad saw an obvious vacuum in the market: none of the property crowdfunding platforms at the time were Shariah compliant, due to their involvement with interest. “Our motivation was to take the banks out of the equation, enabling investors to have shares and democratising the property market for everyone, while conforming to the Islamic prohibition of interest,” emphasises Abdullah.

The company’s core mission is to revolutionise property investment through innovative crowdfunding technology, allowing everyday investors to access high-ROI opportunities that they may have been priced out of in the past.

Intriguingly, I learned about several key benefits of property crowdfunding.

As an individual with limited finances, I asked how this is relevant, or even beneficial, to someone in the short or long run.

Abdullah adds, “Property crowdfunding is opening up property investment, allowing anyone to access benefits that were previously only available to high-net-worth individuals and asset managers.”

When investing in property, one of the biggest factors to look at is obviously one’s return on investment.

Would you invest in something with a high return on investment (ROI)? Of course you would. Anyone would! That’s a no-brainer. Abdullah elaborates, “According to the Office for National Statistics, UK house prices have risen by 31%. This is far superior to the return you get from savings accounts.”

The model is applicable to both Muslims and non-Muslims. There are no banks involved, no loans with interest, and investors receive full voting and financial rights with their investment.

So how is Shariah compliance ensured?

I learned that Mufti Abdul Kader, a renowned Islamic scholar and expert in Islamic finance, is a Shariah Compliance Advisor at PropTech Crowd. His duties entail making sure that all elements of the business are Shariah compliant, visibly and consistently.

When I asked Abdullah about ownership as a landlord, I was surprised at the answer. “Longer-term investments receive a yield (the rent) in addition to any capital gains you may receive when the crowd sells the property. You get the benefits of being a landlord without the huge start-up costs and with all of the administration done for you,” reveals Abdullah.
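The arithmetic behind the model Abdullah describes is simple: an investor’s return is their fractional share of the rent collected, plus their share of any capital gain when the crowd sells. A minimal back-of-the-envelope sketch follows; all figures, and the `fractional_return` helper itself, are hypothetical illustrations rather than PropTech Crowd’s actual terms, which would also deduct platform fees and costs.

```python
# Hypothetical sketch of fractional property crowdfunding returns.
# None of the numbers below come from PropTech Crowd; they are
# illustrative assumptions only, and fees/costs are ignored.

def fractional_return(stake, property_price, annual_rent, sale_price, years):
    """Return (rental_income, capital_gain, total) for an investor
    whose stake is a fraction of the property (e.g. 0.005 for 0.5%)."""
    rental_income = stake * annual_rent * years
    capital_gain = stake * (sale_price - property_price)
    return rental_income, capital_gain, rental_income + capital_gain

# Example: a 1,000 GBP stake in a 200,000 GBP property (a 0.5% share),
# 10,000 GBP/year rent, sold after 5 years for 31% more (262,000 GBP).
stake = 1_000 / 200_000
rent, gain, total = fractional_return(stake, 200_000, 10_000, 262_000, 5)
print(rent, gain, total)  # 250.0 rental income + 310.0 gain = 560.0 total
```

The point of the sketch is that both income streams scale linearly with the stake, which is why a small investor can participate at all.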

Where does the company go from here?

Abdullah smiles gently and replies, “Our 5-year vision is for our investors to have access to properties across major global cities. We want to open doors in various countries for ordinary people to invest and own property anywhere. Owning property shouldn’t be difficult, and we intend on creating a model that facilitates it in the most effective and beneficial way.”

The company is also actively cross-promoting with established platforms such as Ethis Crowd, the world’s first Islamic real estate crowdfunding platform, headquartered in Singapore, and Islamic Banker, a marketplace for responsible investments.

He elaborates, “The purpose of crowdfunding is to get together and accomplish something, whether it is a social cause, supporting a start-up, or investing in property. As a platform, we find collaboration to be the essence of success. Our motives and vision are very much aligned with Ethis Crowd and Islamic Banker, so it is clear that if there is synergy, we should explore it.”

In conclusion, property crowdfunding is an exciting and rewarding method of investing in the property market. Since human beings will always require shelter as a basic necessity, regardless of how the economy fares, real estate remains an evergreen form of investment. As entrepreneurs, we are told to think forward.

Read this article:
Property Crowd Funding – The Magic of Modern Day Investing – HuffPost


JRM Altcoin Update – Quit chasing the headlines! – YouTube

http://www.jenkinsrm.com/crypto-coin-…

Our Institutional model brought to the independent trader and investor.

Whether you’re an experienced trader or an absolute beginner, we can help you reach your trading goals. Enjoy the trading lifestyle you envision.

Take your trading to a new level. Gain from the experience and knowledge of professional trading mentors, and never trade alone again. At Jenkins Research Management, our process cuts through all the haze and offers a realistic, time-proven trading method based on Jason’s many years of institutional trading experience. We don’t offer a system or any sort of “magic indicators”. Our approach to trading is a process: top-down and consistent. We are traders first, while also offering research to institutional clients.

We offer a live trading room, where I call my trades live with the team. I share my charts, entries, and exits via webinar.

Learn more here: http://www.jenkinsrm.com/jrm-full-acc…

Also check out my new crypto trading team here: http://www.jenkinsrm.com/crypto-coin-…

Visit link:
JRM Altcoin Update – Quit chasing the headlines! – YouTube
