A Complete Beginner's Guide To Bitcoin In 2018

When you dig into the details of Bitcoin, it’s almost an unbelievable tale about how to create money. Although it seems like fiction, it’s actually the best-known version of digital currency in use today. To help you wrap your head around what it is, what it does and how to earn Bitcoins, I pulled together this complete beginner’s guide to Bitcoin.

Before we go any further I just want to reiterate that investing in cryptocoins or tokens is highly speculative and the market is largely unregulated. Anyone considering it should be prepared to lose their entire investment.


A bit of bitcoin history

Bitcoin was the first established cryptocurrency - a digital asset that is secured with cryptography and can be exchanged like currency. Other versions of cryptocurrency had been launched but never fully developed when Bitcoin became available to the public in 2009. The anonymous Satoshi Nakamoto - an individual or a group whose real identity is still unknown - is behind the development of Bitcoin, and stated that the goal of the technology was to create “a new electronic cash system” that was “completely decentralized with no server or central authority.” In 2010, someone spent Bitcoins for the first time, paying 10,000 Bitcoins for two pizzas. I hope the pizza was good, because had that person held onto those Bitcoins, they would be worth more than $100 million today. In 2011, Nakamoto shared the source code and domains with the Bitcoin community and hasn’t been heard from again.


What is Bitcoin, really?

Bitcoin is a digital currency, so there are no coins to mint or bills to print. There is no government, financial institution or any other authority that controls it, so it’s decentralized. Owners of Bitcoins in the system are anonymous - there are no account numbers, names, social security numbers or any other identifying features that connect Bitcoins to their owners. Bitcoin uses blockchain technology and encryption keys to connect buyers and sellers. And, just like diamonds or gold, a Bitcoin gets “mined.”


How do you “mine” Bitcoins?

People - or more accurately, extremely powerful, energy-intensive computers - “mine” Bitcoins to make more of them. There are currently about 16 million Bitcoins in existence, which leaves only about 5 million more available to mine, because Bitcoin’s developers capped the quantity at 21 million. Each Bitcoin can be divided into smaller parts, with the smallest fraction - one hundred millionth of a Bitcoin - called a “Satoshi,” after the founder Nakamoto. The mining process involves computers solving an extremely challenging mathematical problem that progressively gets harder over time. Every time a problem is solved, one block of Bitcoin transactions is processed and the miner is rewarded with new Bitcoins. A user establishes a Bitcoin address to receive the Bitcoins they mine; it’s sort of like a virtual mailbox with a string of 27-34 numbers and letters. Unlike a mailbox, the user’s identity isn’t attached to it.
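The “challenging mathematical problem” is, at its core, a hash puzzle: miners search for a number (a nonce) that makes the hash of the block data fall below a target, and the network tightens the target to make the search harder. The following is a toy sketch of that idea, not the real Bitcoin protocol (which hashes an 80-byte block header against a 256-bit numeric target); the block data and difficulty scheme here are simplified for illustration.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash of (data + nonce) starts with
    `difficulty` zero hex digits - a toy stand-in for Bitcoin's target check."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

# Each extra zero digit makes the search roughly 16x slower, which is
# why real mining demands so much hardware and electricity.
nonce = mine("example block", 4)
print(nonce)
```

Anyone can verify the answer with a single hash, which is what makes the puzzle useful: hard to solve, trivial to check.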


How are Bitcoins used?

In addition to mining Bitcoins, there are other ways to earn them. First, you can accept Bitcoins as a means of payment for goods or services. Setting up a Bitcoin wallet is as simple as setting up a PayPal account, and it’s the way you store, keep track of and spend your digital money. Wallets are free and available through a provider such as Coinbase. There are also websites that will pay you in Bitcoins for completing certain tasks, though this might take more time than it’s worth. Once you’ve earned Bitcoins, there are ways to lend them out and earn interest. There are even ways to earn Bitcoins through trading, and recently Bitcoin futures were launched as a legitimate asset class. In addition, you can trade your regular currency for Bitcoins at Bitcoin exchanges; the largest for years was Japan-based Mt. Gox, which at its peak handled around 70 percent of all Bitcoin transactions before collapsing in 2014. There are more than 100,000 merchants who accept Bitcoin as payment for everything from gift cards to pizza, and even Overstock.com accepts it.


What are the risks?

There’s risk as well as great opportunity with Bitcoin. While its anonymity and lack of regulation have made it appealing to criminals, there are plenty of benefits for those willing to accept some risk and jump into the Bitcoin marketplace. Since there is no governing body, it can be difficult to resolve issues if Bitcoins get stolen or lost. In 2014, Mt. Gox went offline and 850,000 Bitcoins were never recovered. Once a transaction hits the blockchain, it’s final. And since Bitcoin is relatively new, there are still a lot of unknowns; its value is very volatile and can change significantly from day to day.


So, the jury’s still out on whether Bitcoin will accomplish what its proponents predict: the replacement of government-controlled, centralized money. I fully expect 2018 to give us even more insight into the future of Bitcoin as the technology continues to grow and mature.



https://www.forbes.com/sites/bernardmarr/2018/01/17/a-complete-beginners-guide-to-bitcoin-in-2018/#299e852d4418


How Thermal Cameras Work

Our eyes work by seeing contrast between objects that are illuminated by the sun or another source of light. Thermal cameras, by contrast, work by “seeing” heat energy radiating from objects. All objects - living or not - emit heat energy, which thermal cameras use to form an image. This is why thermal cameras can operate at all times, even in complete darkness.

Because thermal cameras work by “seeing” heat rather than reflected light, thermal images look very different from what’s seen by a visible-light camera or the eye. To present heat in a format suited to human vision, thermal cameras convert the temperature of objects into shades of gray that are darker or lighter than the background. On a cold day a person stands out as lighter because they are hotter than the background; on a hot day a person stands out as darker because they are cooler than the background.

Outdoor challenges can impact how thermal cameras work

For these reasons, thermal cameras have become a good choice as a sensor for “seeing in the dark,” because at night background objects tend to be cooler than a person at 98.6 degrees. Under ideal conditions, people stand out clearly at night because they appear brighter than the background, even in zero light.

However, outdoor security conditions are rarely “ideal”, especially during the day when darker objects absorb the sun’s energy and heat up, an effect known as Thermal Loading. When objects in the scene become uniformly hot in any given area, many cameras have difficulty mapping the narrow range of temperature differences into a useful image. The result is an image with large areas that look “whited out” or “grayed out” and undefined. This makes it difficult to see what is happening in the scene, and it makes it difficult for smart thermal cameras to automatically detect intruders accurately.


One example is a daylight image from a thermal camera that cannot effectively compensate for white-out: details such as the power lines, pavement and other objects become impossible to discern due to the effect of thermal loading. It can even be difficult to tell that such a capture is a daytime image.

Lack of image clarity can reduce security effectiveness. Security personnel who have to view blurry, undefined video - even on a single monitor - can become fatigued and confused by images that are not as intuitive as those from daylight cameras, while on-board video analytics will have a more difficult time detecting intruders.

Video Processing and Thermal Cameras

Thermal imagery is very rich in data, sensing temperature variations as small as 1/20th of a degree. Thermal cameras must convert these fine temperature variations - representing 16,384 shades of gray - into about 250 gray scales to more closely match the human eye’s limited ability to decipher shades of gray. The eye itself struggles with close levels of gray: shown six levels, it can distinguish them easily, but with sixteen shades it becomes increasingly difficult to tell where one shade transitions to the next. Consider that a thermal imager produces 16,384 shades of gray - over 1,000 times more than those sixteen - and the magnitude of the problem becomes clear.

In the past, most thermal cameras converted this data in a simplistic way, mapping gross areas that are close in temperature to the same gray levels. This is why thermal images often look blurry, lack detail and conceal intruders, and why the accompanying analytics would often miss intruders entirely.

Newer cameras with a high level of image processing can emphasize small variations between objects and the background, exaggerating fine details to present a clearer image, while automatically detecting intruders accurately - every day, every night, under all outdoor conditions.


http://www.sightlogix.com/how-thermal-cameras-work/


Why is Cybersecurity Important?

The answer to the question “Why is cybersecurity important?” might seem simple enough, considering how quickly the number of cyberthreats is growing. As a result, it’s no surprise that the question is being asked more frequently - and the knowledge behind it is something that investors in the sector should be aware of.

For starters, a proliferation of cyberattacks is causing increasing damage to companies, governments and individuals. Take the WannaCry attacks of May 2017 as a significant example: the ransomware installed itself on roughly 300,000 computers in over 150 countries, in what was later called the “largest such cyber assault of its kind.” Put simply, organizations need to respond to this increased threat by adopting strict cybersecurity measures.

In this article, the Investing News Network (INN) breaks down the three main answers to the question “Why is cybersecurity important?” Together, the expanding number of cyberattacks, the increasing severity of these assaults and the amount of money companies are spending on cybersecurity illustrate why it is such an important market.

Why is cybersecurity important? Increasing threats

As noted above, the number of cybersecurity attacks is increasing every year. Between 2013 and 2015, the cost of cybercrimes reportedly quadrupled, reaching somewhere between $400 billion and $500 billion during that period.

In a Cybersecurity Ventures report on cybercrimes, the firm projects the cost of cyberthreats to rise to $6 trillion annually by 2021. That figure includes everything from damage and destruction of data, stolen money, lost productivity, theft of intellectual property, theft of personal and financial data, embezzlement, fraud and post-attack disruption to businesses, to forensic investigation and the restoration and deletion of hacked data and systems. What investors might not know is the rising threat of cyberattacks on medical devices - an area expected to reach $101 billion by 2018.

On that note, in an interview with INN, Dr. Alissa Johnson - CISO of Xerox and former Deputy CIO at the White House - said that in order to prevent cybersecurity attacks, it’s important to remember the basics.

“Security is not just the responsibility of the security team, but as we’re learning and realizing–especially with all the spear phishing attacks–security is everyone’s responsibility,” she said.

Why is cybersecurity important? Severity of attacks

However, it isn’t just the number of cybersecurity attacks that is increasing. The degree of these attacks is on the rise as well. According to PwC’s report, these attacks are “becoming progressively destructive and target a broadening array of information and attack vectors.”

Politicians are at risk too. The Obama administration proposed a $19 billion budget for cybersecurity, and Hillary Clinton’s private emails became front-page news in the midst of her presidential campaign, underscoring the importance of a strong cybersecurity policy.

In May 2017, current President Donald Trump signed an executive order with a focus on improving cybersecurity in the US–particularly for the country’s infrastructure systems and federal information technology networks.

On that note, in late August eight of the 28 members of the National Infrastructure Advisory Council, which oversees the country’s response to cyberthreats, resigned.

According to BNA, those who resigned said the Trump administration had paid “insufficient attention to the growing threats to cybersecurity of critical systems.”

BNA had previously reported that the “Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure” executive order gives the heads of federal agencies responsibility for managing cyberthreats - which still holds true, though it reads slightly eerily given recent events.

Why is cybersecurity important? Future outlook

Looking ahead, PwC’s Global State of Information Security Survey 2017 states that 59 percent of GSISS respondents say “digitization of their business ecosystems has impacted” security budgets. As these statistics show, cybersecurity is a very important area worthy of commitment, and companies are responding accordingly.

A struggle between increasing needs and limited funding is, however, characteristic of the cybersecurity industry. Encouragingly, more and more companies, government departments and organizations are recognizing the importance of cybersecurity and allocating funds accordingly.



https://investingnews.com/daily/tech-investing/cybersecurity-investing/why-is-cybersecurity-important/

5 Myths About Artificial Intelligence (AI) You Must Stop Believing

Very few subjects in science and technology are causing as much excitement right now as artificial intelligence (AI). In many cases this is with good reason, as some of the world’s brightest minds have said that its potential to revolutionize all aspects of our lives is unprecedented.

On the other hand, as with anything new, there are certainly snake-oil salesmen looking to make a quick buck on the basis of promises which can’t (yet) be truly met. And there are others, often with vested interests, with plenty of motive for spreading fear and distrust.

So here is a run-through of some basic misconceptions, and frequently peddled mistruths, which often come up when the subject is discussed, as well as reasons why you shouldn’t necessarily buy into them.

AI is going to replace all jobs

It’s certainly true that the advent of AI and automation has the potential to seriously disrupt labor – and in many situations it is already doing just that. However, seeing this as a straightforward transfer of labor from humans to machines is a vast over-simplification.

Previous industrial revolutions have certainly led to transformation of the employment landscape, such as the mass shift from agricultural work to factories during the nineteenth century. The number of jobs (adjusted for the rapid growth in population) has generally stayed consistent though. And despite what doom-mongers have said there’s very little actual evidence to suggest that mass unemployment or widespread redundancy of human workforces is likely. In fact, it is just as possible that a more productive economy, brought about by the increased efficiency and reduction of waste that automation promises, will give us more options for spending our time on productive, income-generating pursuits.

In the short-term, employers are generally looking at AI technology as a method of augmenting human workforces, and enabling them to work in newer and smarter ways.

Only low-skilled and manual workers will be replaced by AI and automation

This is certainly a fallacy. Already, AI-equipped robots and machinery are carrying out work generally reserved for the most highly trained and professional members of society, such as doctors and lawyers. True, a lot of their focus has been on reducing the “drudgery” of day-to-day aspects of the work. For example, in the legal field, AI is used to scan thousands of documents at lightning speed, drawing out the points which may be relevant in an ongoing case. In medicine, machine learning algorithms assess images such as scans and x-rays, looking for early warning signs of disease, which they are proving highly competent at spotting. Both fields, however, as well as many other professions, involve a combination of routine, though technically complex, procedures – which are likely to be taken up by machines – as well as “human touch” procedures. For a lawyer this could be presenting arguments in court in a way that will convince a jury; in medicine, it could be breaking bad news in the most considerate and helpful way. These aspects of the job are less likely to be automated, but members of their respective professions could find they have more time for them – and therefore become more competent at them – if mundane drudgery is routinely automated.

Super-intelligent computers will become better than humans at doing anything we can do

Broadly speaking, AI applications are split into two groups – specialized and generalized. Specialized AIs – ones focused on performing one job, or working in one field, and becoming increasingly good at it – are a fact of life today – the legal and medical applications mentioned above are good examples.

Generalized AIs on the other hand – those which are capable of applying themselves to a number of different tasks, just as human or natural intelligences are – are somewhat further off. This is why, although we may regularly come across AIs which are better than humans at one particular task, it is likely to be a while before we come face-to-face with robots in the mould of Star Trek’s Data – essentially super-humans who can beat us at pretty much anything.


Artificial intelligence will quickly overtake and outpace human intelligence

This is a misconception brought about by picturing intelligence as a linear scale – for example, from one to 10 – imagining that perhaps animals score at the lower end, humans at the higher end, and with super-smart machines at the top of the scale.

In reality intelligence is measured in many different dimensions. In some of them (for example speed of calculations or capacity for recall) computers already far outpace us, while in others, such as creative ability, emotional intelligence (such as empathy) and strategic thinking, they are still nowhere near and aren’t likely to be any time soon.

AI will lead to the destruction or enslavement of the human race by superior robotic beings

This one is obviously out of any number of sci-fi scenarios – The Terminator and The Matrix are probably the most frequently cited! However, some voices which have proven themselves to be worth listening to in the past – such as physicist Stephen Hawking and tech entrepreneur Elon Musk – have made it very clear they believe the danger is real.


The fact is though, that notwithstanding the distant future, where indeed anything is possible, a great number of boundaries would have to be broken down, and allowances made by society, before we would be in a position where this would be possible. Right now, it’s highly unlikely anyone would think about building or deploying an autonomous machine with the potential to “make up its mind” to hurt and turn against its human creators. Although drones and security robots designed to detect and prevent threats, and even take autonomous action to neutralize them, have been developed, they have yet to be deployed and doing so is likely to provoke widespread public condemnation. The hypothetical scenario tends to be that robots either develop self-preservation instincts, or re-interpret commands to protect or preserve human life to mean that humans should be taken under robotic control. As it is unlikely that anyone would build machines with the facilities to carry out these actions autonomously, this is unlikely to be an immediate problem. Could it happen in the future? It’s a possibility, but if you’re going to worry about science fiction threats, then it’s just as likely that invading aliens will get to us first.



https://www.forbes.com/sites/bernardmarr/2017/10/03/5-myths-about-artificial-intelligence-ai-you-must-stop-believing/#36a6fa0b2739

3 Ways to Defeat DDoS Attacks

In 2012, a number of DDoS attacks hit Bank of America, JPMorgan Chase, Wells Fargo, U.S. Bank and PNC Bank. These attacks have since spread across most industries from government agencies to local schools and are showing an almost yearly evolution, with the most recent focus being the Internet of Things (IoT).

In 2016, compromised cameras, printers, DVRs and other IoT appliances were used in a large attack on Dyn that took down major websites including Amazon, Twitter, Netflix, Etsy and Spotify.

Inside Distributed Denial-of-Service Threats

Although these large attacks dominate the headlines, they’re not what most enterprises will deal with day to day. The most common attacks are in the range of 20 to 30 Gbps or less, while attacks as large as 1.2 Tbps have been reported.


Creating DDoS Defense

Security technology is becoming more sophisticated, but so are hackers, which means attacks can be much more difficult to mitigate now than in the past. Enterprises must be knowledgeable and prepared with mitigation techniques as the attacks continue to evolve.


DDoS mitigation comes in three models:

Scrubbing Centers

The most common DDoS mitigation option for enterprises is to buy access to a scrubbing center service. During an attack, traffic is redirected to the security provider’s network, where the bad traffic is “scrubbed out” and only good traffic is returned to the customer. This option is good for multi-ISP environments and can be used to counter both volumetric and application-based attacks. For added protection, some providers can actually place a device in your data center, but this is not as cost-effective as the cloud-based option.
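To give a concrete sense of the kind of filtering a scrubbing center applies, here is one simple building block: per-source rate limiting with a token bucket. This is only an illustrative sketch; real scrubbing services combine many techniques (signature matching, anomaly detection, protocol validation), and the IP address and limits below are made up.

```python
import time

class TokenBucket:
    """Per-source rate limiter. Each source IP gets a bucket that refills at
    `rate` tokens per second up to `burst`; requests that find the bucket
    empty are dropped as likely attack traffic."""

    def __init__(self, rate: float, burst: float):
        self.rate, self.burst = rate, burst
        self.buckets = {}  # ip -> (tokens, last_timestamp)

    def allow(self, ip: str, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(ip, (self.burst, now))
        tokens = min(self.burst, tokens + (now - last) * self.rate)  # refill
        if tokens >= 1.0:
            self.buckets[ip] = (tokens - 1.0, now)
            return True
        self.buckets[ip] = (tokens, now)
        return False

limiter = TokenBucket(rate=1.0, burst=3.0)  # 1 request/s sustained, bursts of 3
results = [limiter.allow("198.51.100.7", now=0.0) for _ in range(5)]
print(results)  # the first 3 pass on the initial burst, the rest are dropped
```

A flood source quickly drains its bucket and gets dropped, while legitimate clients sending at a modest rate pass through untouched - the “scrubbed out” versus “good traffic returned” split described above, in miniature.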


ISP Clean Pipes Approach

With the rise of DDoS attacks, many ISPs have started their own scrubbing centers internally, and for a premium will monitor and mitigate attacks on their customers’ websites. In this scenario, ISPs operate as a one-stop-shop for bandwidth, hosting and DDoS mitigation. But some ISPs are more experienced at this than others, so customers must be sure to thoroughly test and research the quality of the service offered by their ISPs.


Content Delivery Network Approach

The distributed nature of content delivery networks (CDNs) means that websites live globally on multiple servers versus one origin server, making them difficult to take down. Large CDNs may have over 100,000 servers distributing or caching web content all over the world. However, CDN-based mitigation is really only a good option for enterprises that require core CDN functionality, as porting content to a CDN can be a time-intensive project.



https://www.forbes.com/sites/gartnergroup/2017/08/28/3-ways-to-defeat-ddos-attacks/#a981226da78f

15 Trends That Will Transform The Way We Live And Work

Great change is afoot in the places where we live and the spaces where we work. While demographic trends and a mounting thirst for self-actualization play a big part in these shifts, technological advances are the one factor accelerating this change.

In their book Spaces for Innovation: The Design and Science of Inspiring Environments, Kursty Groves and Oliver Marlow share their discoveries on the impact that a physical space can have on workplace behavior. Their journey takes them through the offices of tech behemoths like Airbnb and Microsoft - and gives an illuminating look at the trends that are rapidly shaping the way we live and work.

1. The mistrust of institutions. Thanks to the public debt crisis and a heightened mistrust of big banks and corporations, the book points out that only 22% of Americans say they trust their country’s financial system.

2. Big changes in the corporate world. In 1983, 50 companies made up 90% of American media; by 2011, that number had fallen to six. This has to do with the consolidation of companies and also the illusion of choice.

3. Crisis in the natural world. Today, 50% of the world’s original forests have all but disappeared. As tropical forests are home to at least 50% of all species, clearing 17 million hectares of these forests every year is sure to do irreparable damage to our living environments.

4. The proliferation of online identity. Otherwise known as a “second life for all,” social media has created a new layer of identity. Facebook alone has 1.6 billion active monthly users in 2016, which is up 15% from 2015.

5. The generation conundrum. By 2025, 75% of workers around the world will be Millennials. An interesting thing to point out is that Gen-Z, the generation born after 1998, will be the very first post-Internet generation.

6. The real-estate problem. The issue everyone is facing in regard to real estate these days is that there is “nowhere to live, nowhere to work.” The book notes that 50% of Londoners will be renting by 2025, up from 40% in 2000.

7. Disruption in manufacturing and supply chain. The global 3D printing industry is set to skyrocket to US$12.8 billion by 2018, up from US$3.07 billion in 2013. This figure is set to surpass US$21 billion in revenue by 2020, and is supporting the trend of rapid making and customization.

8. Doing away with “single-use architecture.” As many begin to adapt to the “in my own place, on my own time” regimen, what will become of traditional brick-and-mortar environments? The book suggests that there will be a hybridization of environments as more people take up the possibility of remote work.

9. It’s the end of the office as we know it. Thanks to advancements in robotics, AI, and genetics, over five million jobs will be lost by 2020. Two-thirds of these job losses will be in administrative and office-related roles.

10. Self-actualization a priority. As people move towards purpose and self-worth, they’ll also want to find work that is both fulfilling and meaningful. Today, only 13% of employees globally report being engaged and emotionally invested in their work.

11. Deeper understanding of why we need creativity and flow. Workers are starting to be more in tune with the conditions needed for psychological happiness. According to Prof. M. Csikszentmihalyi, “Enjoyment appears at the boundary between boredom and anxiety, when the challenges are just balanced with the person’s capacity to act.”

12. Increased need for collaboration to stay engaged. The book points out that collaboration and teamwork are growing in importance: 71% of people reported feeling more creative when working in a team, 62% cited an increase in productivity, and 90% felt more confident when coworking.

13. Urban explosion. Today, 54% of the world’s population lives in cities, a figure expected to reach 66% by 2050. There is also a clustering of cities into megacities (cities with more than 10 million people). By 2030, there will be 41 megacities around the world, compared with just 10 back in 1990.

14. Rise of the gig economy. Who can say who is the boss these days? By 2020, 40% of the American workforce will be working as a freelancer or independent contractor. This helps companies save money on things like benefits, but also helps people chart their own path to work-life autonomy.

15. Collaborative consumption. Sharing economy players, like Airbnb, continue to impact the way we live as more people tune into the digital nomad lifestyle. According to the book, the consumer peer-to-peer rental market is worth US$26 billion today.



https://www.forbes.com/sites/irisleung/2017/07/29/15-trends-that-will-transform-the-way-we-live-and-work/#6e5cb8fb3cdf