Friday, 18 December 2015

8 Linux predictions for 2016

 Looking ahead to 2016, I see big things for ChromeOS, Android, and even Microsoft in the Linux world.

Lunduke looks ahead for Linux
As 2015 comes to a close, the time has arrived to make predictions for what will happen in the Linux (and broader Free and Open Source Software) world in the year ahead. Will all of my predictions actually come true in 2016? Who knows? But I’m making them anyway!

We still won’t be using Wayland.
That’s right. I’m going on the record and saying that, when 2016 ends, we still won’t be using Wayland. Oh, sure. Maybe the odd Linux distribution here or there might be shipping with Wayland enabled. But the big distros? Xorg, baby!

systemd's scope will expand.
systemd’s scope will grow to include an office suite and web browser. Just kidding. But not really. The little init-system-that-could is going to keep on expanding to include more and more functionality over the next year. There will be much gnashing of teeth in the Linux community.

Canonical will pull away from phones.
During 2016, Canonical will scale back, and possibly cease entirely, development of the phone version of Ubuntu. At the same time, the company will renew a focus on the Ubuntu desktop and server. I know. I know. I hear many of you yelling already. But I think this is the route Canonical will take to achieve success (financial and otherwise) in the market.

Android will gain significant desktop-centric functionality.
The next major Android release in 2016 will have new functionality that allows Android devices to behave in a way similar to modern desktop operating systems. Most notably: applications in movable, overlapping windows. We will then see an uptick in new Android-powered laptops (and tablets shipping with keyboards).

ChromeOS will gain full access to the Google Play store.
While we’re talking about Google, in 2016 ChromeOS will gain the ability to install and run Android applications directly from the Google Play store. While the available application selection may not be quite as extensive as what is available on, say, an Android tablet – it will be similar to what is currently available for Android TV (read: a small, but growing, subset of apps) – it will still provide a huge boost to what ChromeOS devices can do outside of the browser.

A new, Linux-based phone OS will appear.
Despite – or, perhaps, because of – the problems with Linux-based phone systems (non-Android ones, at any rate) at the end of 2015, the next year will see a new Linux-based system built for phones make some big waves. Who will it be? I haven’t the foggiest. But I’m confident some company (or organization) is going to surprise us in this area.

elementary OS, openSUSE, and Fedora will gain market share.
The Linux world can be a crazy place sometimes. One minute, Distro A is on top of the world, the next minute Distro B comes out of nowhere to dominate the landscape. I think the biggest market share gains (from distributions that exist at the close of 2015) will be from elementary OS, openSUSE, and Fedora. What sort of gains are we talking about? I have no clue. But, mark my words, it will be noteworthy.

Microsoft will increase its Open Source activity.
In 2016, Microsoft will step up its level of activity in the Free and Open Source world in a big way. Additional code will be released under Free (or, at least, Open Source) licenses. Linux will be something they talk about more and more. We will see Microsoft have a bigger and louder presence at Linux and FOSS-related conferences. And the Linux community will grow increasingly accepting of it. It will be weird.

Thursday, 10 December 2015

CompTIA, Cisco, Microsoft & other big enterprise IT firms miss Best Places to Work cut

Airbnb tops Glassdoor's Best Places to Work in 2016 rankings

It’s not that the biggest names in enterprise IT and networking aren’t good places to work, according to employees submitting reviews to jobs and career marketplace Glassdoor. It’s just that they aren’t “Amazing!” or “Great!” places to be employed, according to Glassdoor’s list of the 50 Best Places to Work in 2016.

When approached by Glassdoor about this list, we weren’t surprised to see a buzzy young company like Airbnb atop the rankings, dethroning Google, which fell from No. 1 last year to No. 8 this time around. The likes of Hubspot, Facebook, LinkedIn and Zillow in the Top 10 also didn’t come as surprises.

But the very top companies weren’t all fresh faces: 40-plus-year-old Bain & Co. came in second.

So why didn’t some of the biggest names in enterprise networking and IT make the top 50? (Rankings are based on a proprietary algorithm that crunched information from 1.6 million anonymous reviews.)

Well, first, consider that the ratings across many of these companies are pretty darn close. The 50th company in the rankings, SolarCity, has a rating of 3.9 stars, whereas Microsoft, for example, has 3.8 and Cisco has 3.7.

A Glassdoor spokeswoman says that for Microsoft, “What seems to make the difference based on the data we're seeing is Microsoft's reviews are more subdued, and use the word ‘good’ a lot. For example: ‘Good salary and benefits’ and ‘Good work/life balance’ and ‘Good environment if you are in a good team with good management’.”

Compare that to the sort of language used in Airbnb reviews ("Amazing people, vibrant workplace, and an unbeatable culture" and "the founders are great people and I believe they have the best intentions for the company, the employees, and our community.")

Common themes among the top-rated companies included employees feeling valued, unique cultures aligned with mission, smart colleagues, and great perks/benefits.

All this isn’t to say enterprise IT companies didn’t show up in the Top 50. In fact, #3 Guidewire makes back-end software for insurance companies – so, an enterprise IT company, but one you might not know if you’re not in that market. More familiar enterprise IT companies such as Akamai (#31), Salesforce (#32), F5 Networks (#33), Workday (#35) and Red Hat (#37) are all on the list, and then there are those big consumer AND enterprise outfits like Apple (#25).

Looking back at Glassdoor’s recent rankings – it has compiled this list for 8 years now – enterprise companies (depending on how you define them) are actually making a slightly stronger showing than in years past. So, it’s not like people working for Airbnbs and other cool companies are having all the fun.

Monday, 30 November 2015

Microsoft acknowledges bug led to Windows 10 November upgrade stoppage

Restores 1511 to download site, restarts Windows Update push

Microsoft has restored access to Windows 10's November upgrade from its download center, saying that it pulled the upgrade because of a bug.

"Recently we learned of an issue that could have impacted an extremely small number of people who had already installed Windows 10 and applied the November update," a Microsoft spokesman said in a Tuesday statement. "It will not impact future installs of the November update, which is available today."

Microsoft yanked the upgrade from the download website -- and stopped serving it to Windows 10 users via Windows Update -- last week. According to the company, the upgrade had reverted four preferences within the operating system to the original "on" default settings.

"We will restore their settings over the coming days and we apologize for the inconvenience," the spokesman added.

The settings that were changed included two in Windows 10's privacy section -- one that allows the user's advertiser ID to be tracked across multiple apps, another that enables an anti-phishing filter for apps that display Web content -- and a second pair that synchronized devices and allowed various first-party apps to run in the background to, for instance, provide notifications.
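For anyone who had deliberately switched the advertiser ID tracking off, one quick way to confirm whether the upgrade flipped it back is to read the per-user registry value behind that toggle. The short Python sketch below is illustrative only and assumes the preference is stored as the DWORD "Enabled" under the usual AdvertisingInfo key, which can vary between builds.

# Minimal sketch: audit the per-user advertising ID toggle on Windows 10.
# Assumes the preference is the DWORD "Enabled" under the AdvertisingInfo
# key (0 = off, 1 = on); the exact location can vary between builds.
import winreg

KEY_PATH = r"Software\Microsoft\Windows\CurrentVersion\AdvertisingInfo"

def advertising_id_enabled():
    """Return True/False for the toggle, or None if the value is absent."""
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
            value, _value_type = winreg.QueryValueEx(key, "Enabled")
            return bool(value)
    except FileNotFoundError:
        return None

if __name__ == "__main__":
    state = advertising_id_enabled()
    print("Advertising ID:", {True: "ON", False: "OFF", None: "not set"}[state])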

Microsoft provided some information on the settings bug in a support document, and also rolled out a new cumulative update, the only kind of update Windows 10 receives.

While the bug may seem minor -- especially in the context of the roll call of louder complaints about the November upgrade on Microsoft's own support forums -- the company may have been ultra-sensitive to the privacy settings snafu, considering that the firm has been manhandled by critics over what they saw as a significant uptick in intrusiveness. Those who had turned off the advertiser ID tracking, for example, would certainly have been upset to discover that it had been switched back on after the upgrade.

After fixing the problem, Microsoft restored the upgrade to the download center, where current Windows 10 users can generate installation media -- usually a USB thumb drive, but alternately a DVD -- with the Media Creation Tool (MCT). Many have been using the MCT to cut the line for the upgrade, normally served through the Windows Update service, and skip the wait as Microsoft slowly rolls it out in its now-familiar staggered fashion.

Computerworld confirmed that the MCT now downloads the November upgrade, which Microsoft identifies as both 1511 -- a nod to the November 2015 release date -- and build 10586, rather than the original July 29 code that it had reverted to last week.
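Verifying which code the MCT actually delivered is simple once the media has been applied, because the identifiers Microsoft uses (version 1511, build 10586) are recorded in the registry. A minimal Python sketch, assuming the conventional ReleaseId and CurrentBuild values under the CurrentVersion key:

# Minimal sketch: report the installed Windows 10 version and build.
# Assumes the ReleaseId (e.g. "1511") and CurrentBuild (e.g. "10586")
# values exist under the standard CurrentVersion key.
import winreg

KEY_PATH = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    release, _ = winreg.QueryValueEx(key, "ReleaseId")
    build, _ = winreg.QueryValueEx(key, "CurrentBuild")

print(f"Windows 10 version {release}, build {build}")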

The gaffe with the November upgrade could be seen as a setback for Microsoft's strategy to convince customers that it can provide regular upgrades to Windows 10 two or three times a year, and more importantly, prove that it can do so with high-quality code that requires less testing than prior editions.

After the upgrade's Nov. 12 release, but before it was pulled from distribution, Gartner analyst Steve Kleynhans had called 1511 a milestone in Microsoft's scheme. "This is a proof case for the ongoing update process," Kleynhans said in a Nov. 13 interview. "It's only the first data point, of course, but having delivered it, more or less on time, is a pretty good sign."

Now? Maybe not so much.

Tuesday, 24 November 2015

74-678 Designing and Providing Microsoft Volume Licensing Solutions to Large Organisations

QUESTION 1
A Datum wants to extend its on-premises server farm by deploying SQL Server to virtual machines in Microsoft Azure for a short-term development project.
How should you recommend that A Datum license the deployment?

A. Purchase virtual machines that run Windows Server through Azure and assign existing SQL Server licenses by using License Mobility within Server Farms.
B. Purchase virtual machines that run SQL Server through Azure.
C. Purchase virtual machines that run Windows Server through Azure and assign existing SQL Server licenses by using License Mobility through Software Assurance (SA).
D. Use MSDN licenses for Windows Server virtual machines and for SQL Server.

Answer: C

Explanation: With License Mobility through Software Assurance, you can deploy certain server application licenses purchased under your Volume Licensing agreement in an Authorized Mobility Partner's datacenter. Use License Mobility to:
Extend the value of your server application licenses by deploying them on-premises or in the cloud.
Take advantage of the lowest-cost computing infrastructure for changing business priorities.


QUESTION 2
A Datum plans to implement VDI.
You need to recommend a solution to ensure that the sales office users can access their corporate desktop from a company-owned iPad. The solution must be the most cost-effective option available today and must ensure that the company meets the licensing requirements of the planned IT strategy.
Which two licenses should you include in the recommendation? Each correct answer presents part of the solution.

A. A Windows Virtual Desktop Access (VDA) license for each tablet
B. A Windows Companion Subscription (CSL) license for each primary device
C. A Windows 8.1 Enterprise Upgrade license for each tablet
D. An RDS User CAL for each sales office user

Answer: A,D

Explanation: A: VDA licensing is the recommended license for VDI access devices that do not qualify for SA. VDA provides organizations with the ability to license Windows for use via devices that do not traditionally come with a Windows license, such as thin clients, smartphones, and tablet devices. Organizations can also use VDA to license devices that the organization does not own, such as employees' home PCs and contractor devices.
D: The RDS CAL is the primary license for Microsoft VDI. It offers the flexibility to deploy both VDI and RDS Session Virtualization so that you can provide access to full desktop and shared desktop experiences. You must purchase one RDS CAL for each device or user that accesses VDI.
* Scenario: A Datum plans to implement a Virtual Desktop Infrastructure (VDI) by using Remote Desktop Services (RDS) on Windows Server 2012 R2.
In line with the VDI implementation, all of the sales office users will be issued a tablet. A Datum wants to enable the users to work from their home computer as well, as the need arises. In addition, the company plans to enable a Bring Your Own Device (BYOD) strategy.


QUESTION 3
Which two goals are met by the company's current licensing solution given the planned changes? Each correct answer presents part of the solution.

A. A Datum must run the most up-to-date versions of the desktop platform products to access the custom application.
B. A Datum wants the users to be able to access their corporate desktop from their home computer.
C. A Datum wants to deliver Windows and Office in a virtual desktop to the users.
D. A Datum wants to be able to install multiple virtual desktops on the device of each user.
E. A Datum wants the flexibility to deploy virtual desktops to the cloud.

Answer: B,C

Explanation: Not A: The latest versions cannot be used. Not D, not E: No current cloud licensing exists.
* Scenario:
/ Current Licensing Solution
A Datum recently signed an Enterprise Agreement that includes Office Professional Plus, Windows Enterprise Upgrade, and Microsoft Core CAL Suite licensed per user.
Currently, all of the licenses for SQL Server are assigned to long-term workloads.
/ A Datum uses Microsoft Lync Server 2010, Microsoft SharePoint Server 2010, and Microsoft Exchange Server 2010. Various versions of Microsoft SQL Server are used heavily across the server farm both as an infrastructure product and as a data warehouse tool.
/ Business Goals
A Datum spent a significant amount of time developing a custom application that will be used by hundreds of the company's partners and suppliers. The application will always run on the latest version of SQL Server and SharePoint Server. A Datum wants the application to be available to the users immediately.


QUESTION 4
A Datum purchases Windows 8.1 Enterprise Upgrade licenses through their current agreement.
What are three benefits of these licenses compared to the Original Equipment Manufacturer (OEM) licenses? Each correct answer presents a complete solution.

A. License Mobility rights
B. Rights to reassign licenses
C. Re-imaging rights
D. Perpetual usage rights
E. Windows Virtual Desktop Access (VDA) rights

Answer: B,D,E

Explanation: B: Windows Enterprise use rights are bound to the existing PC if SA is allowed to expire. And as before, Windows Enterprise edition upgrade licenses can be reassigned to a replacement device while SA is active, as long as the replacement device has a "qualifying OS."


QUESTION 5
A Datum is evaluating moving the licensing of its desktop platform products to Office 365.
Which three licenses will make up its desktop platform? Each correct answer presents part of the solution.

A. Office 365 ProPlus
B. Windows Intune
C. Windows 8.1 Enterprise
D. Microsoft Core CAL Suite Bridge for Office 365
E. Office 365 Enterprise E3

Answer: A,D,E

Explanation: A: When you deploy Office 365 ProPlus, it's installed on the user's local computer. Office 365 ProPlus is offered as a monthly subscription.
D: Microsoft Client Access License (CAL) Suite Bridges are used when you are transitioning from a CAL Suite (on premises) to a comparable Product and Online Service combination.
* Scenario:
A Datum wants to improve the manageability and control of the users' desktops. In the short term, the company will deploy Windows 8.1 Enterprise and Office Professional Plus 2013 internally. During the next six months, A Datum plans to implement a Virtual Desktop Infrastructure (VDI) by using Remote Desktop Services (RDS) on Windows Server 2012 R2.

Wednesday, 11 November 2015

Former Marine fights to connect veterans with IT jobs

One consulting firm's hiring program aims to place U.S. military veterans in IT engagements.
The transition to corporate life can be challenging for military veterans. Companies aren't used to hiring veterans, whose resumes are unlikely to make it past their keyword-filtering software. Veterans aren't used to articulating their military experience in business terms, nor are they accustomed to typical workplace culture and communication. Far too often, uniquely skilled veterans returning from Iraq and Afghanistan hear the same disheartening message -- that they’d make great security guards.

Nick Swaggert, a former infantry officer with the U.S. Marine Corps, sees untapped talent in these returning soldiers, and he’s committed to helping them find career opportunities in the tech world. Swaggert is Veterans Program Director at Genesis10, an outsourcing firm that provides IT consulting and talent management services. His job is to recruit veterans, help them translate their military experience to relevant corporate experience, and find a place for veterans to work at Genesis10's clients.

Swaggert knows firsthand what it’s like to see a military career reduced to the output of a military skills translator (software that’s designed to match military skills, experience and training to civilian career opportunities).

“I was in the Marine Corps infantry. Backpack and guns type of thing. So what does it say for me? I can be a security guard,” Swaggert says of the typical automated skills translator. “Someone in the infantry probably pulled a trigger less than 0.1% of the time. They probably spent a lot of their time in logistics, leadership, setting up communications assets, organizing supply chains. These are all things we did, but my job says I pulled a trigger.”

In reality, the infantry experience varies widely for today’s service men and women – including Swaggert, who was sent to the Syrian border, 300 miles from the nearest base. “I needed to make sure that the supply chain -- helicopters were flying us supplies -- was optimized. When you live in a space the size of a conference room table, or you're on a vehicle, there's not a lot of room for error in terms of too much or too little supplies,” he recalls. “I needed to learn how to set up a satellite radio, to send digital pictures of smugglers we were catching back to the base. Using a very high-tech radio and a rugged laptop in a sandstorm, I learned to problem-solve communications assets. That doesn't come across in a translator."

When Swaggert left the Marine Corps, he found a new mission: helping veterans find civilian jobs that make use of their myriad talents.

"I got out in 2010. I was told time and time again, 'Nick, you seem like a really great

guy, but you just don't have the experience that we're looking for.' That's what led me to go and get my master's degree and become passionate about it. This is a huge opportunity. There's a huge miss here in communication. Someone needs to be out there, proselytizing."

Genesis of an idea

Swaggert also understands what it’s like to be an enlisted person and an officer -- a rare perspective for veterans of the typically stratified U.S. military. He enlisted in the Marines right out of high school. He was later selected for an officer training program, which allowed him to get a college degree while in the Marines.

After getting his degree, Swaggert was commissioned as an officer in 2005. He wanted to be an infantry officer, even though a friend advised him to pursue a more hirable assignment in communications or logistics. “I said ‘no way, that's not going to happen. I'm going to go serve my country on the front lines.’ Then I came home, and like many other people, saw that doesn't help me.”

Even with a college degree, his path to a corporate career wasn't always smooth.
Swaggert applied for, and was rejected by, a corporate program that's designed to train and certify military veterans in computer networking. "My ASVAB -- Armed Services Vocational Aptitude Battery -- it's like the military SAT. It shows how well you can learn new jobs. I scored in the 96th percentile of all service members. They don't look at that, though. They just say, 'well, he was in the infantry, he can shoot guns. There's no way he could possibly learn network stuff.' This is exactly why people can't get jobs."

When young, college-educated officers leave the military, they’re often recruited through junior military officer (JMO) training programs at companies such as Deloitte, PwC, General Electric and PepsiCo. Companies compete to hire these service members, many of whom got their college degrees, served four years in the military, and are set to enter the business world at a young age having amassed significant leadership experience. “They have their degrees, the path is laid out for them, and they’re heavily recruited,” Swaggert says.

It’s a different world for enlisted men and women, most of whom leave the military without a college degree. Even if they get their degrees after serving in the military, it can be hard to find work. “An officer goes to college for four years, then serves for four years. An enlisted guy serves four years, then goes to college for four years. After eight years they're fairly equivalent, but one group is highly employed and the other group is heavily underemployed,” Swaggert says.

Nationwide, the unemployment rate for military veterans who served after 9/11 was 9% in 2013, according to data from the U.S. Bureau of Labor Statistics. That's down from 9.9% the year before, but well above the overall unemployment rate for civilians, which was 7.2% during the same period. The numbers are particularly bleak for the youngest veterans, aged 18-24, who posted a jobless rate of 21.4%.

Nick Swaggert (center), pictured with the crew of his command and control vehicle during a break while patrolling the Syrian/Iraqi border.

“Being an officer, you gain a tremendous amount of experience and have tremendous leadership opportunities. The other group has been given similar, but not as extensive, experience. That's where we think there's a business opportunity,” Swaggert says.

At Genesis10, employees see the value of U.S. military experience in the corporate world. It's a view that comes from the top. Harley Lippman is the CEO and owner of the $185 million privately held firm, which is based in New York. Lippman participated in a program that brings groups of U.S. service-disabled veterans to Israel, and when he saw how well Israel treats its veterans -- with comprehensive health services and job assistance, for example -- he was inspired to launch his company's program on Veterans Day in 2011. Swaggert joined the effort in mid-2013. "Harley is a visionary, and he saw that there's a huge opportunity to tap into this untapped talent vein," Swaggert says.

The firm is realistic about placing former soldiers. Some of the roles Genesis10 envisions U.S. military veterans helping fill include project manager, business analyst, testing analyst, storage administrator, database administrator, network engineer, midrange server specialist, and problem and incident management positions.

“We have clients who need Java developers with 10 years of experience. I'm not pretending Joe Smith off the street is going to do that,” Swaggert says. “But there are needs such as entry-level data entry, business analyst, quality assurance -- stuff veterans will do really well, very process-oriented roles. Veterans are very detail-oriented. We have checklists for everything we do. If you don't dot an 'i' or cross a 't' an artillery round lands on your location.”

Part of Genesis10’s strategy is to connect veterans with companies that want to hire returning soldiers but are unsure how to go about it.

One hurdle is that many companies don’t know how to find veterans. It’s not enough to post typical job descriptions on veteran-focused job boards or at military recruiting fairs. "That doesn't mean anything to a veteran. You're not recruiting by job code -- everyone in the military has a job code. You're not recruiting by rank -- rank equals experience," Swaggert says. “You have to tailor that.”

He’s understanding of the conundrum for hiring managers. "On the company side, I don't blame them,” Swaggert says. “Hiring managers don't have experience hiring veterans. We are such a small fraction of the population. You can't expect them to know and understand.”

Another part of Genesis10’s strategy is to prepare veterans for workplace culture, not only by tweaking resumes but also through interview coaching and soft-skills development. Communication is a key element.

"Veterans have different communications styles. In the military, we call it BLUF -- it's an acronym that stands for 'bottom line up front.' You state the bottom line. In the military, you walk up to someone at their desk, or wherever, and you just tell them what you want,” Swaggert says. Civilians communicate differently, and veterans need to learn to deal with the differences.

Veterans also need to learn how to interview. In the military, higher-ups look at soldiers’ service records to determine who moves up the ranks. “That interviewing skill just completely atrophies -- if it was ever there in the first place and most likely it wasn't,” Swaggert says.

For companies that are open to hiring veterans, Genesis10 can smooth the process. The company understands that there’s risk associated with trying new hiring approaches. "We've built a program to try to mitigate that risk,” Swaggert says. "We flat out say in our presentation, 'we are here to mitigate the risk of hiring a veteran.'"

Still, it’s not always an easy sell. "There's a reason why veterans don't get hired. If it were easy it would already have been done. You have to invest time and effort. I wish I could say it's just rewriting a resume. But it's not.”

The most challenging part of Swaggert’s job is trying to find companies that are willing to hire veterans.

“My number one job is not to find veterans. I could stroll down to the nearest base, or post a job online looking for U.S. Military veterans. The hard part is walking into the companies. I've talked to a lot of CIOs, a lot of VPs, saying, 'do you guys want to hire veterans?' They all say yes, and they say, ‘well how do we do it?’ We talk about selection, training, mentoring, and onboarding and getting them to commit to that kind of investment.”

Success is hearing “’yes, I'm going to force my people to hire someone who's a little bit different.’”

Swaggert joined the Reserves to stay connected to the military, and as a commanding officer in the Reserves, he flies monthly to Ohio. “The Marine Corps is very important to me. It will always be very important to me,” Swaggert says. “I'm not wearing a uniform every day, but I’m definitely doing military-related things daily.”

“There are plenty of people like me, who joined the military during a time of war, who are really smart people who said, 'I want to serve on the front lines, because that's what this country needs.'"

Now that they’re home, he wants to help them find work.



Sunday, 1 November 2015

Sony BMG Rootkit Scandal: 10 Years Later

Object lessons from the infamous 2005 Sony BMG rootkit security/privacy incident are many -- and Sony's still paying a price for its ham-handed DRM overreach today.

Hackers really have had their way with Sony over the past year, taking down its PlayStation Network last Christmas Day and creating an international incident by exposing confidential data from Sony Pictures Entertainment in response to The Interview, a comedy about a planned assassination of North Korea's leader. Some say all this is karmic payback for what's become known as a seminal moment in malware history: Sony BMG sneaking rootkits into music CDs 10 years ago in the name of digital rights management.

“In a sense, it was the first thing Sony did that made hackers love to hate them,” says Bruce Schneier, CTO for incident response platform provider Resilient Systems in Cambridge, Mass.

Mikko Hypponen, chief research officer at F-Secure, the Helsinki-based security company that was an early critic of Sony’s actions, adds:

“Because of stunts like the music rootkit and suing Playstation jailbreakers and emulator makers, Sony is an easy company to hate for many. I guess one lesson here is that you really don't want to make yourself a target.

“When protecting its own data, copyrights, money, margins and power, Sony does a great job. Customer data? Not so great,” says Hypponen, whose company tried to get Sony BMG to address the rootkit problem before word of the invasive software went public. “So, better safe than Sony.”

The Sony BMG scandal unfolded in late 2005 after the company (now Sony Music Entertainment) secretly installed Extended Copy Protection (XCP) and MediaMax CD-3 software on millions of music discs to keep buyers from burning copies of the CDs via their computers and to inform Sony BMG about what these customers were up to. The software, which proved undetectable by anti-virus and anti-spyware programs, opened the door for other malware to infiltrate Windows PCs unseen as well. (As if the buyers of CDs featuring music from the likes of Celine Dion and Ricky Martin weren’t already being punished enough.)

The Sony rootkit became something of a cultural phenomenon. It wound up as a punch line in comic strips like Fox Trot, it became a custom T-shirt logo and even was the subject of class skits shared on YouTube. Mac fanboys and fangirls smirked on the sidelines.

Security researcher Dan Kaminsky estimated that the Sony rootkit made its mark on hundreds of thousands of networks in dozens of countries – so this wasn’t just a consumer issue, but an enterprise network one as well.

Once Winternals security researcher Mark Russinovich -- who has risen to CTO for Microsoft Azure after Microsoft snapped up Winternals in 2006 -- exposed the rootkit on Halloween of 2005, all hell broke loose.

Sony BMG botched its initial response: "Most people don't even know what a rootkit is, so why should they care about it?" went the infamous quote from Thomas Hesse, then president of Sony BMG's Global Digital Business. The company recalled products, issued and re-issued rootkit removal tools, and settled lawsuits with a number of states, the Federal Trade Commission and the Electronic Frontier Foundation.

Microsoft and security vendors were also chastised for their relative silence and slow response regarding the rootkit and malware threat. In later years, debate emerged over how the term “rootkit” should be defined, and whether intent to maliciously seize control of a user’s system should be at the heart of it.

In looking back at the incident now, the question arises about how such a privacy and security affront would be handled these days by everyone from the government to customers to vendors.

“In theory, the Federal Trade Commission would have more authority to go after [Sony BMG] since the FTC’s use of its section 5 power has been upheld by the courts,” says Scott Bradner, University Technology Security Officer at Harvard. “The FTC could easily see the installation of an undisclosed rootkit as fitting its definition of unfair competitive practices.”

Bill Bonney, principal consulting analyst with new research and consulting firm TechVision Research, says he can’t speak to how the law might protect consumers from a modern-day Sony BMG rootkit, but “with the backlash we have seen for all types of non-transparent ways (spying, exploiting, etc.) companies are dealing with their customers, I think in the court of public opinion the response could be pretty substantial and, as happened recently with the EU acting (theoretically) because of [the NSA’s PRISM program], if the issue is egregious enough there could be legal or regulatory consequences.”

As for how customers might react today, we’ve all seen how quickly people turn to social media to take companies to task for any product or service shortcoming or any business shenanigans. Look no further than Lenovo, which earlier this year got a strong dose of negative customer reaction when it admittedly screwed up by pre-loading Superfish crapware onto laptops. That software injected product recommendations into search results and opened a serious security hole by interfering with SSL-encrypted Web traffic.

In terms of how security vendors now fare at spotting malware or other unsavory software, Schneier says “There’s always been that tension, even now with stuff the NSA and FBI does, about how this stuff is classified. I think [the vendors] are getting better, but they’re still not perfect… It’s hard to know what they still let by.”

Noted tech activist Cory Doctorow, writing for Boing Boing earlier this month, explains that some vendors had their reasons for not exposing the Sony rootkit right away. “Russinovich was not the first researcher to discover the Sony Rootkit, just the first researcher to blow the whistle on it. The other researchers were advised by their lawyers that any report on the rootkit would violate section 1201 of the DMCA, a 1998 law that prohibits removing ‘copyright protection’ software. The gap between discovery and reporting gave the infection a long time to spread.”

Reasons for hope, though, include recent revelations by the likes of Malwarebytes, which warned users that a malicious variety of adware dubbed eFast was hijacking the Chrome browser and replacing it by becoming the default browser associated with common file types like JPEG and HTML.

Schneier says it’s important that some of the more prominent security and anti-virus companies -- from Kaspersky in Russia to F-Secure in Finland to Symantec in the United States to Panda Security in Spain -- are spread across the globe given that shady software practices such as the spread of rootkits are now often the work of governments.

“You have enough government diversity that if you have one company deliberately not finding something, then others will,” says Schneier, who wrote eloquently about the Sony BMG affair for Wired.com back in 2005.

The non-profit Free Software Foundation Europe (FSFE) has been calling attention to the Sony BMG rootkit’s 10th anniversary, urging the masses to “Make some noise and write about this fiasco” involving DRM. The FSFE, seeing DRM as an anti-competitive practice, refers to the words behind the acronym as digital restriction management rather than the more common digital rights management.

Even worse, as the recent scandal involving VW’s emissions test circumvention software shows, businesses are still using secret software to their advantage without necessarily caring about the broader implications.

The object lessons from the Sony BMG scandal are many, and might be of interest to those arguing for building encryption backdoors into products for legitimate purposes -- backdoors that might be turned into exploitable vulnerabilities.

One basic lesson is that you shouldn’t mimic the bad behavior that you’re ostensibly standing against, as Sony BMG did “in at least appearing to violate the licensing terms of the PC manufacturers,” TechVision’s Bonney says.

And yes, there is a warning from the Sony BMG episode “not to weaponize your own products. You are inviting a response,” he says.



Wednesday, 28 October 2015

Exam 70-355 Universal Windows Platform – App Data, Services, and Coding Patterns (beta)

Exam 70-355
Universal Windows Platform – App Data, Services, and Coding Patterns (beta)

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.


Recognize and apply a specified design pattern
Describe the relationship between architecture, design pattern, and framework
Recognize common architectures and when they should be used, recognize common design patterns and when a pattern can be applied to make programming tasks faster and easier

Describe traditional Microsoft .NET design patterns
Describe the Gang of Four design patterns, including creational patterns, structural patterns, and behavioral patterns; describe 3-tier/N-tier patterns; describe enterprise patterns; describe cloud design patterns; describe head first patterns; describe repository patterns; describe unit of work patterns

Apply the Model-View-ViewModel (MVVM) Prism pattern
Separate concerns, develop the views for the MVVM app, develop the view-models for the MVVM app, develop the models for the MVVM app, develop class interactions and data binding for the MVVM app
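The exam itself assumes C#/XAML with Prism, but the separation MVVM enforces is language-agnostic. The deliberately tiny sketch below is a hypothetical Python illustration: the model is plain data, the view-model exposes display state plus change notification, and the "view" is just a subscriber callback standing in for data binding.

# Language-neutral MVVM sketch (illustrative only; the exam context is
# C#/XAML with Prism). Model = plain data, ViewModel = state + change
# notification, View = a subscriber that renders whatever it is told.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Customer:                # Model: data only, no UI knowledge
    name: str


class CustomerViewModel:       # ViewModel: presentation state + notification
    def __init__(self, model: Customer) -> None:
        self._model = model
        self._listeners: List[Callable[[str], None]] = []

    def subscribe(self, listener: Callable[[str], None]) -> None:
        self._listeners.append(listener)

    @property
    def display_name(self) -> str:
        return self._model.name.title()

    def rename(self, new_name: str) -> None:
        self._model.name = new_name
        for listener in self._listeners:   # stand-in for data binding
            listener(self.display_name)


if __name__ == "__main__":
    vm = CustomerViewModel(Customer("ada lovelace"))
    vm.subscribe(lambda text: print("view shows:", text))  # the "View"
    vm.rename("grace hopper")  # prints: view shows: Grace Hopper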

Develop app and business logic, code that interfaces with other line-of-business (LOB) apps, and LOB Server Services (AD, SP)

Develop code for app-specific processes and computations
Create an asynchronous method or process, managing the return value from an asynchronous method, debugging and error handling for an asynchronous method, develop storyboards and custom animations for an object, represent 3-D models as code objects, manage 2-D projections of 3-D objects, use Task, ThreadPool, and background transfers
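The asynchronous items above (create an async method, manage its return value, handle its errors) follow the same shape in any async/await runtime. A hedged, language-neutral sketch using Python's asyncio, purely to illustrate the pattern the exam describes in C# terms:

# Illustrative async sketch: create an asynchronous method, await its
# return value, and handle errors raised inside it. (Python's asyncio is
# used here only to show the general async/await shape.)
import asyncio


async def fetch_score(player: str) -> int:
    """Stand-in for a long-running operation such as a network call."""
    await asyncio.sleep(0.1)                # simulated I/O wait
    if not player:
        raise ValueError("player name required")
    return len(player) * 10


async def main() -> None:
    try:
        score = await fetch_score("alice")  # manage the return value
        print("score:", score)
    except ValueError as exc:               # error handling for the async call
        print("failed:", exc)


if __name__ == "__main__":
    asyncio.run(main())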

Implement background tasks
Create a background task, register a background task, set and respond to triggers, debug a background task, implement a lock screen app, share data/events between an app and its background tasks; directly calling a background task

Manage app lifecycle events
Prepare for suspension, resume from suspension or termination, implement an extended splash screen, extend execution and monitor suspension errors

Implement interactions with other apps
Integrate a share contract to share content with another app, integrate contact and appointment functionality, implement mapping and navigation (geolocation, geofencing, and Bing Maps), exchange data/file between apps, including launch for result; use drag and drop

Implement notifications and Windows Push Notification Services (WNS)
Implement and manage notifications; support Live Tile updates, including toasts and badges, support Action Center and secondary tiles

Implement interactions with devices
Develop code for camera and microphone, including photo, video, and audio; implement screen capture; implement printing and Play To; integrate HoloLens sensors and services; support wireless communication

Develop class libraries (code libraries, DLLs)
Naming assemblies, namespaces, types, and members in class libraries; using static and abstract classes, interfaces, enumerations, structures, and other types; designing and using properties, methods, constructors, fields, events, operators, and parameters; implementing extensibility mechanisms such as subclassing, using events, virtual members, and callbacks; designing, throwing, and catching exceptions

Develop code for implementing secure cloud data services and storage

Design and implement data roaming
Roaming user settings and preferences, roaming app session info
Design and implement a RESTful data solution (oData, JSON)
Using the ASP.NET Web API, implementing JSON serialization, adding a service reference to the project, using Windows.Web.Http.HttpClient objects
Design and implement Azure and cloud data sources
Implement offline data sync, implement caching, support OneDrive integration, implement file access and management (including File Picker and file access APIs), upload images to Azure Storage
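The RESTful data solution item above boils down to serializing and deserializing JSON against a web API endpoint; in the exam's own context that means Windows.Web.Http.HttpClient against an ASP.NET Web API. The sketch below shows the same round trip in Python with the standard library; the endpoint URL is hypothetical.

# Minimal REST/JSON client sketch. The endpoint is hypothetical; the
# UWP equivalent would use Windows.Web.Http.HttpClient in C#.
import json
import urllib.request

BASE_URL = "https://example.com/api/customers"   # hypothetical service


def get_customers():
    """GET a JSON collection and deserialize it."""
    with urllib.request.urlopen(BASE_URL) as response:
        return json.load(response)


def add_customer(name):
    """POST a JSON payload (serialization mirrors the service contract)."""
    payload = json.dumps({"name": name}).encode("utf-8")
    request = urllib.request.Request(
        BASE_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status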

Integrate Azure data services
Call a custom Cloud Service API from a client, schedule backend jobs in Mobile Services
Design and implement removable and embedded local data sources
Support SD card storage, implement SQLite on mobile devices
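For the embedded local data source item above, SQLite is the usual file-backed store. The sketch below uses Python's built-in sqlite3 module simply to show the pattern; on a device the database file would normally live in app-local or SD card storage, and the path here is illustrative.

# Embedded local storage sketch using SQLite (stdlib sqlite3 module).
# The file path is illustrative; on a device it would point at
# app-local or SD card storage.
import sqlite3

DB_PATH = "app_data.sqlite"   # hypothetical local file

with sqlite3.connect(DB_PATH) as conn:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)"
    )
    conn.execute("INSERT INTO notes (body) VALUES (?)", ("offline draft",))
    for row in conn.execute("SELECT id, body FROM notes"):
        print(row)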

Develop code to implement authentication and business security requirements
Implement code to manage authentication and identity
Web authentication broker; Azure authentication; implement code to manage identity; implement biometric identity verification, including Windows Hello; implement Credential Locker, implement single sign-on
Implement code to manage authorization and access to resources
Implement authentication requests; authorize users and apps; manage authorization IDs; restrict access to resources, including data, files, folders, and devices
Implement cryptography within an app
Create cryptographic keys, hash and sign content, create message authentication codes, encrypt and decrypt data
Support enterprise security considerations
Implement security transparency, implement code access security, implement role-based security
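The "Implement cryptography within an app" item above covers hashing, message authentication codes, and symmetric encryption (in UWP these map onto the Windows.Security.Cryptography APIs). A small, hedged Python sketch of those three operations: hashlib, hmac, and secrets are standard library, while Fernet comes from the third-party cryptography package.

# Sketch: hash content, compute a message authentication code, and
# encrypt/decrypt data. Fernet is from the third-party "cryptography"
# package; everything else is standard library.
import hashlib
import hmac
import secrets

from cryptography.fernet import Fernet

message = b"quarterly sales figures"

# Hash the content (integrity fingerprint)
digest = hashlib.sha256(message).hexdigest()

# Message authentication code (integrity + authenticity via a shared key)
mac_key = secrets.token_bytes(32)
mac = hmac.new(mac_key, message, hashlib.sha256).hexdigest()

# Symmetric encrypt and decrypt
enc_key = Fernet.generate_key()
cipher = Fernet(enc_key)
token = cipher.encrypt(message)
assert cipher.decrypt(token) == message

print("sha256:", digest)
print("hmac:  ", mac)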

Integrate cloud services and Azure App Service services
Build native and cross-platform apps using services
Integrate Azure App Service mobile app functionality within an existing mobile app, use a .NET client with Mobile Services, call a custom API from a client
Connect to your enterprise systems using services
Build a service that uses an existing SQL database, connect to an on-premises SQL Server from an Azure mobile service using hybrid connections, scale mobile services backed by Azure SQL database, authenticate your app with Active Directory Authentication Library single sign-on, add role-based access control to mobile services with Azure Active Directory, access Microsoft SharePoint on behalf of the user, schedule backend jobs in mobile services, troubleshoot a mobile services .NET backend
Connect to SaaS APIs using services
Implement single sign-on using credentials from third-party identity providers, build a service that uses MongoDB as a data store
Build offline-ready apps with sync using services
Allow employees to work offline when connectivity is not available, synchronize with your enterprise backend systems when devices come back online, recover in the event of a disaster
Push notifications to users using services
Add push notifications to your app, send push notifications to authenticated users

Develop code that is maintainable and that supports app versioning, compatibility, and coexistence
Develop code using version control (TFVC or Git)
Develop code using a standardized coding convention, implement best practices for assembly versioning
Implement best practices for assemblies and side-by-side execution
Use strong-named assemblies, including version, culture, and publisher; use the GAC to provide version-aware storage; create an app that runs in isolation
Implement best practices for assembly placement and the GAC
Using an app configuration file, using codebases, providing a binding context


Wednesday, 21 October 2015

10 key moments in the history of Apple and Microsoft

Apple and Microsoft recently renewed their alliance with the goal of tackling the enterprise market, but the latest partnership is just the most recent turning point in the two companies' intertwined histories. Here are the defining moments that led up to the new pact.

Apple and Microsoft's history of highs and lows
Apple and Microsoft share a common history and bond in the evolution of personal computing. Relations between the two technology pioneers were generally cordial when they were founded in the 1970s, but that sense of mutual respect quickly turned to discord. The founders of both companies were at loggerheads often in the past. Today their new leaders appear determined to bury the hatchet and partner for greater opportunities in the enterprise.

Youthful innocence of the early '80s
Microsoft was a critical Apple ally during the first Macintosh's development. At an Apple event in 1983, Microsoft CEO Bill Gates told attendees Microsoft expected to earn half of its revenues selling Macintosh software the following year. He called the Macintosh, "something that's really new and really captures people's attention."

Jobs ousted from Apple, forms NeXT
In 1985, Apple cofounder Steve Jobs was ousted from the company he had started nine years earlier. He immediately sold all but one share in Apple to fund the launch of NeXT, where he would spend the next 12 years building computer workstations for higher education and business.

Jobs says Microsoft has 'no taste'
"The only problem with Microsoft is they just have no taste," Jobs said in the 1996 "Triumph of the Nerds" TV documentary. "They have absolutely no taste. And I don't mean that in a small way, I mean that in a big way, in the sense that they don't think of original ideas, and they don't bring much culture into their products."

Jobs returns to Apple, partners with Microsoft
When Apple acquired NeXT in 1997 and brought Steve Jobs back into the fold, the company was in disarray amid growing uncertainty about the future of Microsoft Office for Mac. During his keynote address at the Macworld Expo that year, Jobs extolled the virtues of partnering with industry leaders and spoke of the need to improve Apple's partner relations.

Gates addresses the Apple faithful in 1997
"Microsoft is going to be part of the game with us as we restore this company back to health," Jobs said at Macworld, before asking Gates to address the crowd via satellite.

"We think Apple makes a huge contribution to the computer industry," Gates said. "We think it's going to be a lot of fun helping out."

Gates and Jobs take the stage together in 2007
A seminal moment occurred between the leaders of both companies when Gates and Jobs jointly took the stage for an interview at the D5 conference. Both men praised each other in their own ways. Jobs commended Gates for building the first software company in the world, but Gates was more flattering. "What Steve's done is quite phenomenal," he said.

'Memories longer than the road ahead'
When Jobs was asked to describe the greatest misunderstanding of his relationship with Gates, he said: "I think of most things in life as either a Bob Dylan or a Beatles song, but there's that one line in that one Beatles song — 'You and I have memories longer than the road that stretches out ahead' — and that's clearly very true here."

Apple invites Microsoft exec on stage for iPad demo
A new era of partnership buoyed by opportunities in the enterprise began to blossom in the early-2010s. At Apple's September 2015 new product event in San Francisco, the company invited Kirk Koenigsbauer, vice president of Microsoft Office, on stage to demonstrate Office 365 apps working in split-screen mode on an iPad Pro.

Microsoft CEO uses iPhone at Dreamforce
At Salesforce's 2015 Dreamforce conference, Microsoft CEO Satya Nadella demoed the company's iOS apps on an iPhone. When Nadella did the once unthinkable, using an iPhone on stage, he acknowledged it as such but also made clear that it wasn't his phone. "It is a pretty unique iPhone," he said. "I like to call it the iPhone Pro because it has all the Microsoft software and applications on it … It's pretty amazing."

Apple CEO Tim Cook doesn't hold a grudge
During a keynote at cloud-storage company Box's BoxWorks conference in September 2015, when asked about the company's renewed relationship with Microsoft, Apple CEO Tim Cook said he doesn't believe in holding grudges. "If you think back in time, Apple and IBM were foes. Apple and Microsoft were foes," Cook said. "Apple and Microsoft still compete today, but frankly Apple and Microsoft can partner on more things than we could compete on, and that's what the customer wants."


Sunday, 11 October 2015

5 ways to shore up security in your BYOD strategy

You’d think after all this time that organizations would have finally gotten BYOD programs pretty much down pat. Don’t bet on it.

A recent study by tyntec reveals that a vast majority of organizations still have inadequate bring-your-own-device (BYOD) policies. That’s not very encouraging, considering that 49 percent of workers now use a personal mobile device for work-related tasks and spend a great deal of their working time on those devices.

Further, the typical U.S. worker now expects to have nothing less than total access – anywhere, anytime, from any device – to their employer’s networks, finds another study from Dell and Intel. But despite all this demand on the user side, many organizations still wrestle with security, privacy and support issues around BYOD. That is holding many employers back when it comes to giving BYOD an enthusiastic ‘thumbs up’.

So what does it take to get BYOD right in 2015? CSO put that question to a few IT leaders, whose collective responses reflect the still wide divide on how BYOD is supported at the IT executive level, possibly depending on the industry in which they work.

An undeniable force

The higher education sector has embraced BYOD probably as much as any. No surprise here, really. College and university culture is all about openness – of ideas, of expression, and of access to resources. So it is only natural that today’s campus environment is awash with personal devices.

The University of Tennessee at Chattanooga is a prime example. According to Thomas Hoover, associate vice chancellor and CIO, and Susan Lazenby, manager of strategic planning and communication, BYOD has taken the campus by storm.

The two shared the school’s experiences with BYOD by stressing the impact it has had on the school’s IT organization, including staff and budget. But they confirmed that BYOD was a trend not to be denied, and the university had no choice but to adopt it. They also noted that a robust BYOD program is not just demanded by students, but also by faculty and employees.

To illustrate how rapidly BYOD caught on at UT, the two noted that five years ago the school’s network was supporting 809 devices. That number rose to 14,906 in 2014. This year it jumped to approximately 48,000.

It’s a similar tale hundreds of miles away at Worcester State University in Massachusetts.

“Like any other institute in higher education, Worcester State doesn’t have any choice but to support BYOD,” notes Anthony (Tony) Adade, CIO at the university. “The students come from diverse backgrounds. They come with all kinds of devices. For several years we’ve been seeing an influx of games on our campus – all kinds of games. Besides the normal devices that we have to deal with, we didn’t have any choice but to support them.”

Like at the University of Tennessee, wide-scale BYOD has been a fairly new phenomenon at Worcester State, but demand quickly made up for lost time.

“Initially it was limited. The network itself was at capacity and was not able to handle the devices coming on campus,” Adade explains. “We had to tell some students that they can’t bring devices on campus or if they did they were on their own. However, later on we realized it would be in our strategic interest to have a plan and to address the issue. Now we can safely accommodate almost every device.”

Colleges and universities aren’t the only organizations that have felt compelled to adopt BYOD programs, of course. Countless companies and nonprofits are also supporting programs, and have learned some important lessons in how to do it right.

“It is important to have technology in-house to support BYOD strategy,” notes Christine Vanderpool, CIO at Molson Coors, one of the nation’s leading brewers. “Companies should invest in tools like MDM, DLP and application monitoring (tools that inform the user of malicious applications on their devices). You need staff to support these tools. You need a strong set of policies, procedures and end user education.”

“It is good to focus on the ‘what’s in it for them’ in most cases,” Vanderpool stresses. “If you deploy MDM or application controls, you have to explain how this is protecting them in their daily life and not just in their work life.”

What are the most important elements of an effective BYOD program, in terms of both providing employee flexibility and productivity and ensuring company data and network security? Molson Coors CIO Christine Vanderpool offers the following advice on what should be considered:

“Give real life examples like how some malicious apps can take control/read all the user’s SMS text messages, see password information entered into a bank app, etc. People care most when they can understand it and it can potentially impact their lives beyond just their job,” Vanderpool says.

Not everyone’s a believer

But many CIOs remain skeptics when it comes to supporting BYOD, fearing that the probable risks still outweigh the possible benefits. One of them is Jim Motes, vice president and CIO at Rockwell Automation.

“I'm not really a fan of BYOD phones,” Motes says. “I believe the privacy constraints will be at odds with protecting and controlling corporate intellectual property.”

“The smartphone is not just communication technology, it's a social lifeline, diary, and entertainment system,” Motes continues. “People have too much personal information stored on these systems and should be very careful about how much access they want to give their employers. Employers should avoid them completely to limit their liability should that personal information be breached and exposed.”

So how does an organization resolve these two competing forces: security and privacy concerns on one hand, versus user demand for convenience on the other?

Our sources offered the following combined tips on how to get BYOD right:

Have a thoughtful strategy
As noted, security remains a top concern for IT leaders when it comes to BYOD. It is therefore important to involve the IT security team in establishing a program from the outset. But the CSO’s role should be to help find a solution, not to find reasons not to support one. The focus should be on how best to secure the data first and foremost, and then the devices.

Take stock of the situation
Once you’ve set your strategy, begin with assessments of the network capacity and the security status. Issues to consider include: How vulnerable is the network? Who is connecting to it? What devices and applications are they using?

Have a clear set of policies and expectations
You need a set of policy guidelines on what is allowed and what is not, to guide the behavior of employees and other users. Policies should be simple and easy to understand. Toward that end, have your employees help draft the policies to get their understanding and support up-front.

Some devices are a ‘go’ and some are a ‘no’
Identify the devices you won’t be able to support; the program probably can’t be all things to all employees. Create an approved list of devices that IT will support, provided the employee has a valid business reason for using them. Purchase the devices at a reduced cost for employees, and put the necessary safeguards on those devices. Let employees know up front to what degree you will support a particular device purchase.

Proper training is critical
Educate employees on how to connect their devices to the network, and also on the dos and don’ts of their usage. Lunchtime training sessions are a smart idea. Stress what it is that employees are agreeing to, including what happens if a device is lost or stolen – the wiping of the device. Most employees will say yes; those who don’t simply can’t participate in the program.

Finally, “BYOD risks and considerations will continue to grow and change just as rapidly as the technologies change,” stresses Vanderpool. “It is vital that all aspects of the BYOD model be continuously reviewed, updated, re-communicated and employees re-educated. The model deployed and the supporting guidelines, policies and procedures implemented to support it must be agile and allow the company to be able to quickly adapt or change them when necessary.”


Tuesday, 29 September 2015

Five steps to optimize your firewall configuration

95% of all firewall breaches are caused by misconfiguration. Here's how to address the core problems

Firewalls are an essential part of network security, yet Gartner says 95% of all firewall breaches are caused by misconfiguration. In my work I come across many firewall configuration mistakes, most of which are easily avoidable. Here are five simple steps that can help you optimize your settings:

* Set specific policy configurations with minimum privilege. Firewalls are often installed with broad filtering policies, allowing traffic from any source to any destination. This happens because the Network Operations team doesn’t know exactly what is needed at the outset, so it starts with a broad rule and intends to work backwards. In reality, due to time pressures or simply because it isn’t treated as a priority, the team never gets around to tightening the firewall policies, leaving the network in a perpetually exposed state.

You should follow the principle of least privilege – that is, give the minimum level of privilege the user or service needs to function normally, thereby limiting the potential damage caused by a breach. You should also document properly – ideally mapping out the flows that your applications actually require before granting access. It’s also a good idea to regularly revisit your firewall policies to look at application usage trends and identify new applications being used on the network and what connectivity they actually require.
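
To make the least-privilege idea concrete, here is a small, hedged sketch that flags any-to-any "allow" rules in an exported policy. The rule format and field names are hypothetical placeholders; adapt them to whatever your firewall's policy export actually produces.

# Least-privilege sanity check: flag firewall rules that allow traffic with
# the source, destination or port left wide open. The rule structure below is
# a simplified, made-up export format.
ANY = "any"

rules = [
    {"name": "allow-web", "src": "10.0.0.0/8", "dst": "dmz-web", "port": "443", "action": "allow"},
    {"name": "temp-rule", "src": ANY, "dst": ANY, "port": ANY, "action": "allow"},
]

def overly_broad(rule):
    """True if the rule allows traffic and leaves src, dst or port unrestricted."""
    return rule["action"] == "allow" and ANY in (rule["src"], rule["dst"], rule["port"])

for rule in rules:
    if overly_broad(rule):
        print(f"REVIEW: rule '{rule['name']}' violates least privilege")

Even a check this simple tends to catch the "temporary" any-any rules that outlive the projects that created them.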

* Only run required services. All too often I find companies running firewall services that they either don’t need or no longer use, such as dynamic routing, which as a best practice should not be enabled on security devices, or “rogue” DHCP servers distributing IP addresses on the network, which can lead to availability issues caused by IP conflicts. It’s also surprising how many devices are still managed using unencrypted protocols like Telnet, despite the protocol being over 30 years old.

The solution is to harden devices and ensure that configurations are compliant before devices are promoted into production environments. This is something a lot of organizations struggle with. By configuring your devices based on the function that you actually want them to fulfil and following the principle of least privileged access – before deployment – you will improve security and reduce the chances of accidentally leaving a risky service running on your firewall.
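
As a rough illustration of that pre-deployment check, the sketch below probes a device for management services that a hardened configuration should not expose. The addresses and the "risky ports" list are assumptions; extend them to match your own hardening baseline.

# Pre-deployment hardening probe: warn if a device answers on ports that a
# hardened security appliance shouldn't expose. Ports and hosts are examples.
import socket

RISKY_PORTS = {23: "telnet (unencrypted management)", 80: "http (unencrypted management UI)"}

def exposed_services(host, timeout=1.0):
    findings = []
    for port, label in RISKY_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connection was accepted
                findings.append(f"{host}:{port} open -> {label}")
    return findings

for device in ["192.0.2.10", "192.0.2.11"]:  # documentation-range addresses; replace with real devices
    for finding in exposed_services(device):
        print(finding)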

* Standardize authentication mechanisms. During my work, I often find organizations whose routers don’t follow the enterprise standard for authentication. One example I encountered was a large bank that had all the devices in its primary data centers controlled by a central authentication mechanism but did not use the same mechanism at its remote offices. Because corporate authentication standards weren’t enforced there, staff in a remote branch could log in to local accounts with weak passwords, under a different limit on login failures before account lockout.

This scenario reduces security and creates more opportunities for attackers, as it’s easier for them to access the corporate network via the remote office. Enterprises should therefore ensure that any remote offices they have follow the same central authentication mechanism as the rest of the company.
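
One lightweight way to catch that kind of drift is to audit saved device configurations for signs of local-only authentication, along the lines of the sketch below. The matched strings are IOS-style markers and the configs/ directory is a placeholder; match whatever your platform's configuration syntax and backup location actually are.

# Authentication-standards audit sketch: flag device configs that never
# reference the central AAA servers. Markers and paths are illustrative.
import glob

CENTRAL_AAA_MARKERS = ("aaa new-model", "tacacs", "radius")

def uses_central_auth(config_text):
    lowered = config_text.lower()
    return any(marker in lowered for marker in CENTRAL_AAA_MARKERS)

for path in glob.glob("configs/*.cfg"):
    with open(path) as f:
        text = f.read()
    if not uses_central_auth(text):
        print(f"NON-COMPLIANT: {path} has no central authentication configured")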

* Use the right security controls for test data. Organizations tend to have good governance stating that test systems should not connect to production systems and collect production data, but this is often not enforced because the people who are working in testing see production data as the most accurate way to test. However, when you allow test systems to collect data from production, you’re likely to be bringing that data down into an environment with a lower level of security. That data could be highly sensitive, and it could also be subject to regulatory compliance. So if you do use production data in a test environment, make sure that you use the correct security controls required by the classification the data falls into.
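
As one example of such a control, the sketch below irreversibly pseudonymizes a sensitive field before a production record is handed to a test system. The field names and salt handling are purely illustrative; the right masking technique depends on the classification the data falls into.

# Pseudonymize sensitive fields before production data reaches a test system.
# Keeps non-identifying fields intact so tests stay realistic.
import hashlib

SALT = b"rotate-me-and-keep-me-out-of-source-control"

def pseudonymize(value: str) -> str:
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

record = {"customer_id": 1042, "email": "jane@example.com", "balance": 130.25}
test_record = {**record, "email": pseudonymize(record["email"])}
print(test_record)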

* Always log security outputs. While logging properly can be expensive, the costs of being breached, or of being unable to trace an attack, are far higher. Failing to store the log output from your security devices, or not doing so with enough granularity, is one of the worst things you can do for network security: not only will you not be alerted when you’re under attack, you’ll also have little or no traceability when carrying out a post-breach investigation. By ensuring that all outputs from security devices are logged correctly, organizations will save time and money further down the line and enhance security by being able to properly monitor what is happening on their networks.
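
As a small illustration of the principle, the sketch below ships security events to a central syslog collector using Python's standard logging module. The collector address is an assumption, and in practice most shops would configure this at the device or SIEM level rather than in application code.

# Send security-relevant events to a central syslog collector so they survive
# a compromise of the originating host. Collector address is a placeholder.
import logging
import logging.handlers

logger = logging.getLogger("security")
logger.setLevel(logging.INFO)
handler = logging.handlers.SysLogHandler(address=("logs.example.internal", 514))
handler.setFormatter(logging.Formatter("%(name)s: %(levelname)s %(message)s"))
logger.addHandler(handler)

logger.warning("firewall rule 'temp-rule' matched 10,000 denied packets in 60s")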

Enterprises need to continuously monitor the state of their firewall security, but by following these simple steps businesses can avoid some of the core misconfigurations and improve their overall security posture.


Sunday, 13 September 2015

Get ready to live in a trillion-device world

A swarm of sensors will let us control our environment with words or even thoughts

In just 10 years, we may live in a world where there are sensors in the walls of our houses, in our clothes and even in our brains.

Forget thinking about the Internet of Things where your coffee maker and refrigerator are connected. By 2025, we could very well live in a trillion-device world.

That's the prediction from Alberto Sangiovanni-Vincentelli, a professor of electrical engineering and computer science at the University of California at Berkeley.

"Smartness can be embedded everywhere," said Sangiovanni-Vincentelli. "The entire environment is going to be full of sensors of all kinds. Chemical sensors, cameras and microphones of all types and shapes. Sensors will check the quality of the air and temperatures. Microphones around your environment will listen to you giving commands."

This is going to be a world where connected devices and sensors are all around us -- even inside us, Sangiovanni-Vincentelli said in an interview with Computerworld during DARPA's Wait, what? Forum on future technology in St. Louis this week.

"It's actually exciting," he said. "In the next 10 years, it's going to be tremendous."

According to the Berkeley professor and researcher, we won't have just smartphones.

We'll have a swarm of sensors that are intelligent and interconnected.

Most everything in our environment -- from clothing to furniture and our very homes -- could be smart. Sensors could be mixed with paint and spread onto our walls.

We'll just speak out loud and information will instantly be given to us without having to do an online search, phone calls can be made or a robot could start to clean or make dinner.

And with sensors implanted in our brains, we wouldn't even need to speak out loud to interact with our smart environment.

Want something? Just think about it.

"The brain-machine interface will have sensors placed in our brains, collecting information about what we think and transmitting it to this complex world that is surrounding us," said Sangiovanni-Vincentelli. "I think I'd like to have an espresso and then here comes a nice little robot with a steaming espresso because I thought about it."

Pam Melroy, deputy director of DARPA's Tactical Technology Office, said the Berkeley professor isn't just dreaming.

"I do think there's something to that" scenario, said Melroy, who is a retired U.S. Air Force officer and former NASA astronaut. "At the very least, we should be preparing for it and thinking of what is needed. We get into very bad places when technology outstrips our planning and thinking. I'd rather worry about that and prepare for it even if it takes 20 years to come true, than just letting it evolve in a messy way."

While having a trillion-device life could happen in as little as 10 years, Sangiovanni-Vincentelli said there's a lot of work to be done to get there.

First, we simply don't have the network we'd need to support this many connected devices.

We would need communication protocols that consume very small amounts of energy and can transmit fluctuating amounts of information, the professor explained. Businesses would need to build massive numbers of tiny, inexpensive sensors. We'll need more and better security to fend off hacks to our clothing, walls and brains.

And the cloud will have to be grown out to handle all of the data that these trillion devices will create.

"Once you have the technology enabling all of this, we should be there in 10 years," said Sangiovanni-Vincentelli.

With all of these devices, many people will be anxious about what this means for personal privacy.

Sangiovanni-Vincentelli won't be one of them, though.

"Lack of privacy is not an issue," he said. "We've already lost it all... If the government wants me now, they have me. Everything is already recorded somewhere. What else is there to lose?"

Melroy also is more excited than nervous about this increasingly digital future.

"As a technologist, I don't fear technology," she said. "I think having ways that make us healthier and more efficient are a good thing... There is social evolution that happens with technological evolution. We once were worried about the camera and the privacy implications of taking pictures of people. The challenge is to make the pace of change match the social evolution."



Monday, 31 August 2015

The 15 biggest enterprise ‘unicorns’

The Wall Street Journal found 115 companies valued at more than $1 billion; these are the 15 biggest in enterprise tech

Not so long ago, there were only a few unicorns in the world of startups.

This week, though, the Wall Street Journal and Dow Jones VentureSource identified 115 companies with valuations north of $1 billion, which are referred to as unicorns.

Below are 15 of the highest valued enterprise software companies that have received venture funding but have not yet been sold or gone public.

Palantir
Valuation: $20 billion
Funding: $1.5 billion

What it does: Palantir has created a program that’s really good at finding relationships across vast amounts of data, otherwise known as link analysis software. Its meteoric rise has been fueled by big-money contracts with federal government agencies. Palantir is the second-largest unicorn, behind Uber, that The Wall Street Journal identified.

Dropbox
Valuation: $10 billion
Funding: $607 million

What it does: One of the pioneers of the cloud market, Dropbox’s file sync-and-share service has been a hit with consumers, and increasingly with businesses too. Chief competitor Box would have been a unicorn, but the company went public this year.

Zenefits
Valuation: $4.5 billion
Total funding: $596 million

What it does: Zenefits provides a cloud-based human resource management (HRM) system for small and midsized businesses, with an emphasis on helping businesses manage health insurance administration and costs.

Cloudera
Valuation: $4.1 billion
Total funding: $670 million

What it does: Cloudera provides a distribution of Hadoop. Its chief competitor in the big data/Hadoop market, Hortonworks, filed for an initial public offering earlier this year after being a unicorn itself.

Pure Storage
Valuation: $3 billion
Funding: $530 million

What it does: Pure Storage is one of the most popular startups in the solid-state, flash-storage market. It pitches its combined hardware-and-software product as a more affordable competitor to storage giant EMC.

Docusign
Valuation: $3 billion
Funding: $515 million

What it does: Docusign lets users electronically sign and file paperwork.

Slack
Valuation: $2.8 billion
Funding: $315 million

What it does: Slack is an enterprise communication and collaboration platform, allowing users to text and video chat, plus share documents too.

Nutanix
Valuation: $2 billion
Funding: $312 million

What it does: Nutanix is one of the startups in the hyperconverged infrastructure market, providing customers an all-in-one system that includes virtualized compute, network and storage hardware, controlled by custom software. Converged systems are seen as building blocks for distributed systems because of their ability to optimize performance, particularly on the storage side.

Domo
Valuation: $2 billion
Funding: $459 million

What it does: Founded by Josh James (who sold his previous startup, Omniture, to Adobe for $1.8 billion), this Utah-based company provides cloud-hosted business intelligence software tailored for business executives. The idea is to give C-level executives ready access to the data they need to run their companies, in a user-friendly format accessible on any device.

GitHub
Valuation: $2 billion
Funding: $350 million

What it does: GitHub is a hosting platform for the code repositories behind software projects, best known as the home of open source. Repositories can be public or private, and the platform lets users track bugs, usage and downloads. If you use an open source project, it’s likely hosted on GitHub.

Tanium
Valuation: $1.8 billion
Funding: $142 million

What it does: Tanium is a platform for identifying and remedying application outages or security threats in real time. One of its biggest differentiating features is an intuitive search bar that lets users query, in natural language, the status of the systems they’re monitoring for a variety of issues.

MongoDB
Valuation: $1.6 billion
Funding: $311 million

What it does: MongoDB is one of the most popular NoSQL databases. This new breed of database is ideal for managing unstructured data, such as social media streams, documents and other complex data that doesn’t fit well into traditional structured databases.
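
For readers unfamiliar with the document model, here is a tiny, hedged illustration (assuming a local MongoDB instance and the pymongo driver; the database and field names are made up) of how records with different shapes can live in the same collection without a schema change.

# Two differently shaped documents coexist in one collection - no ALTER TABLE.
from pymongo import MongoClient

posts = MongoClient("mongodb://localhost:27017")["demo"]["social_posts"]
posts.insert_one({"user": "alice", "text": "hello", "hashtags": ["intro"]})
posts.insert_one({"user": "bob", "video_url": "https://example.com/v/1", "duration_s": 42})
print(posts.find_one({"user": "bob"}))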

InsideSales.com
Valuation: $1.5 billion
Funding: $199 million

What it does: InsideSales.com is a big data platform that analyzes business relationships with customers and provides predictive analytics for future sales strategy.

Mulesoft
Valuation: $1.5 billion
Funding: $259 million

What it does: Mulesoft is the commercial product for the open source Mule software, an enterprise service bus that helps integrate and coordinate data across applications. Having a common data set that multiple applications can use reduces duplication and cost.

Jasper Technologies
Valuation: $1.4 billion
Funding: $204 million

What it does: Jasper Technologies builds a platform for the budding Internet of Things, allowing the data generated by connected machines to be stored and analyzed in the company’s software.


Wednesday, 19 August 2015

How to uncover the Dark Web

Cybercriminals love the Dark Web because it is almost impossible to track or identify them.

One of the best ways to understand your enemy – what he’s up to, what his capabilities are and how he can damage you – is to spy on him.

And according to some cybercrime experts, one of the easier and more effective ways to do that is to hang out where the bad guys do – on the Dark Web.

In a recent post on Dark Reading, Jason Polancich, founder and chief architect of SurfWatch Labs, asserted that “most businesses already have all the tools on hand for starting a low-cost, high-return Dark Web intelligence operation within their own existing IT and cybersecurity teams.”

Such a data mining operation, he wrote, could be up and running in a day.

It is widely known in IT circles that the Dark Web is a thriving cybercrime marketplace offering multiple exploits, hacking for hire, stolen personal data and intellectual property, spam and phishing campaigns, insider threats for hire and more.

It is also a relatively secure place for criminals to operate, thanks to randomness, anonymity and encryption.

But just because it is difficult to track criminals individually doesn’t mean it is impossible to conduct surveillance on what they are doing. Polancich wrote that the Dark Web is the place to, “find out what may have been stolen or used against you and improve your overall security posture to close the infiltration hole.”

Is it really that easy?
According to Kevin McAleavey, cofounder of the KNOS Project and a malware expert, “easy” may not be the right word. But “possible” definitely is.

“Can anyone do it? You bet,” he said, “but only if you're willing to pay people to sit around and just surf. Most managers consider that ‘wasting time’ and it's often frowned upon, but it works really well.”
"Can anyone do it? You bet, but only if you're willing to pay people to sit around and just surf."

He said that was one of the things he did in a previous job – “follow the bad guys back to their cave so I could see what they were working on before they released it. But it was one of the most time-consuming parts of being ahead of the curve rather than under it.”

Nicholas Albright, principal researcher at ThreatStream, agrees. “These networks seem obscure to many, but with a simple tutorial, anyone could be up and running in less time than it takes to watch an episode of ‘Mr. Robot’,” he said.

“The hardest part of monitoring is really learning where to look. Many of the sites on these obscure networks move locations or go offline periodically. However, once an individual has identified a handful of sites, they frequently lead to others.”

He also agrees with McAleavey that it is labor-intensive, and does not always yield useful intelligence. On the “slow” days, “you might not see anything of value,” he said. “Furthermore, this requires an analyst's fingers on keyboard. Deploying a 'tool' to do this job is not effective. Scraper bots are detected and regularly purged.”
"Nothing can replace direct monitoring of your own networks and assets."

Others are a bit more dubious about the average IT department doing effective Dark Web surveillance, even if the budget is there. “The task of collecting raw information itself is non-trivial,” said Dr. Fengmin Gong, cofounder and chief strategy officer at Cyphort. “And distilling the threat intelligence from the raw data is not any easier. So while it is beneficial to do it, it's not a task that can be undertaken by an average IT department effectively.”

That, he said, is because the average IT worker doesn’t have the expertise to do it, “and it’s not easy to get up to speed. It requires understanding of threats and data mining, which is a high hurdle.”

Fred Touchette, security analyst at AppRiver, is less dubious, but said the deeper the analysis goes, the more expertise is required.

“Initial high-level research should be easily executed by any research team that knows its way around implementing Tor (The Onion Router),” he said. “Once one gets a basic understanding of how Tor is implemented and how to use it, the Dark Web is nearly as easy to navigate, albeit much slower than the regular internet.”
"Once one gets a basic understanding of how Tor is implemented and how to use it, the Dark Web is nearly as easy to navigate, albeit much slower than the regular internet."

“And once research goes beyond passive and into trying to find and possibly purchase samples, things could get pricey,” he said. “Depending on the merchant, sometimes free samples can be obtained, but not always. From here, the same tools and expertise would be required to analyze samples.”

Easy or difficult, most experts agree that enterprises monitoring the Dark Web for threat intelligence is not yet mainstream. “I am aware of technology researchers and developers proposing this as a complementary means to security threat monitoring, but it's not very common as an initiative taken by enterprises themselves,” Gong said.

That may change, however, as more tools become available to make surfing the Dark Web easier.

Juha Nurmi, writing on the Tor Blog, said he has been working since 2010 on developing Ahmia, an open-source search engine for Tor hidden service websites.

And Eric Michaud, founder and CEO of Rift Recon, is also CEO and cofounder of DarkSum, which launched just last week and is promoting a search engine that it calls “Google for the Dark Net.”

Michaud agrees with Gong that effective surveillance of the Dark Net would be beyond the capability of most organizations smaller than the Fortune 100. But he said that with a search engine like DarkSum indexing the Dark Net, they can do it. “We make it easy,” he said.

McAleavey said he has already done it. “All it really takes is setting up a couple of machines to crawl the Tor network with a dictionary list of interesting keywords to match up with, and then let it rip,” he said.

“Once the results have been put into the database of what was found and where, human analysts can then fire up a Tor browser and check out what the crawler found. The more keywords you have, the more results you'll get, and the more people you have to rifle through it all, the better the chances of finding the needles in that haystack.”
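
For the curious, here is a hedged, miniature sketch of that approach: fetch a seed list of hidden-service pages through a local Tor SOCKS proxy and note which watch keywords appear on each. It assumes a tor daemon listening on 127.0.0.1:9050 and the requests library with SOCKS support (pip install requests[socks]); the seed addresses and keywords are placeholders, and a real crawler would also need scheduling, politeness delays and re-discovery of moved sites.

# Keyword-driven Tor crawl sketch: fetch each seed page via the Tor SOCKS
# proxy and report which watch-list keywords it contains.
import requests

TOR_PROXY = {"http": "socks5h://127.0.0.1:9050", "https": "socks5h://127.0.0.1:9050"}
SEED_URLS = ["http://exampleonionaddressxxxxxxxxxx.onion/forum"]  # placeholder addresses
KEYWORDS = ["acme corp", "vpn credentials", "employee database"]   # your watch list

def scan(url):
    try:
        page = requests.get(url, proxies=TOR_PROXY, timeout=60).text.lower()
    except requests.RequestException as err:
        return url, f"unreachable ({err.__class__.__name__})"
    hits = [kw for kw in KEYWORDS if kw in page]
    return url, hits or "no keyword matches"

for url in SEED_URLS:
    print(scan(url))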

Of course, indexing the Dark Web is not static. As McAleavey notes, sites on the Tor network, “often change their address every few hours or every few days, so you need to crawl again looking for those sites of interest because they probably moved since the last time you crawled.”

Michaud agreed, but said it is possible to keep up with address changes. While he wouldn’t discuss the techniques his company uses to do it, “we do it really well,” he said.

Whether it is worth the time and expense to conduct Dark Web surveillance is also a matter of debate. Gong contends that while it is helpful as a “layer” of security, it is not easy to do well. “It requires both sophisticated infrastructure and technical skills that are not trivial to establish,” he said, adding that, “it is not very crucial or affordable for an enterprise IT to pull off by itself.”

And he believes there is, “nothing that can replace direct monitoring of your own networks and assets.”

But Michaud said as it becomes easier and cheaper, it will be a necessary part of a security operation. “Enterprises are scared,” he said, “because they know they will be held responsible for data breaches if they aren’t proactive.

“If you’re just being defensive, you’re going to have a bad day.”




Friday, 14 August 2015

Why SharePoint is the last great on-premises application

While it seems like almost every piece of IT is moving to cloud these days, there are still plenty of reasons to keep SharePoint in your server room – where it belongs.

At the Worldwide Partner Conference (WPC) last month in Orlando, we heard many of the same grumblings we’ve been hearing about Microsoft for years now: They don’t care about on-premises servers. They’re leaving IT administrators in the dust and hanging them out to dry while forcing Azure and Office 365 content on everyone. They’re ignoring the small and medium business.

It’s hard to ignore this trend. It’s also true that the cost-to-benefit calculus keeps tilting toward the cloud, to the point where common sense favors moving many workloads there and transforming capex and personnel expense into opex that scales up and down very easily.

But SharePoint Server is such a sticky product with tentacles everywhere in the enterprise that it may well be the last great on-premises application. Let’s explore why.

The cloud simply means someone else’s computer

One clear reason is that SharePoint, for so many organizations, hosts a large treasure trove of content, from innocuous memos and agendas for weekly staff meetings to confidential merger and acquisition documents. In most organizations, human resources uses SharePoint to store employee compensation analysis data and spreadsheets; executives collaborate with their senior leadership teams and with high-level contacts outside the organization on deals that are proprietary and must be secured at all times; and product planning and management groups store product plans, progress reports and even backups of source code within SharePoint sites and document libraries.

No matter how secure Microsoft or any other cloud provider claims it can make its hosted instances of SharePoint, there will always be that nagging feeling in the back of a paranoid administrator’s head: Our data now lives somewhere that is outside of my direct control. It’s an unavoidable truth, and from a security point of view, the cloud is just a fancy term for someone else’s computer.

Not even Microsoft claims that every piece of data in every client tenant within SharePoint Online is encrypted. Custom Office 365 offerings with dedicated instances for your company can be made to be encrypted, and governmental cloud offerings are encrypted by default, but a standard E3 or E4 plan may or may not be encrypted. Microsoft says it is working on secure defaults, but obviously this is a big task to deploy over the millions of servers they run.

Nothing is going to stop the FBI, the Department of Justice, the National Security Agency or any other government agency in any jurisdiction from applying for and obtaining a subpoena to grab the physical host that stores your data and walk it right out of Microsoft’s data center into impound and seizure. Who knows when you would get it back? Microsoft famously does not offer a regular backup service for SharePoint, relying instead on mirror images and duplicate copies for fault tolerance, and it’s unclear how successful you’d be at operating on a copy of your data, or how long it would take to replicate that data into a new, usable instance in the event of a seizure.

Worse, you might not even know that the government is watching or taking your data from SharePoint Online. While Microsoft claims that if possible they’ll redirect government requests back to you for fulfillment, the feds may not let them, and then Microsoft may be forced to turn over a copy of your data without your knowledge. They may get a wiretap as well. And if the NSA has compromised the data flowing in and out of their datacenters with or without Microsoft’s knowledge, then it’s game over for the integrity of your data’s security posture.

It’s tough for many – perhaps even most – Fortune 500 companies to really get their heads around this idea. And while Microsoft touts the idea of a hybrid deployment, it’s difficult, not inexpensive, and (at least until SharePoint 2016 is released) a bit kludgy as well. On top of that, wholesale migration of all of your content to the cloud could take weeks and require investment in special tools, increased network bandwidth and more. All of these reasons validate SharePoint remaining on premises for most organizations that are already using it.

It’s (sort of) an application development platform

Some companies have taken advantage of SharePoint’s application programming interfaces, containers, workflows and other technologies to build in-house applications on top of the document and content management features. Making those systems work on top of Office 365 and SharePoint Online can be a very difficult beast to tame. With the on-premises version of SharePoint, everyone has access to the underlying environment and can tweak and test it. Office 365 requires licenses and federated identities, and doesn’t offer access to IIS or SharePoint’s application management features.
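
To give a flavor of what that in-house access looks like, here is a hedged sketch that reads a few items from a list through SharePoint’s on-premises REST endpoint. The site URL, list title and credentials are placeholders, and it assumes NTLM authentication via the requests-ntlm package; your farm’s authentication setup may well differ.

# Read items from an on-premises SharePoint list via the REST API.
# URL, list name and credentials below are placeholders.
import requests
from requests_ntlm import HttpNtlmAuth

SITE = "https://sharepoint.example.internal/sites/engineering"
auth = HttpNtlmAuth("EXAMPLE\\svc_reporting", "change-me")
headers = {"Accept": "application/json;odata=verbose"}

resp = requests.get(
    f"{SITE}/_api/web/lists/getbytitle('Project Documents')/items?$top=5",
    auth=auth, headers=headers, timeout=30,
)
resp.raise_for_status()
for item in resp.json()["d"]["results"]:
    print(item.get("Title"))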

On top of that, a pure cloud or even a hybrid option still may not be any less expensive than using portions of the resources and hardware your company already has – another reason why SharePoint is one of the last remaining applications that will make sense to run on premises for a long time to come.

It’s a choice with less obvious benefits – there is lower-hanging fruit

Email is still the slam dunk of cloud applications. Your organization derives no competitive advantage, no killer differentiation in the marketplace, from running a business email server like Microsoft Exchange. It is simply a cost center – no one is building applications on top of email, and no one is improving or innovating on email in a way that would justify keeping that workload in your own datacenter. Secure email solutions now exist that encrypt transmissions and message stores both in transit and at rest, so security in the email space is much more mature than, say, hosted SharePoint. No wonder Exchange Online is taking off.

SharePoint is not as clear a case. While you might choose to put your extranet on SharePoint Online or host a file synchronization solution in the cloud, there are enough reasons not to move SharePoint itself to the cloud that organizations big and small should expect to see SharePoint on premises long after most everything else has been moved over to Somebody Else’s Computer™.
