Wednesday, 28 October 2015

Exam 70-355 Universal Windows Platform – App Data, Services, and Coding Patterns (beta)


Skills measured
This exam measures your ability to accomplish the technical tasks listed below. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.


Recognize and apply a specified design pattern
Describe the relationship between architecture, design pattern, and framework
Recognize common architectures and when they should be used, recognize common design patterns and when a pattern can be applied to make programming tasks faster and easier

Describe traditional Microsoft .NET design patterns
Describe the Gang of Four design patterns, including creational patterns, structural patterns, and behavioral patterns; describe 3-tier/N-tier patterns; describe enterprise patterns; describe cloud design patterns; describe head first patterns; describe repository patterns; describe unit of work patterns
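
As an illustration of how a behavioral Gang of Four pattern works, here is a minimal Observer sketch. Python is used for brevity (the exam itself targets .NET), and all names are illustrative:

```python
class Publisher:
    """Observer pattern: subscribers register callbacks and are
    notified whenever the publisher raises an event."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, event):
        for callback in self._subscribers:
            callback(event)

received = []
feed = Publisher()
feed.subscribe(received.append)   # the subscriber is just a callable
feed.publish("tile-updated")
print(received)  # ['tile-updated']
```

The same shape underlies .NET events and `IObservable<T>`: the publisher knows nothing about its subscribers beyond the callback contract.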

Apply the Model-View-ViewModel (MVVM) Prism pattern
Separate concerns, develop the views for the MVVM app, develop the view-models for the MVVM app, develop the models for the MVVM app, develop class interactions and data binding for the MVVM app
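
The MVVM separation above can be sketched in a few lines. This is a hedged illustration (Python for brevity, with a plain callback list standing in for .NET's INotifyPropertyChanged; all names are hypothetical):

```python
class ViewModel:
    """Minimal view-model: raises a property-changed notification so a
    bound view can re-render, without the view-model knowing the view."""
    def __init__(self):
        self._name = ""
        self.property_changed = []   # view callbacks: fn(property_name)

    @property
    def name(self):
        return self._name

    @name.setter
    def name(self, value):
        if value != self._name:
            self._name = value
            for handler in self.property_changed:
                handler("name")

changes = []
vm = ViewModel()
vm.property_changed.append(changes.append)   # the "binding"
vm.name = "Ada"
print(changes)  # ['name']
```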

Develop app and business logic, code that interfaces with other line-of-business (LOB) apps, and LOB Server Services (AD, SP)

Develop code for app-specific processes and computations
Create an asynchronous method or process, manage the return value from an asynchronous method, debug and handle errors for an asynchronous method, develop storyboards and custom animations for an object, represent 3-D models as code objects, manage 2-D projections of 3-D objects, use Task, ThreadPool, and background transfers
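
The asynchronous-method bullets above are language-agnostic. A minimal sketch in Python's asyncio (standing in for C#'s async/await and Task) shows creating an async method, managing its return value, and handling its errors:

```python
import asyncio

async def fetch_total(values):
    """Simulates an awaitable computation that returns a value,
    analogous to a C# async method returning Task<int>."""
    await asyncio.sleep(0)   # yield control, as a real I/O call would
    return sum(values)

async def main():
    try:
        total = await fetch_total([1, 2, 3])   # manage the return value
    except Exception as exc:                   # error handling for async code
        print(f"failed: {exc}")
        total = None
    return total

result = asyncio.run(main())
print(result)  # 6
```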

Implement background tasks
Create a background task, register a background task, set and respond to triggers, debug a background task, implement a lock screen app, share data/events between an app and its background tasks, directly call a background task

Manage app lifecycle events
Prepare for suspension, resume from suspension or termination, implement an extended splash screen, extend execution and monitor suspension errors

Implement interactions with other apps
Integrate a share contract to share content with another app, integrate contact and appointment functionality, implement mapping and navigation (geolocation, geofencing, and Bing Maps), exchange data/file between apps, including launch for result; use drag and drop

Implement notifications and Windows Push Notification Services (WNS)
Implement and manage notifications; support Live Tile updates, toasts, and badges; support Action Center and secondary tiles

Implement interactions with devices
Develop code for camera and microphone, including photo, video, and audio; implement screen capture; implement printing and Play To; integrate HoloLens sensors and services; support wireless communication

Develop class libraries (code libraries, DLLs)
Naming assemblies, namespaces, types, and members in class libraries; using static and abstract classes, interfaces, enumerations, structures, and other types; designing and using properties, methods, constructors, fields, events, operators, and parameters; implementing extensibility mechanisms such as subclassing, using events, virtual members, and callbacks; designing, throwing, and catching exceptions

Develop code for implementing secure cloud data services and storage

Design and implement data roaming
Roaming user settings and preferences, roaming app session info
Design and implement a RESTful data solution (OData, JSON)
Using the ASP.NET Web API, implementing JSON serialization, adding a service reference to the project, using Windows.Web.Http.HttpClient objects
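
JSON serialization itself is independent of the ASP.NET stack named above. A minimal round-trip sketch (Python's json module standing in for a .NET serializer; the payload is hypothetical):

```python
import json

# Serialize an app model to JSON for a RESTful service, then round-trip it.
order = {"id": 42, "items": ["mouse", "keyboard"], "paid": False}
payload = json.dumps(order)      # -> str suitable for an HTTP request body
restored = json.loads(payload)   # -> dict parsed back from the response
print(restored == order)  # True
```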
Design and implement Azure and cloud data sources
Implement offline data sync, implement caching, support OneDrive integration, implement file access and management (including File Picker and file access APIs), upload images to Azure Storage

Integrate Azure data services
Call a custom Cloud Service API from a client, schedule backend jobs in Mobile Services
Design and implement removable and embedded local data sources
Support SD card storage, implement SQLite on mobile devices

Develop code to implement authentication and business security requirements
Implement code to manage authentication and identity
Web authentication broker; Azure authentication; implement code to manage identity; implement biometric identity verification, including Windows Hello; implement Credential Locker, implement single sign-on
Implement code to manage authorization and access to resources
Implement authentication requests; authorize users and apps; manage authorization IDs; restrict access to resources, including data, files, folders, and devices
Implement cryptography within an app
Create cryptographic keys, hash and sign content, create message authentication codes, encrypt and decrypt data
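
The cryptography bullets map onto standard primitives. A minimal sketch using only Python's standard library (hashing and MACs; asymmetric signing and encryption would need additional libraries and are omitted here):

```python
import hashlib
import hmac
import secrets

# Create a random symmetric key.
key = secrets.token_bytes(32)

# Hash content: a fixed-size integrity fingerprint.
digest = hashlib.sha256(b"app data").hexdigest()

# Create and verify a message authentication code (authenticity + integrity).
mac = hmac.new(key, b"app data", hashlib.sha256).digest()
assert hmac.compare_digest(mac, hmac.new(key, b"app data", hashlib.sha256).digest())

print(len(digest))  # 64 hex characters for SHA-256
```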
Support enterprise security considerations
Implement security transparency, implement code access security, implement role-based security

Integrate cloud services and Azure App Service services
Build native and cross-platform apps using services
Integrate Azure App Service mobile app functionality within an existing mobile app, use a .NET client with Mobile Services, call a custom API from a client
Connect to your enterprise systems using services
Build a service that uses an existing SQL database, connect to an on-premises SQL Server from an Azure mobile service using hybrid connections, scale mobile services backed by Azure SQL database, authenticate your app with Active Directory Authentication Library single sign-on, add role-based access control to mobile services with Azure Active Directory, access Microsoft SharePoint on behalf of the user, schedule backend jobs in mobile services, troubleshoot a mobile services .NET backend
Connect to SaaS APIs using services
Implement single sign-on using credentials from third-party identity providers, build a service that uses MongoDB as a data store
Build offline-ready apps with sync using services
Allow employees to work offline when connectivity is not available, synchronize with your enterprise backend systems when devices come back online, recover in the event of a disaster
Push notifications to users using services
Add push notifications to your app, send push notifications to authenticated users

Develop code that is maintainable and that supports app versioning, compatibility, and coexistence
Develop code using version control (TFVC or Git)
Develop code using a standardized coding convention, implement best practices for assembly versioning
Implement best practices for assemblies and side-by-side execution
Use strong-named assemblies, including version, culture, and publisher; use the GAC to provide version-aware storage; create an app that runs in isolation
Implement best practices for assembly placement and the GAC
Using an app configuration file, using codebases, providing a binding context


Wednesday, 21 October 2015

10 key moments in the history of Apple and Microsoft

Apple and Microsoft recently renewed their alliance with the goal of tackling the enterprise market, but the latest partnership is just the most recent turning point in the two companies' intertwined histories. Here are the defining moments that led up to the new pact.

Apple and Microsoft's history of highs and lows
Apple and Microsoft share a common history and bond in the evolution of personal computing. Relations between the two technology pioneers were generally cordial when they were founded in the 1970s, but that sense of mutual respect quickly turned to discord. The founders of both companies were at loggerheads often in the past. Today their new leaders appear determined to bury the hatchet and partner for greater opportunities in the enterprise.

Youthful innocence of the early '80s
Microsoft was a critical Apple ally during the first Macintosh's development. At an Apple event in 1983, Microsoft CEO Bill Gates told attendees Microsoft expected to earn half of its revenues selling Macintosh software the following year. He called the Macintosh, "something that's really new and really captures people's attention."

Jobs ousted from Apple, forms NeXT
In 1985, Apple CEO Steve Jobs was ousted from the company he cofounded nine years earlier. He immediately sold all but one share in Apple to fund the launch of NeXT, where he would spend the next 12 years building computer workstations for higher education and business.

Jobs says Microsoft has 'no taste'
"The only problem with Microsoft is they just have no taste," Jobs said in the 1996 "Triumph of the Nerds" TV documentary. "They have absolutely no taste. And I don't mean that in a small way, I mean that in a big way, in the sense that they don't think of original ideas, and they don't bring much culture into their products."

Jobs returns to Apple, partners with Microsoft
When Apple acquired NeXT in 1997 and brought Steve Jobs back into the fold, the company was in disarray amid growing uncertainty about the future of Microsoft Office for Mac. During his keynote address at the Macworld Expo that year, Jobs extolled the virtues of partnering with industry leaders and spoke of the need to improve Apple's partner relations.

Gates addresses the Apple faithful in 1997
"Microsoft is going to be part of the game with us as we restore this company back to health," Jobs said at Macworld, before asking Gates to address the crowd via satellite.

"We think Apple makes a huge contribution to the computer industry," Gates said. "We think it's going to be a lot of fun helping out."

Gates and Jobs take the stage together in 2007
A seminal moment occurred between the leaders of both companies when Gates and Jobs jointly took the stage for an interview at the D5 conference. Both men praised each other in their own ways. Jobs commended Gates for building the first software company in the world, but Gates was more flattering. "What Steve's done is quite phenomenal," he said.

'Memories longer than the road ahead'
When Jobs was asked to describe the greatest misunderstanding of his relationship with Gates, he said: "I think of most things in life as either a Bob Dylan or a Beatles song, but there's that one line in that one Beatles song — 'You and I have memories longer than the road that stretches out ahead' — and that's clearly very true here."

Apple invites Microsoft exec on stage for iPad demo
A new era of partnership buoyed by opportunities in the enterprise began to blossom in the early 2010s. At Apple's September 2015 new product event in San Francisco, the company invited Kirk Koenigsbauer, vice president of Microsoft Office, on stage to demonstrate Office 365 apps working in split-screen mode on an iPad Pro.

Microsoft CEO uses iPhone at Dreamforce
At Salesforce's 2015 Dreamforce conference, Microsoft CEO Satya Nadella demoed the company's iOS apps on an iPhone. When Nadella did the once unthinkable, using an iPhone on stage, he acknowledged it as such but also made clear that it wasn't his phone. "It is a pretty unique iPhone," he said. "I like to call it the iPhone Pro because it has all the Microsoft software and applications on it … It's pretty amazing."

Apple CEO Tim Cook doesn't hold a grudge
During a keynote at cloud-storage company Box's BoxWorks conference in September 2015, when asked about the company's renewed relationship with Microsoft, Apple CEO Tim Cook said he doesn't believe in holding grudges. "If you think back in time, Apple and IBM were foes. Apple and Microsoft were foes," Cook said. "Apple and Microsoft still compete today, but frankly Apple and Microsoft can partner on more things than we could compete on, and that's what the customer wants."


Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com

Sunday, 11 October 2015

5 ways to shore up security in your BYOD strategy

You’d think after all this time that organizations would have finally gotten BYOD programs pretty much down pat. Don’t bet on it.

A recent study by tyntec reveals that a vast majority of organizations still have inadequate bring-your-own-device (BYOD) policies. That’s not very encouraging, considering that 49 percent of workers now use a personal mobile device for work-related tasks and spend a great deal of time on personal devices for their job.

Further, the typical U.S. worker now expects to have nothing less than total access – anywhere, anytime, from any device – to their employer’s networks, finds another study from Dell and Intel. But despite all this demand on the user side, many organizations still wrestle with security, privacy and support issues around BYOD. That is holding many employers back when it comes to giving BYOD an enthusiastic ‘thumbs up’.

So what does it take to get BYOD right in 2015? CSO put that question to a few IT leaders, whose collective responses reflect the still wide divide on how BYOD is supported at the IT executive level, possibly depending on the industry in which they work.

An undeniable force

The higher education sector has embraced BYOD probably as much as any. No surprise here, really. College and university culture is all about openness – of ideas, of expression, and of access to resources. So it is only natural that today’s campus environment is awash with personal devices.

The University of Tennessee at Chattanooga is a prime example. According to Thomas Hoover, associate vice chancellor and CIO, and Susan Lazenby, manager of strategic planning and communication, BYOD has taken the campus by storm.

The two shared the school’s experiences with BYOD by stressing the impact it has had on the school’s IT organization, including staff and budget. But they confirmed that BYOD was a trend not to be denied, and the university had no choice but to adopt it. They also noted that a robust BYOD program is not just demanded by students, but also by faculty and employees.

To illustrate how rapidly BYOD caught on at UT, the two noted that five years ago the school’s network was supporting 809 devices. That number rose to 14,906 in 2014. This year it jumped to approximately 48,000.

It’s a similar tale hundreds of miles away at Worcester State University in Massachusetts.

“Like any other institute in higher education, Worcester State doesn’t have any choice but to support BYOD,” notes Anthony (Tony) Adade, CIO at the university. “The students come from diverse backgrounds. They come with all kinds of devices. For several years we’ve been seeing an influx of games on our campus – all kinds of games. Besides the normal devices that we have to deal with, we didn’t have any choice but to support them.”

Like at the University of Tennessee, wide-scale BYOD has been a fairly new phenomenon at Worcester State, but demand quickly made up for lost time.

“Initially it was limited. The network itself was at capacity and was not able to handle the devices coming on campus,” Adade explains. “We had to tell some students that they couldn’t bring devices on campus, or if they did, they were on their own. However, later on we realized it would be in our strategic interest to have a plan and to address the issue. Now we can safely accommodate almost every device.”

Colleges and universities aren’t the only organizations that have felt compelled to adopt BYOD programs, of course. Countless companies and nonprofits are also supporting programs, and have learned some important lessons in how to do it right.

“It is important to have technology in-house to support BYOD strategy,” notes Christine Vanderpool, CIO at Molson Coors, one of the nation’s leading brewers. “Companies should invest in tools like MDM, DLP and application monitoring (tools that inform the user of malicious applications on their devices). You need staff to support these tools. You need a strong set of policies, procedures and end user education.”

“It is good to focus on the ‘what’s in it for them’ in most cases,” Vanderpool stresses. “If you deploy MDM or application controls, you have to explain how this is protecting them in their daily life and not just in their work life.”

What are the most important elements of an effective BYOD program, in terms of both providing employee flexibility and productivity and ensuring company data and network security? Molson Coors CIO Christine Vanderpool offers the following tips on what should be considered:

“Give real-life examples, like how some malicious apps can take control of or read all the user’s SMS text messages, see password information entered into a bank app, etc. People care most when they can understand an issue and see how it can potentially impact their lives beyond just their job,” Vanderpool says.

Not everyone’s a believer

But many CIOs remain skeptics when it comes to supporting BYOD, fearing that the probable risks still outweigh the possible benefits. One of them is Jim Motes, vice president and CIO at Rockwell Automation.

“I'm not really a fan of BYOD phones,” Motes says. “I believe the privacy constraints will be at odds with protecting and controlling corporate intellectual property.”

“The smartphone is not just communication technology, it's a social lifeline, diary, and entertainment system,” Motes continues. “People have too much personal information stored on these systems and should be very careful about how much access they want to give their employers. Employers should avoid them completely to limit their liability should that personal information be breached and exposed.”

So how does an organization resolve these two competing forces: security and privacy concerns on one hand, versus user demand for convenience on the other?

Our sources offered the following combined tips on how to get BYOD right:

Have a thoughtful strategy
As noted, security remains a top concern for IT leaders when it comes to BYOD. It is therefore important to involve the IT security team in establishing a program from the outset. But the approach should be for the CSO to help find a solution, not reasons to not support it. The focus should be on how to best secure the data first and foremost, then the devices.

Take stock of the situation
Once you’ve set your strategy, begin with assessments of network capacity and security status. Issues to consider include: How vulnerable is the network? Who is connecting to it? What devices and applications are they using?

Have a clear set of policies and expectations
You need a set of policy guidelines on what is and isn’t allowed, to guide the behavior of employees and users. Policies should be simple and easy to understand. Toward that end, have your employees help draft the policies to get their understanding and support up front.

Some devices are a ‘go’ and some are a ‘no’
Identify the devices you won’t be able to support; the program probably can’t be all things to all employees. Create an approved list of devices that IT will support, provided the employee has a valid business reason for using them. Purchase the devices at a reduced cost for employees, and put the necessary safeguards on those devices. Let employees know up front to what degree you will support a particular device purchase.

Proper training is critical
Educate employees on how to connect their devices to the network, and on the dos and don’ts of their usage. Lunchtime training sessions are a smart idea. Stress what it is that employees are agreeing to, including what happens if a device is lost or stolen: the wiping of the device. Most employees will say yes; those who don’t can’t participate in the program.

Finally, “BYOD risks and considerations will continue to grow and change just as rapidly as the technologies change,” stresses Vanderpool. “It is vital that all aspects of the BYOD model be continuously reviewed, updated, re-communicated and employees re-educated. The model deployed and the supporting guidelines, policies and procedures implemented to support it must be agile and allow the company to be able to quickly adapt or change them when necessary.”


Tuesday, 29 September 2015

Five steps to optimize your firewall configuration

95% of all firewall breaches are caused by misconfiguration. Here's how to address the core problems

Firewalls are an essential part of network security, yet Gartner says 95% of all firewall breaches are caused by misconfiguration. In my work I come across many firewall configuration mistakes, most of which are easily avoidable. Here are five simple steps that can help you optimize your settings:

* Set specific policy configurations with minimum privilege. Firewalls are often installed with broad filtering policies, allowing traffic from any source to any destination. This is because the Network Operations team doesn’t know exactly what is needed, so they start with a broad rule and then work backwards. However, the reality is that, due to time pressures or simply not regarding it as a priority, they never get round to defining the firewall policies, leaving the network in this perpetually exposed state.

You should follow the principle of least privilege – that is, give the minimum level of privilege the user or service needs to function normally, thereby limiting the potential damage caused by a breach. You should also document properly – ideally mapping out the flows that your applications actually require before granting access. It’s also a good idea to regularly revisit your firewall policies to look at application usage trends and identify new applications being used on the network and what connectivity they actually require.
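
The least-privilege advice above can be made concrete with a simple rule audit. Here is a minimal sketch in Python; the rule format and field names are hypothetical, not taken from any particular firewall vendor:

```python
# Each rule: (source, destination, service, action). "any" is a wildcard.
rules = [
    ("any", "any", "any", "allow"),                   # overly broad: flag it
    ("10.0.1.0/24", "10.0.2.5", "tcp/443", "allow"),  # specific: fine
]

def overly_broad(rule):
    """Flag allow-rules that leave source, destination, or service wide open."""
    source, destination, service, action = rule
    return action == "allow" and "any" in (source, destination, service)

flagged = [rule for rule in rules if overly_broad(rule)]
print(len(flagged))  # 1
```

Running a check like this against the rule base on a schedule is one way to catch "temporary" broad rules that never got tightened.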

* Only run required services. All too often I find companies running firewall services that they either don’t need or no longer use, such as dynamic routing, which as a best practice typically should not be enabled on security devices, and “rogue” DHCP servers on the network distributing IPs, which can potentially lead to availability issues as a result of IP conflicts. It’s also surprising to see the number of devices that are still managed using unencrypted protocols like Telnet, despite the protocol being over 30 years old.

The solution is to harden devices and ensure that configurations are compliant before devices are promoted into production environments. This is something a lot of organizations struggle with. By configuring your devices based on the function that you actually want them to fulfil and following the principle of least privileged access – before deployment – you will improve security and reduce the chances of accidentally leaving a risky service running on your firewall.

* Standardize authentication mechanisms. During my work, I often find organizations that use routers that don’t follow the enterprise standard for authentication. One example I encountered is a large bank that had all the devices in its primary data centers controlled by a central authentication mechanism, but did not use the same mechanism at its remote office. By not enforcing corporate authentication standards, staff in the remote branch could access local accounts with weak passwords, and had a different limit on login failures before account lockout.

This scenario reduces security and creates more opportunities for attackers, as it’s easier for them to access the corporate network via the remote office. Enterprises should therefore ensure that any remote offices they have follow the same central authentication mechanism as the rest of the company.

* Use the right security controls for test data. Organizations tend to have good governance stating that test systems should not connect to production systems and collect production data, but this is often not enforced because the people who are working in testing see production data as the most accurate way to test. However, when you allow test systems to collect data from production, you’re likely to be bringing that data down into an environment with a lower level of security. That data could be highly sensitive, and it could also be subject to regulatory compliance. So if you do use production data in a test environment, make sure that you use the correct security controls required by the classification the data falls into.

* Always log security outputs. While logging properly can be expensive, the costs of being breached, or of not being able to trace the attack, are far higher. Failing to store the log output from your security devices, or not doing so with enough granularity, is one of the worst things you can do in terms of network security: not only will you not be alerted when you’re under attack, but you’ll have little or no traceability when carrying out your post-breach investigation. By ensuring that all outputs from security devices are logged correctly, organizations will not only save time and money further down the line but will also enhance security by being able to properly monitor what is happening on their networks.
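
The logging advice above, in miniature: a hedged Python sketch that captures security-device output through a dedicated handler (the logger name and message format are hypothetical):

```python
import logging

records = []

class ListHandler(logging.Handler):
    """Collects formatted log lines so they can be stored or shipped."""
    def emit(self, record):
        records.append(self.format(record))

logger = logging.getLogger("fw.audit")
logger.setLevel(logging.INFO)
logger.propagate = False          # keep audit output on its own pipeline
logger.addHandler(ListHandler())

# Every security decision is logged with enough detail to reconstruct it later.
logger.info("DENY src=203.0.113.7 dst=10.0.2.5 dport=22")
print(records[0])
```

In production the handler would write to durable, centralized storage (e.g. a SIEM) rather than an in-memory list, but the principle is the same: capture everything, with granularity.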

Enterprises need to continuously monitor the state of their firewall security, but by following these simple steps businesses can avoid some of the core misconfigurations and improve their overall security posture.


Sunday, 13 September 2015

Get ready to live in a trillion-device world

A swarm of sensors will let us control our environment with words or even thoughts

In just 10 years, we may live in a world where there are sensors in the walls of our houses, in our clothes and even in our brains.

Forget thinking about the Internet of Things where your coffee maker and refrigerator are connected. By 2025, we could very well live in a trillion-device world.

That's the prediction from Alberto Sangiovanni-Vincentelli, a professor of electrical engineering and computer science at the University of California at Berkeley.

"Smartness can be embedded everywhere," said Sangiovanni-Vincentelli. "The entire environment is going to be full of sensors of all kinds. Chemical sensors, cameras and microphones of all types and shapes. Sensors will check the quality of the air and temperatures. Microphones around your environment will listen to you giving commands."

This is going to be a world where connected devices and sensors are all around us -- even inside us, Sangiovanni-Vincentelli said in an interview with Computerworld during DARPA's Wait, what? Forum on future technology in St. Louis this week.

"It's actually exciting," he said. "In the next 10 years, it's going to be tremendous."

According to the Berkeley professor and researcher, we won't have just smartphones.

We'll have a swarm of sensors that are intelligent and interconnected.

Most everything in our environment -- from clothing to furniture and our very homes -- could be smart. Sensors could be mixed with paint and spread onto our walls.

We'll just speak out loud, and information will instantly be given to us without having to do an online search; phone calls will be made, or a robot will start to clean or make dinner.

And with sensors implanted in our brains, we wouldn't even need to speak out loud to interact with our smart environment.

Want something? Just think about it.

"The brain-machine interface will have sensors placed in our brains, collecting information about what we think and transmitting it to this complex world that is surrounding us," said Sangiovanni-Vincentelli. "I think I'd like to have an espresso and then here comes a nice little robot with a steaming espresso because I thought about it."

Pam Melroy, deputy director of DARPA's Tactical Technology Office, said the Berkeley professor isn't just dreaming.

"I do think there's something to that" scenario, said Melroy, who is a retired U.S. Air Force officer and former NASA astronaut. "At the very least, we should be preparing for it and thinking of what is needed. We get into very bad places when technology outstrips our planning and thinking. I'd rather worry about that and prepare for it even if it takes 20 years to come true, than just letting it evolve in a messy way."

While having a trillion-device life could happen in as little as 10 years, Sangiovanni-Vincentelli said there's a lot of work to be done to get there.

First, we simply don't have the network we'd need to support this many connected devices.

We would need communication protocols that consume very small amounts of energy and can transmit fluctuating amounts of information, the professor explained. Businesses would need to build massive numbers of tiny, inexpensive sensors. We'll need more and better security to fend off hacks to our clothing, walls and brains.

And the cloud will have to be grown out to handle all of the data that these trillion devices will create.

"Once you have the technology enabling all of this, we should be there in 10 years," said Sangiovanni-Vincentelli.

With all of these devices, many people will be anxious about what this means for personal privacy.

Sangiovanni-Vincentelli won't be one of them, though.

"Lack of privacy is not an issue," he said. "We've already lost it all... If the government wants me now, they have me. Everything is already recorded somewhere. What else is there to lose?"

Melroy also is more excited than nervous about this increasingly digital future.

"As a technologist, I don't fear technology," she said. "I think having ways that make us healthier and more efficient are a good thing... There is social evolution that happens with technological evolution. We once were worried about the camera and the privacy implications of taking pictures of people. The challenge is to make the pace of change match the social evolution."



Monday, 31 August 2015

The 15 biggest enterprise ‘unicorns’

The Wall Street Journal found 115 companies valued at more than $1 billion; these are the 15 biggest in enterprise tech

Yesteryear, there were only a few unicorns in the world of startups.

This week, though, the Wall Street Journal and Dow Jones VentureSource identified 115 companies with valuations north of $1 billion, which are referred to as unicorns.

Below are 15 of the highest valued enterprise software companies that have received venture funding but have not yet been sold or gone public.

Palantir
Valuation: $20 billion
Funding: $1.5 billion

What it does: Palantir has created a program that’s really good at finding relationships across vast amounts of data, otherwise known as link analysis software. Its meteoric rise has been fueled by big-money contracts with federal government agencies. Palantir is the second-largest unicorn, behind Uber, that The Wall Street Journal identified.

Dropbox
Valuation: $10 billion
Funding: $607 million

What it does: One of the pioneers of the cloud market, Dropbox’s file sync and share system has been a hit with consumers, and increasingly with businesses too. Chief competitor Box would have been a unicorn, but the company went public this year.

Zenefits
Valuation: $4.5 billion
Total funding: $596 million

What it does: Zenefits provides a cloud-based human resource management (HRM) system for small and midsized business, with an emphasis on helping businesses manage health insurance administration and costs.

Cloudera
Valuation: $4.1 billion
Total funding: $670 million

What it does: Cloudera provides a distribution of Hadoop. Its chief competitor in the big data/Hadoop market, Hortonworks, filed for an initial public offering earlier this year after being a unicorn itself.

Pure Storage
Valuation: $3 billion
Funding: $530 million

What it does: Pure Storage is one of the most popular startups in the solid-state flash-storage market. It pitches its hardware-software product as a more affordable competitor to storage giant EMC.

Docusign
Valuation: $3 billion
Funding: $515 million

What it does: Docusign lets users electronically sign and file paperwork.

Slack
Valuation: $2.8 billion
Funding: $315 million

What it does: Slack is an enterprise communication and collaboration platform, allowing users to text and video chat, and to share documents.

Nutanix
Valuation: $2 billion
Funding: $312 million

What it does: Nutanix is one of the startups in the hyperconverged infrastructure market, providing customers an all-in-one system that includes virtualized compute, network and storage hardware, controlled by custom software. Converged systems are seen as the building blocks of distributed systems because of their ability to optimize performance, particularly on the storage side.

Domo
Valuation: $2 billion
Funding: $459 million

What it does: Founded by Josh James (who sold his previous startup Omniture to Adobe for $1.8 billion), this Utah-based company provides cloud-hosted business intelligence software tailored for business executives. The idea is to give C-level executives ready access to the data they need to run their companies, in a user-friendly format accessible on any device.

GitHub
Valuation: $2 billion
Funding: $350 million

What it does: GitHub is a platform for hosting the code repositories behind open source projects. These repositories can be public or private, and allow users to track bugs, usage and downloads. If you use an open source project, it’s likely hosted on GitHub.

Tanium
Valuation: $1.8 billion
Funding: $142 million

What it does: Tanium is a platform for identifying and remedying application outages or security threats in real time. One of its biggest differentiating features is an intuitive search bar that lets users query the systems they’re monitoring in natural language for a variety of issues.

MongoDB
Valuation: $1.6 billion
Funding: $311 million

What it does: MongoDB is one of the most popular NoSQL databases. This new breed of database is ideal for managing unstructured data, such as social media streams, documents and other complex data that don’t fit well into traditional structured databases.

InsideSales.com
Valuation: $1.5 billion
Funding: $199 million

What it does: InsideSales.com is a big data platform that analyzes business relationships with customers and provides predictive analytics for future sales strategy.

Mulesoft
Valuation: $1.5 billion
Funding: $259 million

What it does: Mulesoft is the commercial product for the open source Mule software, an enterprise service bus that helps integrate and coordinate data across applications. Having a common data set that multiple applications can use reduces duplication and cost.

Jasper Technologies
Valuation: $1.4 billion
Funding: $204 million

What it does: Jasper Technologies builds a platform for the budding Internet of Things, allowing data generated by machines to be stored and analyzed in one place.


Wednesday, 19 August 2015

How to uncover the Dark Web

Cybercriminals love the Dark Web because it is almost impossible to track or identify them.

One of the best ways to understand your enemy – what he’s up to, what his capabilities are and how he can damage you – is to spy on him.

And according to some cybercrime experts, one of the easier and more effective ways to do that is to hang out where the bad guys do – on the Dark Web.

In a recent post on Dark Reading, Jason Polancich, founder and chief architect of SurfWatch Labs, asserted that, “most businesses already have all the tools on hand for starting a low-cost, high-return Dark Web intelligence operation within their own existing IT and cybersecurity teams.”

Such a data mining operation, he wrote, could be up and running in a day.

It is widely known in IT circles that the Dark Web is a thriving cybercrime marketplace offering multiple exploits, hacking for hire, stolen personal data and intellectual property, spam and phishing campaigns, insider threats for hire and more.

It is also a relatively secure place for criminals to operate, thanks to randomness, anonymity and encryption.

But just because it is difficult to track criminals individually doesn’t mean it is impossible to conduct surveillance on what they are doing. Polancich wrote that the Dark Web is the place to, “find out what may have been stolen or used against you and improve your overall security posture to close the infiltration hole.”

Is it really that easy?
According to Kevin McAleavey, cofounder of the KNOS Project and a malware expert, “easy” may not be the right word. But “possible” definitely is.

“Can anyone do it? You bet,” he said, “but only if you're willing to pay people to sit around and just surf. Most managers consider that ‘wasting time’ and it's often frowned upon, but it works really well.”

He said that was one of the things he did in a previous job – “follow the bad guys back to their cave so I could see what they were working on before they released it. But it was one of the most time-consuming parts of being ahead of the curve rather than under it.”

Nicholas Albright, principal researcher at ThreatStream, agrees. “These networks seem obscure to many, but with a simple tutorial, anyone could be up and running in less time than it takes to watch an episode of ‘Mr. Robot’,” he said.

“The hardest part of monitoring is really learning where to look. Many of the sites on these obscure networks move locations or go offline periodically. However, once an individual has identified a handful of sites, they frequently lead to others.”

He also agrees with McAleavey that it is labor-intensive, and does not always yield useful intelligence. On the “slow” days, “you might not see anything of value,” he said. “Furthermore, this requires an analyst's fingers on keyboard. Deploying a 'tool' to do this job is not effective. Scraper bots are detected and regularly purged.”

Others are a bit more dubious about the average IT department doing effective Dark Web surveillance, even if the budget is there. “The task of collecting raw information itself is non-trivial,” said Dr. Fengmin Gong, cofounder and chief strategy officer at Cyphort. “And distilling the threat intelligence from the raw data is not any easier. So while it is beneficial to do it, it's not a task that can be undertaken by an average IT department effectively.”

That, he said, is because the average IT worker doesn’t have the expertise to do it, “and it’s not easy to get up to speed. It requires understanding of threats and data mining, which is a high hurdle.”

Fred Touchette, security analyst at AppRiver, is less dubious, but said the deeper the analysis goes, the more expertise is required.

“Initial high-level research should be easily executed by any research team that knows its way around implementing Tor (The Onion Router),” he said. “Once one gets a basic understanding of how Tor is implemented and how to use it, the Dark Web is nearly as easy to navigate, albeit much slower than the regular internet.”

“And once research goes beyond passive and into trying to find and possibly purchase samples, things could get pricey,” he said. “Depending on the merchant, sometimes free samples can be obtained, but not always. From here, the same tools and expertise would be required to analyze samples.”

Easy or difficult, most experts agree that enterprises monitoring the Dark Web for threat intelligence is not yet mainstream. “I am aware of technology researchers and developers proposing this as a complementary means to security threat monitoring, but it's not very common as an initiative taken by enterprises themselves,” Gong said.

That may change, however, as more tools become available to make surfing the Dark Web easier.

Juha Nurmi, writing on the Tor Blog, said he has been working since 2010 on developing Ahmia, an open-source search engine for Tor hidden service websites.

And Eric Michaud, founder and CEO of Rift Recon, is also CEO and cofounder of DarkSum, which launched just last week and is promoting a search engine that it calls “Google for the Dark Net.”

Michaud agrees with Gong that effective surveillance of the Dark Net would be beyond the capability of most organizations outside the Fortune 100. But he said that with a search engine like DarkSum, which indexes the Dark Net, they can do it. “We make it easy,” he said.

McAleavey said he has already done it. “All it really takes is setting up a couple of machines to crawl the Tor network with a dictionary list of interesting keywords to match up with, and then let it rip,” he said.

“Once the results have been put into the database of what was found and where, human analysts can then fire up a Tor browser and check out what the crawler found. The more keywords you have, the more results you'll get, and the more people you have to rifle through it all, the better the chances of finding the needles in that haystack.”
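The keyword-driven crawl McAleavey describes can be sketched in a few lines. The example below is an illustrative assumption, not his actual setup: the proxy address is Tor's default local SOCKS port, and the keywords, URLs and `fetch` wiring are hypothetical placeholders.

```python
# Sketch of a dictionary-based keyword crawl: fetch pages (through a
# local Tor SOCKS proxy) and flag any that match keywords of interest.

TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",   # Tor's default SOCKS port
    "https": "socks5h://127.0.0.1:9050",  # socks5h resolves .onion names via Tor
}

def match_keywords(page_text, keywords):
    """Return the sorted subset of keywords found in the page text."""
    text = page_text.lower()
    return sorted(k for k in keywords if k.lower() in text)

def crawl(urls, keywords, fetch):
    """Map each URL to its matching keywords; `fetch` does the retrieval."""
    hits = {}
    for url in urls:
        try:
            body = fetch(url)
        except Exception:
            continue  # hidden services move or go offline frequently
        matched = match_keywords(body, keywords)
        if matched:
            hits[url] = matched
    return hits

# With the `requests` library installed (pip install "requests[socks]")
# and a local Tor daemon running, `fetch` could be wired up as:
#   fetch = lambda u: requests.get(u, proxies=TOR_PROXIES, timeout=60).text
```

Injecting `fetch` keeps the matching logic testable without a live Tor connection; the human analysts then review whatever lands in `hits`, as the quote above suggests.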

Of course, an index of the Dark Web does not stay current for long. As McAleavey notes, sites on the Tor network, “often change their address every few hours or every few days, so you need to crawl again looking for those sites of interest because they probably moved since the last time you crawled.”

Michaud agreed, but said it is possible to keep up with address changes. While he wouldn’t discuss the techniques his company uses to do it, “we do it really well,” he said.

Whether it is worth the time and expense to conduct Dark Web surveillance is also a matter of debate. Gong contends that while it is helpful as a “layer” of security, it is not easy to do well. “It requires both sophisticated infrastructure and technical skills that are not trivial to establish,” he said, adding that, “it is not very crucial or affordable for an enterprise IT to pull off by itself.”

And he believes there is, “nothing that can replace direct monitoring of your own networks and assets.”

But Michaud said as it becomes easier and cheaper, it will be a necessary part of a security operation. “Enterprises are scared,” he said, “because they know they will be held responsible for data breaches if they aren’t proactive.

“If you’re just being defensive, you’re going to have a bad day.”
