Archive

Posts Tagged ‘storage’

Cloud Computing in 2013

February 27th, 2013 Comments off


 

How in the know about cloud computing are you? Does your business reap the benefits of cloud storage and desktop virtualisation? If not, perhaps you should take a look at this infographic, created by VESK. Cloud computing is a major buzzword in the world of IT at the moment, and with good reason.

Cloud data storage and virtual desktops are an ideal way for businesses to streamline their IT infrastructure. They can dramatically reduce IT costs and provide enhanced security for your business's sensitive data.

Cloud computing is not as new as you might think – we have been storing data in the cloud in one form or another for several years now, perhaps without even realising it. As with any technology, cloud computing has developed incredibly quickly in recent years and we expect to see some dramatic developments in the coming year. So what are the predictions for cloud computing in 2013?

Take your head out of the clouds and find out why cloud computing is so important to businesses in 2013 with our easy to digest infographic.

This infographic was created by VESK – a UK company that specialises in virtual servers and hosted desktops.

 


Disperse the Myths Behind Cloud Computing (Infographic)

February 20th, 2013 Comments off


cloud computing security myths

Myth vs. Fact: IT Jobs

Myth: The cloud steals local IT jobs

Along with the perceived cost of cloud computing, the myth of IT job loss is also on the rise. IT professionals already concerned about shrinking budgets and increased demands on their time worry that the lure of outsourcing many of their department’s resources to the cloud will also mean an outsourcing of their jobs.

If someone else manages data storage, security and server backups, say concerned IT stalwarts, what’s the point of having a full complement of IT staff? CEOs and CFOs are often pointed to as prime drivers behind this myth, since their focus will be on cutting costs, and many are thought to be starry-eyed at the idea of not paying for servers or maintenance. Combined with a technology market focused on automating processes, along with the increasing ability of employees and executives to circumvent IT policy, it’s no wonder that worries about job security in the cloud are on the rise.

Fact: IT is evolving

Instead of being phased out, many IT departments are evolving as cloud adoption increases. A recent study found that over 3.1 million people in the United States telecommute rather than working from a local office, and IT pros are increasingly among them. Smartphone and tablet security, along with anywhere, anytime access to the cloud, gives IT experts the power to change when and how they work.

As a result, the scope of IT work is changing to include not only high-level cloud management of company data in offsite facilities or private cloud servers, but also "big data" analytics and programming. Rather than simply troubleshooting common employee issues, IT admins are now able to spend more time analyzing company data and mining it for actionable insights. In addition, IT professionals are often asked to develop employee education programs about responsible cloud use, social media safety and network access. This requires an evolution in both perception and function, but does not translate to job loss – instead, the cloud is helping IT to remove monotonous, repetitive data tasks and replace them with forward-thinking technology projects.

Dataprise provides IT services and consulting for growing businesses. Visit http://www.dataprise.com/cloud365/cloudmyths to view or share the infographic.

 

 

Amazon Redshift – Datawarehouse in the Clouds

February 16th, 2013 Comments off

Amazon announced the general availability of Redshift this week; they first announced that it was coming late last year.

Redshift is the new service that leverages the Amazon AWS infrastructure so that you can deploy a data warehouse. I'm not yet convinced that I would want my production data warehouse on AWS, but I can really see the use in a dev and test environment, especially for integration testing.

According to Amazon: "Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service that makes it simple and cost-effective to efficiently analyze all your data using your existing business intelligence tools. It is optimized for datasets ranging from a few hundred gigabytes to a petabyte or more and costs less than $1,000 per terabyte per year, a tenth the cost of most traditional data warehousing solutions."

A terabyte warehouse for less than $1,000 per year. That is fantastic. For one financial services firm where I created a 16TB warehouse, the price for hardware and database licensing was several million dollars. And those were just the startup costs. Annual license renewals ran into the tens of thousands of dollars.

Redshift offers optimized query and I/O performance for large workloads. It provides columnar storage, compression and parallelization to allow the service to scale to petabyte sizes.

I think one of the interesting specs is that it can use standard Postgres drivers. I don't see anywhere, yet, where they say specifically that this was built on Postgres, but I am inferring that.
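If the Postgres-driver claim holds, connecting should look like any other Postgres connection, just pointed at the cluster endpoint. Here is a minimal sketch; the host name and credentials are made-up placeholders (yours would come from the AWS console), and Redshift's default port is 5439:

```python
def redshift_dsn(host, dbname, user, password, port=5439):
    """Build a libpq-style connection string. Any standard Postgres
    driver (psycopg2, the PostgreSQL JDBC driver, etc.) can consume
    the same host/port/credentials."""
    return "host=%s port=%d dbname=%s user=%s password=%s" % (
        host, port, dbname, user, password)

# Hypothetical cluster endpoint -- yours comes from the AWS console.
dsn = redshift_dsn("mycluster.abc123.us-east-1.redshift.amazonaws.com",
                   "dev", "admin", "secret")

# With psycopg2 installed, opening the connection is then one call:
#   import psycopg2
#   conn = psycopg2.connect(dsn)
```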

Pricing starts at $0.85 per hour, but with reserved pricing you can get that down to $0.228 per hour. That brings it down to under $1,000 per terabyte per year. You just can't compete with this on price in your own data center.

If you want to scale to a petabyte, you need to have a petabyte of capacity in place. In your data center, that is going to cost you a fortune. Once again, AWS takes the first step toward moving an entire architecture into the cloud. Is anyone else offering anything close to this? I guess Oracle's cloud offering is the closest, but, as far as I know, they are not promoting warehouse-sized instances yet.

Did I say it’s scalable?

Scalable – With a few clicks of the AWS Management Console or a simple API call, you can easily scale the number of nodes in your data warehouse up or down as your performance or capacity needs change. Amazon Redshift enables you to start with as little as a single 2TB XL node and scale up all the way to a hundred 16TB 8XL nodes for 1.6PB of compressed user data. Amazon Redshift will place your existing cluster into read-only mode, provision a new cluster of your chosen size, and then copy data from your old cluster to your new one in parallel. You can continue running queries against your old cluster while the new one is being provisioned. Once your data has been copied to your new cluster, Amazon Redshift will automatically redirect queries to your new cluster and remove the old cluster.

Redshift is SQL based, so you can access it with your normal tools. It is fully managed, so backups and other admin concerns are automatic and automated. I'm not sure yet what tools you can use to design your database schemas or build the tables, since the database supports columnar data stores. Your data is also replicated across multiple nodes, so a design tool would need to be aware of that too.
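As an illustration of what distribution-aware table design might look like, here is a hedged sketch of Redshift-style DDL (the table and column names are invented) held in a Python string. The distribution key controls which node each row lands on, and the sort key drives the order of the columnar blocks on disk:

```python
# Hypothetical fact table; DISTKEY and SORTKEY are Redshift table attributes.
CREATE_SALES = """
CREATE TABLE sales (
    sale_id     BIGINT        NOT NULL,
    customer_id INTEGER       NOT NULL,
    sale_date   DATE          NOT NULL,
    amount      DECIMAL(12,2)
)
DISTKEY (customer_id)   -- rows for the same customer land on the same node
SORTKEY (sale_date);    -- blocks sorted by date, which helps range scans
"""
```

A plain-Postgres schema tool would emit the column list fine but would not know about those two trailing clauses, which is the gap the post is pointing at.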

You can also use Amazon RDS, Elastic MapReduce or DynamoDB to source data. You can also pull data directly from S3. All in all, I'm pretty excited to see this offering. I hope I get a client who wants to take a shot at this. I like working on AWS anyway, but I would love to work on a Redshift gig.
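Pulling data directly from S3 is done with Redshift's COPY command, issued over the same SQL connection as any other statement. A sketch, with the bucket path and credential placeholders all hypothetical:

```python
# COPY loads data files from S3 into a table, in parallel across nodes.
COPY_FROM_S3 = """
COPY sales
FROM 's3://my-bucket/sales/2013/'
CREDENTIALS 'aws_access_key_id=<key>;aws_secret_access_key=<secret>'
DELIMITER ',';
"""

# Executed like any other statement once connected, e.g. with psycopg2:
#   cursor.execute(COPY_FROM_S3)
```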

 

 

 

What’s the Difference Between Amazon’s S3 and EBS?

March 12th, 2009 Comments off

Have you been wondering what the differences between S3 and EBS are? I recently gave a high-level overview of S3 and I plan to do one on EBS. I also plan to follow up with a detailed look at both S3 and EBS.

In the meantime, Cloudiquity has posted an entry, Differences between S3 and EBS. This is a nice overview. It provides some excellent technical details as well as some pricing info. Well worth a read.

LewisC


Categories: cloud computing, cloud data

Amazon Web Services EC2 – Part 4: Transient Storage

March 8th, 2009 Comments off

Cloud Computing Info

Elastic Compute Cloud (EC2)

Transient Storage

The storage that comes with an AMI is called Transient Storage. That means that when the instance is stopped, the storage goes away. Any data or files saved while the instance was running are lost. This is by design.

To persist your data between sessions, you have two options. During most of the beta period, the Simple Storage Service (S3) was the only internal method of persisting data. S3 cannot be mounted as a file system, so it served as a backup service only.
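Using S3 that way from inside an instance might look like the following sketch. It assumes the boto library (a common AWS SDK for Python) and an existing bucket; the bucket, instance ID and file path are all hypothetical:

```python
def backup_key_name(instance_id, filename):
    # Group backups under a per-instance prefix so they are easy to find.
    return "backups/%s/%s" % (instance_id, filename)

def backup_to_s3(bucket_name, instance_id, path):
    # boto is assumed here; it reads AWS credentials from the environment.
    import boto
    conn = boto.connect_s3()
    bucket = conn.get_bucket(bucket_name)
    key = bucket.new_key(backup_key_name(instance_id, path.split("/")[-1]))
    key.set_contents_from_filename(path)  # the object outlives the instance

# Example (would run inside an instance with credentials configured):
#   backup_to_s3("my-backup-bucket", "i-12345678", "/var/backups/db.dump")
```

Because S3 is not a mounted disk, restoring means downloading the object back into the new instance before it can be used.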

Shortly before the beta period ended, Amazon added the Elastic Block Store (EBS). EBS is a mountable disk device. For database-oriented applications, EBS is pretty much mandatory.

Some applications may not need EBS. For example, a web-based catalog lookup may only use the storage that is contained within the AMI. Whenever the catalog changes, the AMI can be updated directly. This would allow you to mount as many catalogs (new AMI instances) as you need when you need to scale.

The important takeaway here is that you will need to plan (and pay) for persistent storage if your application has the need for non-transient data.

Both S3 storage and EBS will be explored in detail in the near future.

LewisC
