Archive for the ‘Technology’ Category

New mainframe application tuner from CA Technologies

Friday, September 23rd, 2011

The new mainframe application tuner helps simplify application performance management through automated detection that helps IT control costs by resolving problems before they impact users.

CA Technologies is an IT management software and solutions company. It has announced the CA Mainframe Application Tuner, which combines two application performance management (APM) tools with new integration capabilities to help IT organizations proactively pinpoint and resolve performance issues that could reduce user productivity and consume extra system resources.

“We are very pleased with the new functionality in CA Mainframe Application Tuner that helps us prevent problems before our users are impacted,” said Mike Bouros, VP of IT, Mainframe Systems Performance at Morgan Stanley & Co. “We are also excited about its integration into the large CA Technologies portfolio of IT solutions, which will significantly enhance and simplify our application development and performance analysis processes.”

CA Mainframe Application Tuner combines the advanced performance analysis and tuning capabilities of TRILOGexpert TriTune with the automated performance management of TRILOGexpert APC for TriTune.

CA Technologies has a non-exclusive, worldwide source agreement to develop, market and support this technology, thus facilitating innovation beyond its previous capabilities.

The new integration in CA Mainframe Application Tuner helps performance managers more quickly and easily identify and mitigate the root causes of application performance inefficiencies in z/OS-based systems to improve response times and lower CPU consumption.

To help streamline APM, development and testing activities, CA Mainframe Application Tuner integrates with other CA Technologies software including:

a. CA Technologies cross-platform APM solution, by automatically providing drill-down details about mainframe performance issues to IT analysts and insulating them from complexities of the applications and operating system.

b. CA Endevor Software Change Manager and key testing solutions including CA InterTest™ and CA SymDump, by automating and simplifying the process by which developers can view and update their programs, and helping to prevent manual errors.

c. CA Mainframe Software Manager, by significantly streamlining the acquisition, installation, deployment and maintenance of CA Mainframe Application Tuner.

Common hand ailments and preventive exercises

Sunday, September 18th, 2011

“my paper” listed a number of common hand ailments arising from prolonged usage of touch-screen devices such as tablets and smartphones, as well as a number of preventive exercises.

Some common hand ailments that my paper listed in its 14 Sep story included:

Common RSI and hand ailments and preventive exercises from my paper.


  • Cubital Tunnel Syndrome
  • Carpal Tunnel Syndrome
  • Trigger Fingers
  • DeQuervain’s Disease

According to the Hand & Reconstructive Microsurgery Centre at the National University Hospital (NUH), the number of people seeking treatment from the hospital for hand fatigue/injury has been on the rise since 2009.

For Carpal Tunnel Syndrome, the number of cases has increased 13% from 337 in 2009 to 382 in 2010. For DeQuervain’s Disease, the number has increased 60% from 160 in 2009 to 256 in 2010.

my paper also quoted a therapist from NUH as observing that women have a greater tendency than men to suffer from repetitive strain injuries (RSI). This was attributed to the smaller size of their wrists and to other gender-related factors, such as hormonal changes, pregnancy and menopause.

The article also featured four simple preventive exercises. Each could be done in sets of three repetitions, lasting about 5 seconds for each repetition.

my paper reports on touch screen epidemic

Saturday, September 17th, 2011

Four months ago, I mused about how my chiropractor was laughing all the way to the bank with the increasing number of patients he was seeing because of the proliferation of tablet computers and touch-screen devices. That anecdotal episode has now been substantiated by “my paper”.

Front page feature in the Chinese section of my paper on 14 September 2011.


In the front page feature in the Chinese section of my paper on Wednesday (14 September 2011), my paper reported that the number of people suffering from hand injuries has risen because of the popularity of handheld devices.

It reports that although individual touch-screen gestures seem like slight movements on their own, overly frequent use can result in arm fatigue, as well as discomfort in other parts of the body.

The feature related how a new touch-screen smartphone owner suffered – within a week – discomfort in her wrists, and numbness cum loss of strength in the fingers. Her Traditional Chinese Medicine (TCM) doctor diagnosed that her affliction was related to the daily usage of her touch-screen device.

What was her usage profile like? She played with the phone every 15 minutes, played games and surfed the net when she rode on the bus, occasionally used it at work to check for information, and sent SMS and email while lying in bed prior to falling asleep.

Does it sound like your usage profile?

The 31-year-old tutor related how her wrists felt bruised, and how she had problems typing and holding drinking cups. Her condition only improved after regular “Tui Na” (a form of TCM massage) and cutting down on her usage.

Another new touch-screen device user also experienced signs of muscle fatigue and discomfort to the fingers and neck. This 42-year-old chauffeur spends four to five hours every night playing games on the device, sometimes persisting even when his fingers ached.

Use of touch screen devices leads to more hand fatigue and injuries


A TCM doctor from Income Healthcare observed that the unrestrained usage of touch-screen devices has become the leading cause of muscle fatigue and injury, and the number of patients seeking treatment from clinics for such symptoms has increased gradually.

Because the muscles and nerves cannot withstand the prolonged and repeated typing and gesturing of the fingers necessitated by the use of touch-screen devices, fatigue and injuries to the fingers, wrists, muscles and joints occur, the TCM doctor told my paper.

Perhaps it’s time to examine our own usage of the touch-screen devices that increasingly pervade our digital lifestyle.

CA and HyTrust ally to secure virtual and cloud infrastructure

Thursday, September 15th, 2011

Complementary technologies from these technology alliance partners promise to arm organisations with tools to virtualise with confidence.

CA Technologies and HyTrust, which provides solutions for policy management and access control for virtual infrastructure, have announced a collaboration to further improve security and compliance for customers leveraging virtualized systems and cloud infrastructures.

“Virtualization presents one of the most significant security challenges an IT organization faces, with the IT administrator being essentially given the ‘keys to the kingdom’ – access to every system and application. In the move to the cloud, the added complexity and multi-tenancy aspects of the infrastructure bring management, security, and compliance issues to the forefront,” said Eric Chiu, founder and president of HyTrust.

HyTrust Appliance complements CA Access Control by administering various aspects of a virtual infrastructure such as unified access control, policy enforcement and audit-quality logging.

HyTrust also supports the recently announced CA Automation Suite for Clouds by providing the security policy and compliance capabilities needed to improve security of the cloud infrastructure.

Large enterprises continue to explore the benefits of the cloud based on their experiences with virtualization, according to a December 2010 study conducted by Management Insight on behalf of CA Technologies.

This expansion moves them to a more complex infrastructure that demands more sophisticated and automated management and security.

CA ARCserve r16 unifies data protection for enterprise IT

Thursday, September 8th, 2011

CA Technologies today announced CA ARCserve r16, a comprehensive hybrid data protection solution that enables customers and service providers to rigorously safeguard the availability of critical data, applications and services across their increasingly diverse mix of virtual, conventional, and cloud resources.

The comprehensive solution and simplified licensing enable customers to reduce costs and better mitigate risk.

CA ARCserve r16 unifies data protection across virtual, conventional and cloud resources to ease management of diverse IT infrastructures.

CA ARCserve r16 delivers this high-value capability with:

  • A modular architecture that unifies diverse data management functions and facilitates integration with third-party management tools and services
  • A simplified single license based on total amount of data being protected
  • Standby replication of virtual machine images
  • A new cloud connection layer that eases management of access to multiple public and private cloud storage resources

It unifies data management functions through a scalable, modular architecture that also facilitates integration with third-party software such as remote monitoring tools, management platforms, and solutions from service providers like Amazon Web Services, Microsoft, N-able and LabTech.

“Today’s IT organizations are spending inordinate time and money to protect critical services—often relying on a mix of ‘point’ backup solutions for conventional servers, virtual infrastructure and cloud resources that leave them exposed to the risk of a highly damaging outage,” said James Forbes-May, Vice President of Data Management, APAC at CA Technologies.

CA ARCserve r16 further simplifies management of heterogeneous IT environments by allowing customers to purchase multiple data protection functionalities with a single license based on the total amount of data they need to protect.

This licensing model also makes it easy for service providers to price and package complete data protection and management solutions to their customers. CA ARCserve r16 is also available through traditional licensing models.

Comprehensive Protection for Virtualized Infrastructure

A new offering, CA ARCserve r16 Central Host-Based VM Backup, empowers organizations to protect their virtual infrastructure with image-based host-level protection for VMware vSphere, as well as full system replication and high availability supporting Microsoft Hyper-V, VMware vSphere and Citrix XenServer.

“Downtime is not an option in today’s highly competitive world, where businesses must be more agile than ever despite finite resources,” said Eric Rockwell, president of centrexIT, a San Diego-based provider of technology management services. “We’re migrating our customers from Symantec to CA ARCserve for a variety of reasons—including its overhead-slashing infinite incrementals functionality, its ready-to-go virtualization support, its reliability and its unbeatable replication capabilities.”

By providing both full bare-metal recovery and full system failover of complete VMs, it enables organizations to meet their increasingly stringent restore SLAs. Recovery of individual applications and granular recovery of targeted files and folders can be accomplished within minutes.

With the new CA ARCserve r16 Central Virtual Standby, customers can schedule the automatic conversion of image-based recovery points to VMware Virtual Disk or Microsoft Virtual Hard Disk formats. By making these backup images available as standby virtual machines, customers can further speed the recovery of data and services.

Powerful Support for the Cloud

CA ARCserve r16 introduces a common cloud connection layer across all data protection components, including traditional file backup, disk imaging, replication and high availability. This provides integrated access to hybrid cloud storage, enabling customers to more readily take advantage of public and private cloud services such as Amazon Web Services, Microsoft Windows Azure and Eucalyptus, for purposes such as remote, off-site data protection, archiving and failover.

“The banks that rely on our data protection services are subject to strict regulatory scrutiny,” said Terry Oehring, founder and CEO of Solis Security, an MSP specializing in providing information security for financial institutions. “By leveraging CA ARCserve, we can reliably and cost-effectively enable our customers to meet this key challenge by accurately targeting the data that they need to protect and securely replicating it offsite in the cloud.”

CA ARCserve r16 also allows customers to use Amazon Elastic Compute Cloud (EC2) as their disaster recovery infrastructure, helping to ensure they can be up and running quickly in the event of a problem with their on-premises infrastructure. This cold standby technology can eliminate the significant cost of purchasing redundant hardware, and minimise cloud computing charges, as part of a business continuity plan.

CA ARCserve r16 also enables service providers and their customers to protect systems and data both locally (for fast, simple end-user or administrator recovery), and in the cloud (to protect critical data files offsite for disaster recovery).

Additional Enhancements across Entire Solution Set

“Since implementing CA ARCserve as the primary disaster recovery solution to help ensure the availability of key business systems in our highly virtualized environment, we’ve reduced our recovery time from more than three days to less than four hours,” said Prashanth Thirumlai, IT infrastructure manager at The Haskell Company, a $500 million integrated design-build firm based in Jacksonville, Florida.

All major components of CA ARCserve r16—including ARCserve Backup, ARCserve D2D and ARCserve Replication and High Availability—have been updated with significant enhancements, including new AES encryption to secure data in transit and at rest, as well as tighter integration between traditional and image-based backup.

In addition, CA ARCserve continues to offer integrated backup to disk, tape or cloud, storage resource management, infrastructure visualization and data deduplication.

Company byte: Akamai Technologies Inc

Tuesday, September 6th, 2011

Akamai (NASDAQ: AKAM) provides cloud-based services for optimizing Web and mobile content and applications, online HD video, and secure e-commerce.

Combining highly-distributed, energy-efficient computing with intelligent software, Akamai’s global platform aims to transform the cloud into a more viable place to inform, entertain, advertise, transact and collaborate.

Akamai has deployed a pervasive, highly distributed cloud optimisation platform with nearly 100,000 servers located in more than 650 cities, in 72 countries and within many of the key networks that make up the internet.

Akamai has more than 2,100 employees; its annual revenue in 2010 was $1,023.6 million, up 19 percent from the previous year. Follow postings related to Akamai Technologies on tech4tea.com.

Akamai’s product portfolio includes:

Digital Asset Solutions – to manage, store and deliver exceptional digital media experiences across a wide range of platforms/devices around the world. These include Akamai HD Network, Akamai Media Delivery and Electronic Software Delivery.

Dynamic Site Solutions – to speed up rich interactive content and accelerate all customer-driven transactions. These include Dynamic Site Accelerator and Dynamic Site Accelerator Enterprise.

Application Performance Solutions – to enable the public Internet to be the platform for accelerating any critical enterprise application to anyone, anytime, anywhere. These include Web Application Accelerator and IP Application Accelerator.

Advertising Decision Solutions – to enable more relevant online advertising, using Predictive Segments.

Notorious serial killer Ted Bundy’s DNA to be added to national database

Tuesday, August 2nd, 2011

Twenty-two years after his execution in 1989, a vial of Ted Bundy’s blood has been found in Florida that yielded a full DNA profile of the evil serial killer. The profile will be uploaded to the FBI’s national database on Friday. Investigators can use the newly available data to try to solve cases that went cold decades ago.

Mug shot, 1980, taken the day after sentencing for the murder of Kimberly Leach


The vial of blood was part of evidence taken in 1978 when Bundy was arrested for the murder of a 12-year-old girl. A tissue sample taken before he was executed and cremated had produced only a partial profile.

Bundy’s murders and execution predated the creation of the state and national databases that track millions of DNA samples of convicted offenders. Law enforcement agencies tap the databases to tackle unsolved crimes, build up evidence against suspects, or to clear them.

Bundy had confessed to more than 30 murders before he was executed but he was suspected of many more. One such case was the abduction of 8-year-old Ann Marie Burr, who disappeared from her Tacoma, Washington, home in 1961.

In custody, Florida, July 27, 1978


There has been much speculation and debate over whether she was one of his victims, even though he’d written to her parents before his execution saying he didn’t know what happened to the girl and denying having anything to do with her disappearance.

Fans of slasher and serial killer movies and TV dramas will be familiar with the notorious Ted Bundy. He seems to be the most quoted serial killer, probably because of how prolific he was and how his charismatic good looks contrast with his heinous crimes.

Unlike the profilers in the TV drama series “Criminal Minds”, who seem to eschew forensic evidence and frequently treat the crime scene with scant regard for forensic procedures, real investigators will hopefully be able to use the DNA profile to resolve cold cases and bring closure to the families of the victims.

The account below is extracted from Wikipedia’s entry about Ted Bundy. Photos used in this posting were also from Wikipedia.

1975 Utah mug shot


Theodore Robert “Ted” Bundy (born Theodore Robert Cowell; November 24, 1946 – January 24, 1989) was a serial killer, rapist, kidnapper, and necrophile who assaulted and murdered many young women between 1974 and 1978. After more than a decade of denials, he confessed to 30 homicides shortly before his execution at age 42.

In court in Florida


Bundy was handsome and charismatic, traits he exploited in winning the confidence of his young, attractive female victims. He typically approached them in public places and feigned injury or disability, or impersonated an authority figure, before overpowering and assaulting them at a more secluded location.

Press conference in July 1978


He sometimes revisited his secondary crime scenes for hours at a time, grooming and performing sexual acts with the decomposing corpses until putrefaction and destruction by wild animals made further interaction impossible. He decapitated at least four victims and kept the severed heads in his apartment for a period of time as mementos. On a few occasions he simply broke into dwellings in the dead of night and bludgeoned victims as they slept.

Initially charged in Utah in 1975 and convicted of aggravated kidnapping and attempted criminal assault, Bundy became linked to a progressively longer list of unsolved homicides in six states. Facing murder charges in Colorado, he engineered two dramatic escapes, and committed at least three additional murders and several other violent assaults in Florida before his ultimate recapture in 1978. He received three death sentences in two separate trials for the three known Florida homicides.

Akamai Launches Global NetAlliance Partner Program

Tuesday, July 19th, 2011

The company helps systems integrators, application and infrastructure service providers, and technology partners to address the challenges of doing business in the cloud.

Akamai has announced its NetAlliance Partner Program for global IT solution providers. This program will help partners to integrate Akamai services into their offerings, and to pass the value of Akamai’s globally-distributed platform on to customers. With this program, Akamai is recruiting partners across platforms to help its customers leverage the Cloud for business success.

The Akamai NetAlliance partner program is designed to enable new members to:

  • Leverage integrated solutions with leaders in the public, private and hybrid-cloud markets
  • Add and scale revenue streams to their business with Akamai solutions that have a low barrier-to-entry
  • Help customers reduce costs and improve efficiencies in their datacenter consolidation by off-loading traffic to Akamai
  • Integrate a secure Internet platform to their cloud offerings by providing unmatched quality and performance
  • Increase global reach through Akamai’s global Internet platform
  • Access robust market development funds and Akamai training/certification programs

“As the demands on today’s enterprise rapidly evolve, IT decision makers must find the optimal solutions for cloud computing, security, mobile strategies, and site and application performance,” said Brad Rinklin, vice president of Global Marketing & Alliances for Akamai.

“Our focus on creating a diverse channel has allowed us to bring customers best-of-breed solutions for leveraging cloud business models without compromising security, reliability or performance. The Akamai NetAlliance Partner Program is about creating an eco-system of proven technology leaders in all regions of the world who are committed to advancing how business is conducted online,” Rinklin added.

Having built global alliances with IT leaders including HP, Jive Software, Rackspace® Hosting, Telefónica and Verizon, Akamai has also established partnerships with leading regional technology providers and integrators around the world to collaborate on solutions for today’s enterprise.

Some of those existing alliances across Asia Pacific Japan; Europe, Middle East and Africa; and South America, include 77 Agency, Arturai, Boreus, DMX Technologies Sdn. Bhd, Exceda, Kinmax Technology Inc., and Mikotek International Corporation, among others.

“2011 is a big year for extending Akamai’s channel as we made significant global investments to help our partners scale their business with our NetAlliance Partner Program,” said Martin Häring, vice president of International Marketing and Channels at Akamai.  “Our partners leverage Akamai’s innovative services to overcome inefficiencies inherent across the Internet. In collaboration with NetAlliance partners, Akamai is transforming the Internet into a reliable, high-performance platform for cloud services or any other Internet-based enterprise application.”

Company byte: CA Technologies

Sunday, July 10th, 2011

CA Technologies (NASDAQ: CA) is an IT management software and solutions company with expertise across all IT environments – from mainframe and distributed, to virtual and cloud.

CA Technologies manages and secures IT environments and enables customers to deliver more flexible IT services. CA Technologies’ products and services provide the insight and control essential for IT organizations to power business agility.

The majority of the Global Fortune 500 relies on CA Technologies to manage evolving IT ecosystems.


Behind-the-scenes story about Microsoft Kinect for Xbox 360

Saturday, July 9th, 2011

Dr Jamie Shotton had joined the Machine Learning & Perception group at Microsoft Research Cambridge (MSRC) in June 2008 as a post-doc for a few months when he was roped in by the Xbox product group to help launch the product by Christmas 2010.

He shared the experience with 4th year undergraduate Engineering students at the University of Cambridge Engineering Department earlier this year.

The body was divided into 31 different body parts to be recognised and reconstituted into a human pose.


I was browsing through the university’s newsletter last week when I came upon this interesting story about some of the developmental challenges of the Microsoft Kinect for Xbox 360 and how they were surmounted. You can read the full original article here. Images used in this posting are from the original article.

The Kinect for Xbox 360 is a motion sensing input device for the Xbox 360 game console. Based around a webcam-style add-on accessory for the Xbox 360 console, it allows users to control and interact with the Xbox 360 without the need to touch or hold a game controller such as a joystick – depending instead on bodily gestures and spoken commands.

Dr Jamie Shotton from the Cambridge research laboratory in the UK


Shotton now works for Microsoft at their Cambridge research laboratory in the UK. He had completed his PhD research in computer vision from 2003 to 2007. His initial research at the MSRC was on automatic visual object recognition – teaching computers how to recognise different types of objects in photographs such as cars, sheep and trees.

“Little did I know at that point how quickly I would get pulled into the frenzy of research and development around Kinect, and how this blue-skies research could be applied to such a practical problem,” Shotton recalled.

Enabling tools

At the point that Shotton was invited, Microsoft had already developed a few enabling tools.

Shotton's research into automatic visual object recognition trained computers to recognise different objects in photographs.

Depth-sensing camera. The new Kinect camera worked at 320×240 pixels and 30 frames per second versus other depth cameras at very low resolutions of 10×10 pixels. “You could even make out the nose and eyes on your face,” Shotton observed. The better depth accuracy helped with human pose estimation by eliminating objects in the background since they were further away. The colour and texture of clothing, skin and hair could also be normalised away. The depth camera was “active”, illuminating the subject with its own structured dot pattern of infra-red light so that the camera worked even in the dark.

Prototype human tracking algorithm.  The algorithm constantly compares its predictions of the body’s movements with the actual movements and then makes adjustments to improve the accuracy of its predictions.
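The predict-and-correct idea behind that prototype tracker can be sketched in a few lines. This is purely illustrative, not the actual Kinect algorithm: it tracks a one-dimensional joint with a constant-velocity model and a made-up blending gain.

```python
# Minimal predict-and-correct tracking sketch (illustrative only).
# State: a 1-D joint position; the prediction assumes constant velocity.

def track(observations, gain=0.5):
    """Blend each constant-velocity prediction with the observed position."""
    estimate, velocity = observations[0], 0.0
    estimates = [estimate]
    for observed in observations[1:]:
        predicted = estimate + velocity     # predict the next position
        error = observed - predicted        # compare with the actual movement
        estimate = predicted + gain * error # adjust the estimate...
        velocity += gain * error            # ...and the motion model itself
        estimates.append(estimate)
    return estimates

print(track([0.0, 1.0, 2.0, 3.0]))  # → [0.0, 0.5, 1.5, 2.75]
```

A joint moving at constant speed is tracked ever more closely as the velocity estimate converges; an erratic jump produces a large error that the gain only partially absorbs, which is exactly the failure mode described below.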

Showstoppers

The tracking algorithm suffered from three limitations. First, the subject had to stand in a T-pose for the algorithm to lock on initially. Second, if the subject moved too erratically and therefore unpredictably, the algorithm would lose track and would not be able to recover until the subject returned to the T-pose for recalibration. This could happen as often as every 5-10 seconds. Finally, the algorithm only worked with the limited number of body sizes and shapes that it had been trained on. Shotton’s mission was to overcome these showstoppers.

Overcoming the limitations

To allow the algorithm to recognise a subject and its posture without having to start from a T-pose, Shotton leveraged a technique called “chamfer matching”, developed by a fellow researcher, Dr Stenger: the subject’s image was compared with a training database of body images and, once the closest match was selected, the 3D data for that match could be used as the human pose for the subject.
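The flavour of this template-matching step can be shown with a toy example. Real chamfer matching scores edge images using distance transforms; the sketch below just counts disagreeing silhouette pixels, and the database, masks and pose labels are all made up:

```python
# Toy nearest-template matching in the spirit of chamfer matching
# (illustrative only; real chamfer matching uses distance transforms).

def mismatch(a, b):
    """Number of pixels where two binary silhouettes disagree."""
    return sum(pa != pb for pa, pb in zip(a, b))

def closest_pose(silhouette, database):
    """Return the 3-D pose stored with the most similar training silhouette."""
    best = min(database, key=lambda entry: mismatch(silhouette, entry["mask"]))
    return best["pose"]

database = [
    {"mask": [1, 1, 0, 0], "pose": "arms-raised"},
    {"mask": [1, 0, 1, 0], "pose": "t-pose"},
]
print(closest_pose([0, 1, 0, 0], database))  # → arms-raised
```

The pose stored alongside the winning template is reused wholesale, which is why the approach needs a template for every pose it should ever output, the combinatorial problem the next paragraph describes.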

However, there was an astronomical number of human poses based on the different combinations of position and orientation of body parts such as the arms, legs, knees and ankles. Shotton divided up the body into 31 parts so that each of the parts could be matched independently before building up the skeleton and body pose from the position of these parts. This was where Shotton’s PhD work on object recognition came in handy.
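The part-based idea can be sketched as: label each foreground depth pixel with a body part, then propose a joint position per part from the labelled pixels. The two-part "classifier" here is a stand-in threshold rule with invented part names; Kinect actually used decision forests trained on millions of depth images.

```python
# Illustrative part-labelling sketch (the classifier is a toy stand-in).

def label_pixels(depth_image):
    """Assign a (made-up) part label to each foreground pixel."""
    labels = {}
    for (x, y), depth in depth_image.items():
        if depth is None:                              # background pixel
            continue
        labels[(x, y)] = "head" if y < 2 else "torso"  # toy two-part rule
    return labels

def part_centroids(labels):
    """Average the pixel coordinates of each part to propose joint positions."""
    sums, counts = {}, {}
    for (x, y), part in labels.items():
        sx, sy = sums.get(part, (0, 0))
        sums[part] = (sx + x, sy + y)
        counts[part] = counts.get(part, 0) + 1
    return {part: (sx / counts[part], sy / counts[part])
            for part, (sx, sy) in sums.items()}

silhouette = {(x, y): 1.0 for x in range(4) for y in range(6)}  # toy 4x6 blob
print(part_centroids(label_pixels(silhouette)))
```

Because each pixel is classified independently, no pose initialisation is needed, which is what removes the T-pose requirement.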

Although this substantially reduced the size of the image database needed to train the algorithm, the training database was still huge. The team had recorded hours of footage at a motion capture studio with several actors doing “gaming” moves such as dancing, running, fighting and driving.

Training the algorithm on the millions of training images would have taken months on a single machine. The team got help from colleagues at Microsoft Research in Silicon Valley who had developed an engine called “Dryad” for efficient and reliable distributed computation. Using a cluster of 100 powerful computers, the training time was reduced to less than a day.
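The pattern behind such a distributed training run, partition the images across workers, compute partial statistics on each shard, then merge, can be sketched like this (the shard counts and function names are illustrative and not Dryad's API):

```python
# Map-reduce sketch of data-parallel training (illustrative only).

def partition(items, n_workers):
    """Split the training set into one shard per worker."""
    return [items[i::n_workers] for i in range(n_workers)]

def shard_histogram(shard):
    """Per-worker step: count how often each part label occurs in the shard."""
    counts = {}
    for label in shard:
        counts[label] = counts.get(label, 0) + 1
    return counts

def merge(histograms):
    """Driver step: combine the partial histograms into global statistics."""
    total = {}
    for h in histograms:
        for label, c in h.items():
            total[label] = total.get(label, 0) + c
    return total

labels = ["head", "torso", "head", "arm"] * 25       # 100 toy training labels
merged = merge(shard_histogram(s) for s in partition(labels, 4))
print(merged)
```

Since each shard is processed independently, 100 machines can work in parallel and only the small per-shard summaries travel back to the driver, which is roughly how months of work compresses into under a day.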

Read the details of Shotton’s experience in the full original article here.