
Healthcare Dives Into Big Data

With the mandated adoption of electronic health records (EHRs), many healthcare professionals for the first time got centralized access to patient records. Now they’re figuring out how to use all this information. Although the healthcare industry has been slow to delve into big data, that might be about to change. At stake: not only money saved from more efficient use of information, but also new research and treatments — and that’s just the beginning.

For instance, data from wireless wearable devices such as Fitbit trackers is expected to eventually flood providers and insurers; by 2019, spending on wearables-data collection will reach $52 million, according to ABI Research. Another source of health data waiting to be analyzed: social media. Monitoring what people post can help fight insurance fraud and improve customer service.

These are just two ways big data can be used to improve care while cutting costs, experts say.

“We, as a society, need to start creating our own metrics for how healthcare quality is defined. In the sense of looking at costs, we know where there’s avoidable cost in healthcare. We just need to get folks the data they need to avoid those pitfalls,” said Dr. Anil Jain, senior VP and chief medical officer at Explorys, in an interview. Explorys, an innovation spinoff from Cleveland Clinic, is powering Accenture’s Predictive Health Intelligence in a collaboration intended to help life sciences companies determine the combination of treatments and services that can lead to better patient, provider, and economic outcomes for diabetics.

Hosted analytics, partnerships and collaborations, and lower-cost internal applications open the door for smaller organizations to use big data, too.

“Earlier, data warehousing and analytics was restricted to larger organizations because it was cost prohibitive. What big data has done has brought it down to smaller orgs. But the biggest challenge with these smaller markets and mid-tier organizations is resources,” Manmeet Singh, co-founder and CEO of Dataguise, told us. “Cloud is becoming very prevalent. They’re going to store a lot of data in the cloud. They’ll outsource a lot of that data to the cloud. Automation of compliance is important.”

Having witnessed the impact that big data and analytics have on other markets — and perhaps on competing healthcare organizations — healthcare CEOs want to know how their organizations can use these tools. In a PwC study, 95% of healthcare CEOs said they were exploring better ways to harness and manage big data.

Increasingly, CIOs can find similar organizations with pilot or full-blown projects. Forest Laboratories, for example, is collaborating with ConvergeHealth by Deloitte and Intermountain Healthcare on research to benefit patients with respiratory diseases. Using the collaborative, rapid-learning system developed by Intermountain and ConvergeHealth, Forest’s researchers use OutcomesMiner analytics software to develop new treatments and therapeutic products and improve patient outcomes.

The move to value-based payments means healthcare providers are taking on more risk, says Jeff Elton, managing director of Life Sciences for Accenture. To manage risk and treat patients most appropriately, providers need data — accurate data from a range of sources, he tells us.

Expanding use of big data across healthcare organizations should sound some alarms within C-level suites, Singh cautions. “From my perspective, security and compliance should be discussed from the get-go. It should be part of their overall strategy.”

In the meantime, some healthcare organizations already have plunged into big-data analytics, with impressive results. Click through our slideshow to see some innovative uses of analytics in healthcare.

How are you using big data in healthcare projects? Let us know in the comments section.

Alison Diana has written about technology and business for more than 20 years. She was editor, contributors, at Internet Evolution; editor-in-chief of 21st Century IT; and managing editor, sections, at CRN. She has also written for eWeek, Baseline Magazine, and Redmond Channel.






4 Hot Open Source Big Data Projects

There’s more — much more — to the Big Data software ecosystem than Hadoop. Here are four open source projects that will help you get big benefits from Big Data.

It’s difficult to talk about Big Data processing without mentioning Apache Hadoop, the open source Big Data software platform. But Hadoop is only part of the Big Data software ecosystem. There are many other open source software projects emerging to help you get more from Big Data.

Here are a few interesting ones that are worth keeping an eye on.

Spark

Spark bills itself as providing “lightning-fast cluster computing” that makes data analytics fast to run and fast to write. It’s being developed at UC Berkeley AMPLab and is free to download and use under the open source BSD license.

So what does it do? Essentially it’s an extremely fast cluster computing system that can hold data in memory. It was designed for two applications where keeping data in memory is an advantage: running iterative machine learning algorithms, and interactive data mining.

It’s claimed that Spark can run up to 100 times faster than Hadoop MapReduce in these environments. Spark can access any data source that Hadoop can access, so you can run it on any existing data sets that you have already set up for a Hadoop environment.
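To make the iterative pattern concrete, here is a minimal sketch using Spark’s Python API (it assumes a local Spark installation and a placeholder data file; the gradient update is a toy example, not a real algorithm):

```python
from pyspark import SparkContext

sc = SparkContext("local[*]", "iterative-demo")

# Load the dataset once and cache it in cluster memory.
points = (sc.textFile("data/points.txt")          # placeholder path
            .map(lambda line: [float(x) for x in line.split()])
            .cache())

# Each pass reads from memory instead of re-scanning disk, which is
# where Spark's speed-up over MapReduce comes from.
weight = 0.0
for _ in range(10):
    gradient = points.map(lambda p: p[0] * p[1] - weight).sum()
    weight += 0.1 * gradient

print("final weight:", weight)
sc.stop()
```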

Drill

Apache Drill is “a distributed system for interactive analysis of large-scale datasets.”

MapReduce is often used to perform batch analysis on Big Data in Hadoop, but what if batch processing isn’t suited to the task at hand? What if you want fast results to ad-hoc queries so you can carry out interactive data analysis and exploration?

Google developed its own solution to this problem for internal use with Dremel, and you can access Dremel as a service using Google’s BigQuery.

However, if you don’t want to use Google’s Dremel on a software-as-a-service basis, Apache is backing Drill as an incubation project. It’s based on Dremel, and its design goal is to scale to 10,000 servers or more and to be able to process petabytes of data and trillions of records in seconds.
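Drill exposes an HTTP endpoint for submitting ad-hoc SQL, so a sketch of an interactive query can be as simple as the following (this assumes Drill running in embedded mode on its default port 8047; the file path is a placeholder):

```python
import requests

query = {
    "queryType": "SQL",
    # Drill queries raw files in place; no schema-loading step required.
    "query": "SELECT * FROM dfs.`/data/logs/part-00000.json` LIMIT 10",
}

resp = requests.post("http://localhost:8047/query.json", json=query)
resp.raise_for_status()

for row in resp.json().get("rows", []):
    print(row)
```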

D3.js

D3 stands for Data Driven Documents, and D3.js is an open source JavaScript library which allows you to manipulate documents that display Big Data. It was developed by New York Times graphics editor Michael Bostock.

Using D3.js you can create dynamic graphics using Web standards like HTML5, SVG and CSS. For example, you can generate a plain old HTML table from an array of numbers, but more impressively you can make an interactive bar chart using scalable vector graphics from the same data.

That barely scratches the surface of what D3 can do, however. There are dozens of visualization methods — like chord diagrams, bubble charts, node-link trees and dendrograms — and thanks to D3’s open source nature, new ones are being contributed all the time.

D3 has been designed to be extremely fast; it supports Big Data-scale datasets and works across hardware platforms. That has made it an increasingly popular tool for graphical visualizations of the results of Big Data analysis. Expect to see more of it in the coming months.

HCatalog

HCatalog is an open source metadata and table management framework that works with Hadoop HDFS data, and which is distributed under the Apache license. It’s being developed by engineers at Hortonworks, one of the main commercial sponsors of Apache Hadoop.

The idea of HCatalog is to liberate Big Data by allowing different tools to share a common view of the data. That means Hadoop users working with a tool like Pig or MapReduce or Hive have immediate access to data created with another tool, without any loading or transfer steps. Essentially it makes the Hive metastore available to users of other tools on Hadoop by providing connectors for MapReduce and Pig. Users of those tools can read data from and write data to Hive’s warehouse.

It also has a command line tool, so that users who do not use Hive can operate on the metastore with Hive Data Definition Language statements.
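As an illustration, that command line tool can be driven from a script; this sketch registers a table in the shared metastore (it assumes the hcat binary is on the PATH, and the table definition is a hypothetical example):

```python
import subprocess

# Hive DDL executed through HCatalog's CLI; once registered, the table
# is visible to Pig, MapReduce and Hive alike.
ddl = (
    "CREATE TABLE web_logs (ip STRING, url STRING, ts BIGINT) "
    "STORED AS TEXTFILE"
)

subprocess.run(["hcat", "-e", ddl], check=True)
```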

And More

Other open source big data projects to watch:

Storm. Storm makes it easy to reliably process unbounded streams of data, doing for real-time processing what Hadoop did for batch processing.

Kafka. Kafka is a messaging system that was originally developed at LinkedIn to serve as the foundation for LinkedIn’s activity stream and operational data processing pipeline. It is now used at a variety of different companies for various data pipeline and messaging uses.
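A minimal produce-and-consume round trip, sketched with the third-party kafka-python package (broker address and topic name are assumptions for illustration):

```python
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("activity-stream", b'{"user": 42, "action": "click"}')
producer.flush()

consumer = KafkaConsumer("activity-stream",
                         bootstrap_servers="localhost:9092",
                         auto_offset_reset="earliest")
for message in consumer:
    print(message.value)
    break  # demo: read one message and stop
```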

Julia. Julia is a high-level, high-performance dynamic programming language for technical computing, with syntax that is familiar to users of other technical computing environments.

Impala. Cloudera Impala is a distributed query execution engine that runs against data stored natively in Apache HDFS and Apache HBase.
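A hedged sketch of an Impala query from Python, using the third-party impyla package (host and port are the usual defaults; the table is hypothetical):

```python
from impala.dbapi import connect

conn = connect(host="localhost", port=21050)  # Impala's HiveServer2 port
cur = conn.cursor()
cur.execute("SELECT url, COUNT(*) AS hits FROM web_logs GROUP BY url")
for url, hits in cur.fetchall():
    print(url, hits)
```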

Paul Rubens has been covering IT security for over 20 years. In that time he has written for leading UK and international publications including The Economist, The Times, Financial Times, the BBC, Computing and ServerWatch.






6 Big Data Analytics Use Cases for Healthcare IT

BOSTON—The increasing digitization of healthcare data means that organizations often add terabytes’ worth of patient records to data centers annually.

At the moment, much of that unstructured data sits unused, having been retained largely (if not solely) for regulatory purposes. However, as speakers at the inaugural Medical Informatics World conference suggest, a little bit of data analytics know-how can go a long way.

It isn’t easy, not least because the demand for healthcare IT skills far outpaces the supply of workers able to fill job openings. But a better grasp of that data means knowing more about individual patients as well as large groups of them, and knowing how to use that information to provide better, more efficient and less expensive care.

Here are six real-world examples of how healthcare can use big data analytics.

1. Ditch the Cookbook, Move to Evidence-Based Medicine

Cookbook medicine refers to the practice of applying the same battery of tests to all patients who come into the emergency department with similar symptoms. This is efficient, but it’s rarely effective. As Dr. Leana Wen, an ED physician and co-author of When Doctors Don’t Listen, puts it, “Having our patient be ‘ruled out’ for a heart attack while he has gallstone pain doesn’t help anyone.”

Dr. John Halamka, CIO at Boston’s Beth Israel Deaconess Medical Center, says access to patient data—even from competing institutions—helps caregivers take an evidence-based approach to medicine. To that end, Beth Israel is rolling out a smartphone app that uses a Web-based drag-and-drop UI to give caregivers self-service access to 200 million data points about 2 million patients.

Admittedly, the health information exchange process necessary for getting that patient data isn’t easy, Halamka says. Even when data’s in hand, analytics can be complicated; what one electronic health record (EHR) system calls “high blood pressure” a second may call “elevated blood pressure” and a third “hypertension.” To combat this, Beth Israel is encoding physician notes using the SNOMED CT standard. In addition to the benefit of standardization, using SNOMED CT makes data more searchable, which aids the research query process.
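The normalization problem Halamka describes can be pictured as a simple mapping from each EHR’s free-text label onto one standard concept. Here is a toy sketch (38341003 is the SNOMED CT concept for hypertensive disorder; the label variants and fallback behavior are illustrative assumptions):

```python
# Map free-text problem labels from different EHRs to one SNOMED CT code.
SNOMED_MAP = {
    "high blood pressure": "38341003",
    "elevated blood pressure": "38341003",
    "hypertension": "38341003",
}

def normalize_problem(label):
    """Return a SNOMED CT concept ID for a free-text problem label."""
    return SNOMED_MAP.get(label.strip().lower(), "unmapped")

assert normalize_problem("Hypertension") == "38341003"
```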

2. Give Everyone a Chance to Participate

The practice of medicine cannot succeed without research, but the research process itself is flawed, says Leonard D’Avolio, associate center director of biomedical informatics for MAVERIC within the U.S. Department of Veterans Affairs. Randomized controlled trials can last many years and cost millions of dollars, he says, while observational studies can suffer from inherent bias.

The VA’s remedy has been the Million Veteran Program, a voluntary research program that’s using blood samples and other health information from U.S. military veterans to study how genes affect one’s health. So far, more than 150,000 veterans have enrolled, D’Avolio says.

All data is available to the VA’s 3,300 researchers and its hospital academic affiliates. The idea, he says, is to embed the clinical trial within VistA, the VA EHR system, with the data then used to augment clinical decision support.

3. Build Apps That Make EHR ‘Smart’

A data warehouse is great, says John D’Amore, founder of clinical analytics software vendor Clinfometrics, but it’s the healthcare equivalent of a battleship: big and powerful, but it comes with a hefty price tag and isn’t suitable for many types of battles. It’s better to use lightweight drones—in this case, applications—which are easy to build in order to accomplish a specific task.

To accomplish this, you’ll need records that adhere to the Continuity of Care Document (CCD) standard. A certified EHR must be able to generate a CCD file, and this is often done in the form of a patient care summary. In addition, D’Amore says, you’ll need to use SNOMED CT as well as LOINC to standardize your terminology.
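Because a CCD is HL7 v3 XML, a lightweight app can get at its coded entries with nothing more than the standard library. A hypothetical sketch (the file name is a placeholder, and real CCDs vary in structure):

```python
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}  # the HL7 v3 namespace CCDs use

tree = ET.parse("patient_care_summary.xml")  # placeholder file name

# Coded entries carry a code, a code system (SNOMED CT, LOINC...) and a
# human-readable display name.
for value in tree.getroot().iterfind(".//hl7:observation/hl7:value", NS):
    print(value.get("code"), value.get("codeSystem"), value.get("displayName"))
```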

Echoing Halamka, co-presenter Dean Sittig, professor in the School of Biomedical Informatics at the University of Texas Health Science Center at Houston, acknowledges that this isn’t easy. Stage 1 of meaningful use, the government incentive program that encourages EHR use, makes the testing of care summary exchange optional, and at the moment fewer than 25 percent of hospitals are doing so.

The inability of EHRs and health and wellness apps to communicate among themselves is a “significant limitation,” Sittig says. This is something providers will learn the hard way when stage 2 of meaningful use begins in 2014, D’Amore adds.

That said, the data that’s available in CCD files can be put to use in several ways, D’Amore says, ranging from predictive analytics that can reduce hospital readmissions to data mining rules that look at patient charts from previous visits to fill gaps in current charts. The latter scenario has been proven to nearly double the number of problems that get documented in the patient record, he adds.
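The gap-filling rule D’Amore describes boils down to set arithmetic over coded problem lists. A hypothetical sketch (the SNOMED CT codes stand in for real chart entries, and a production rule would add clinician review):

```python
def find_chart_gaps(previous_visits, current_chart):
    """Return problems seen on earlier charts but absent from today's."""
    historical = set().union(*previous_visits) if previous_visits else set()
    return historical - current_chart

prior = [{"38341003", "44054006"}, {"38341003"}]  # SNOMED-coded problems
today = {"38341003"}
print(find_chart_gaps(prior, today))              # -> {'44054006'}
```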

4. ‘Domesticate’ Data for Better Public Health Reporting, Research

Stage 2 of meaningful use requires organizations to submit syndromic surveillance data, immunization registries and other information to public health agencies. This, says Brian Dixon, assistant professor of health informatics at Indiana University and research scientist with the Regenstrief Institute, offers a great opportunity to “normalize” raw patient data by mapping it to LOINC and SNOMED CT, as well as by performing real-time natural language processing and using tools such as the Notifiable Condition Detector to determine which conditions are worth reporting.

Dixon compares this process to the Neolithic Revolution, the shift from hunter-gatherer to agrarian society approximately 12,000 years ago. Healthcare organizations no longer need to hunt for and gather data; now, he says, the challenge is to domesticate the data, bringing it under the informaticist’s control.

The benefits of this process—in addition to meeting regulatory requirements—include research that takes into account demographic information as well as corollary tests related to specific treatments. This eliminates gaps in records that public health agencies often must fill with phone calls to already burdened healthcare organizations, Dixon notes. In return, the community data that physicians receive from public health agencies will be robust enough to offer what Dixon dubs “population health decision support.”

5. Make Healthcare IT Vendors Articulate SOA Strategy

Dr. Mark Dente, managing director and chief medical officer for MBS Services, recommends that healthcare organizations “aggregate clinical data at whatever level you can afford to do it,” then normalize that data (as others explain above). This capability to normalize data sets in part explains the growth and success of providers such as Kaiser Permanente and Intermountain Healthcare, he says.

To do this, you need to create modules and apps such as the ones D’Amore describes. This often requires linking contemporary data sets to legacy IT architecture. The MUMPS programming language, originally designed in 1966, has served healthcare’s data processing needs well, but data extraction is difficult, Dente says.

Service-oriented architecture is the answer, Dente says, because it can be built to host today’s data sets—as well as tomorrow’s, from sources that organizations don’t even know they need yet. (This could range from personal medical devices to a patient’s grocery store rewards card.) Challenge vendors on their SOA strategy, Dente says, and be wary of those who don’t have one.

6. Use Free Public Health Data For Informed Strategic Planning

Strategic plans for healthcare organizations often resort to reactive responses to the competitive market and a “build it and they will come” mentality, says Les Jebson, director of the Diabetes Center of Excellence within the University of Florida Academic Health System. Taking a more proactive approach requires little more than some programming know-how.

Using Google Maps and free public health data, the University of Florida created heat maps for municipalities based on numerous factors, from population growth to chronic disease rates, and compared those factors to the availability of medical services in those areas. When merged with internal data, strategic planning becomes both visually compelling (critical for C-level executives) and objective (critical for population health management), Jebson says.

With this mapping, for example, the university found three Florida counties that were underserved for breast cancer screening and thus redirected its mobile care units accordingly.
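The mapping approach translates readily to open tools. A minimal sketch of the idea (folium and a public CSV of county-level screening rates are assumptions for illustration; the university’s actual stack was Google Maps plus public health data):

```python
import folium
import pandas as pd

# Expected columns: county, lat, lon, rate (fraction of eligible patients screened)
rates = pd.read_csv("county_screening_rates.csv")  # placeholder file

m = folium.Map(location=[28.5, -82.0], zoom_start=6)  # centered on Florida
for _, row in rates.iterrows():
    folium.CircleMarker(
        location=[row["lat"], row["lon"]],
        radius=8,
        color="red" if row["rate"] < 0.5 else "green",  # underserved in red
        fill=True,
        tooltip="{}: {:.0%} screened".format(row["county"], row["rate"]),
    ).add_to(m)

m.save("screening_heatmap.html")
```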






Bond Market’s Big Illusion Revealed as U.S. Yields Turn Negative

For Kaoru Sekiai, getting steady returns for his pension clients in Japan used to be simple: buy U.S. Treasuries.

Compared with his low-risk options at home, like Japanese government bonds, Treasuries have long offered the highest yields around. And that’s been the case even after accounting for the cost to hedge against the dollar’s ups and downs — a common practice for institutions that invest internationally.

It’s been a “no-brainer since forever,” said Sekiai, a money manager at Tokyo-based DIAM Co., which oversees about $166 billion.

That truism is now a thing of the past. Last month, yields on U.S. 10-year notes turned negative for Japanese buyers who pay to eliminate currency fluctuations from their returns, something that hasn’t happened since the financial crisis. It’s even worse for euro-based investors, who are locking in sub-zero returns on Treasuries for the first time in history.


That quirk means the longstanding notion of the U.S. as a respite from negative yields in Japan and Europe is little more than an illusion. With everyone from Jeffrey Gundlach to Bill Gross warning of a bubble in bonds, it could ultimately upend the record foreign demand for Treasuries, which has underpinned their seemingly unstoppable gains in recent years.

“People like a simple narrative,” said Jeffrey Rosenberg, the chief investment strategist for fixed income at BlackRock Inc., which oversees $4.6 trillion. “But there isn’t a free lunch. You can’t simply talk about yield differentials without talking about currency differentials.”

DIAM’s Sekiai has been shunning Treasuries since April, a month after foreign holdings of U.S. debt hit a record. Instead, he favors bonds of France and Italy because they “offer some degree of yield and the currency-hedging costs are cheap.” That shift lines up with the latest available Treasury Department data, which showed that demand from non-U.S. investors in April and May was the weakest for a two-month stretch since 2013.

The fact that yields on 10-year Treasuries are still way higher than those in Japan or Germany is part of the reason foreigners are having such a hard time actually profiting from the difference. Negative interest rates outside the U.S. have caused a surge in demand for dollars and dollar assets, pushing up the cost to get into and out of the greenback at the same exchange rate to levels rarely seen in the past.

Ten-year yields in the U.S. are currently about 0.23 percentage point below a basket of bonds from Australia, France, Germany, Italy, Japan, Spain and Switzerland on a hedged basis, versus 1.4 percentage points above on an unhedged basis, according to data compiled by BlackRock. At the start of the year, hedged Treasuries yielded over a half-percentage point more.

In Japan, where 10-year government bonds yield less than zero, the advantage for Treasuries has dwindled from a percentage point at the start of the year to less than 0.1 percentage point now. Without much added value for overseas investors, it’s harder to see foreign demand driving Treasuries to new records, especially as the Federal Reserve moves toward gradually raising rates.

Since falling to a record 1.318 percent on July 6, yields on 10-year notes have backed up as a string of economic reports such as last week’s jobs data bolstered the case for higher rates. They were at 1.58 percent today.

For a large swathe of institutional investors, especially those with conservative mandates, hedging is the norm when they go abroad. It eliminates the need to worry about the daily ebbs and flows in exchange rates and how that might affect their returns. When it comes to Treasuries, overseas buyers usually lock in a fixed exchange rate on the interest payments they get in dollars.

Conversion Costs

In that trade, the cost to convert payments from one currency to another is determined by the cross-currency basis swap. Take Japanese insurers as an example. Under normal circumstances, they would swap their yen for dollars and get interest on the yen they loaned out over the course of the contract.

But now, because the rate has turned negative, they’re effectively paying interest to lend the yen, which eats into their bond returns. That’s on top of the Libor rate they’ll need to pay for borrowing the dollars, which currently stands at 0.79 percent over three months.

The basis, as it’s known, was at minus 0.6425 percentage point for yen-based investors, which is close to the most expensive in five years. For those with euros, the basis is minus 0.43 percentage point. That’s more than twice as costly as the average over the past three years.
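Put those quoted figures together and the arithmetic of the trade becomes clear. A back-of-envelope sketch (illustrative only; a real hedged-yield calculation involves forward points, tenor matching and the yen money-market leg):

```python
us_10y_yield = 1.58    # % 10-year Treasury yield quoted above
usd_libor_3m = 0.79    # % cost of borrowing the dollars
yen_basis    = -0.6425 # pp basis paid by yen-based investors

# The yen investor earns the Treasury yield but pays Libor on the
# borrowed dollars and, with a negative basis, pays on the yen leg too.
hedged_yield = us_10y_yield - usd_libor_3m + yen_basis
print("approx. hedged 10-year yield: {:.2f}%".format(hedged_yield))
# -> roughly 0.15%; at July's record-low 1.318% Treasury yield the same
#    sum turns negative, matching the story above.
```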

In a perfectly efficient market, none of this would matter. Differences in interest rates would be perfectly offset by the cost of exchanging two different currencies over time. But in the real world, things are far messier.

As unconventional monetary policies in Japan and Europe pushed yields lower and lower in recent years, demand for dollars has soared in tandem with the currency’s appreciation. Banks responded by demanding stiffer terms to swap into dollars as supply diminished, cutting into profits on the “carry trade” in Treasuries.

Treasuries will remain a better alternative for many overseas investors as long as an advantage exists, no matter how small the hedged yield pickup has become, according to Ralph Axel, a bond analyst at Bank of America Corp.

“They’ll just keep buying,” Axel said. Because of forces like negative rates and quantitative easing outside the U.S., “you clearly have a long-lasting bid.”

Of course, there’s the flip side. The overwhelming demand for U.S. currency is proving to be a boon for American investors and foreign central banks sitting on billions of dollars. Pacific Investment Management Co. also says there’s profit to be made by getting paid to swap dollars into yen.

Interest-Rate Swaps

Overseas money managers, though, have had to turn to more novel solutions to avoid the onerous hedging costs. Jack Loudoun, who helps oversee about $88 billion for Vontobel Asset Management in Zurich, says he prefers interest-rate swaps and futures on Treasuries to get exposure to the U.S. market because lower upfront costs help reduce foreign-exchange risk.

“We’re using derivatives to get access,” he said. “If you’re worried about hedging cost, swaps and futures are the avenues to go down.”

Whatever the strategy, there’s little debate over how important foreign demand is for the $13.4 trillion market for Treasuries.

“We’re at a point now where investors have to start thinking about this,” said Sachin Gupta, a foreign-bond fund manager at Pimco, which oversees $1.51 trillion. “As the cost of hedging rises to such an extent, there’s no extra carry to be had. That itself will slow down the demand — and, at some point, even reverse the demand — for Treasuries.”







10 big investment ideas for 2016


It’s time to fire up the interneuronal connections and carve out 10 big ideas for 2016.

Asian nation

My first offering is that Australia will likely become an Asian nation in its ethnic orientation. Apologies to the xenophobes, but it’s happening under your nose. An incredible 28 per cent of Australia’s population (or 6.6 million people) were born overseas – the highest in 120 years. During the last census a remarkable 12 per cent of Australians said they had Asian ancestry.

In Sydney and Melbourne, 19 per cent and 18 per cent, respectively, of residents are Asian. In Sydney regions like Parramatta and Ryde, the Asian share of the population is as high as 34 per cent and 33 per cent, respectively. China and India have overtaken the UK as Australia’s biggest source of new migrants, collectively accounting for 35 per cent of the intake in 2013-14.

The idea of Australia stealthily yet ineluctably becoming an Asian nation is a big deal: it will reinforce our unique antipodal trading position and powerful role as a politically stable economic conduit between east and west; it will help improve our cultural commonalities with major regional actors like China, India and Indonesia (mitigating geopolitical hazards); and it should serve as a source of innovation, productivity and growth, just as the influx of ambitious European migrants did after World War II.


Bank returns on equity will fall

Idea number two is that the major banks’ returns on equity (RoEs) are inevitably going to fall from around 15 per cent towards their 11 per cent cost of equity as a result of the banking system becoming a highly competitive and level playing field. While this process may take five years or more, it should mean that rather than trading at an unusually high two times book value, the majors will price at circa one times. If I’m right, there is much downside to current valuations, which is a proposition reinforced by analysts’ crazy forecasts that bad and doubtful debt charges will stay around 30-year lows.
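The link between returns on equity and book-value multiples can be made explicit with the standard steady-state (Gordon-growth) valuation identity; a sketch, where the 5 per cent growth assumption is mine for illustration:

```latex
% P/B as a function of RoE, cost of equity (CoE) and long-run growth g:
\[
\frac{P}{B} = \frac{\mathrm{RoE} - g}{\mathrm{CoE} - g},
\qquad
\mathrm{RoE} \to \mathrm{CoE} \;\Rightarrow\; \frac{P}{B} \to 1.
\]
% With RoE = 15%, CoE = 11%, g = 5%: P/B = (0.15-0.05)/(0.11-0.05), about 1.7x.
% If RoE falls to the 11% cost of equity, the fair multiple is 1.0x book.
```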

In five years the majors will have ceded the competitive advantages that fuelled their world-beating RoEs. Rather than carrying 25 per cent more leverage than rivals, they will end up having less leverage and more equity capital in the funding mix. Combined with the fact that smaller banks tend not to source as much funding in the dearer wholesale bond markets – underwriting assets with cheaper deposits that are now a government-guaranteed (and more stable) funding source – I believe the majors will wind up having more expensive funding costs. In short, we will migrate to a system where the majors are much safer banks with reduced risks of failure, with the trade-off of lower returns on equity than competitors that have loftier leverage and lower funding costs. There should, therefore, be an economic role reversal between the big four and their rivals.

Another Macquarie Bank?

If the majors are going to become slow-moving, yet bullet-proof, utilities, a third idea is investors should look for superior returns from more fleet-footed alternatives that are not saddled with the financial baggage of being too-big-to-fail. One day we will eventually see another Nicholas Moore who creates a new Macquarie Bank with a much skinnier 50 per cent dividend payout ratio (compared to the majors’ 80 per cent pay-out policies) that retains earnings to support investments in innovative and entrepreneurial opportunities. Macquarie has done a fabulous job of continuously reinventing itself to maintain growth and studiously avoided allocating too much capital to competing in the majors’ commodity markets.

On this note, the majors will likely lose significant market share in home loans to regional banks that for the first time will be able to compete effectively with them on price without crushing their returns. Rather than being price setters, the majors will become price takers and have to give back the recent rate hikes they have foisted on borrowers to compensate for expanding equity funding costs or suffer market share losses. This will compel them back into the less contested business lending space, which will lubricate credit to companies. Indeed, I think the majors’ balance-sheet splits between residential and business loans will revert back to the 40:60 levels before the 1991 recession.

Our forecasts for double-digit house price growth in 2013 and 2014, and high single-digit growth in 2015, were spot on. My fourth idea is that there will be no imminent housing collapse, and the price of our bricks and mortar will again climb in 2016, albeit at a much slower pace of around 1 to 2 times income growth. I maintain the view that the market is very expensive (15 to 25 per cent above fair value) and recently sold my own home. The interest rate hikes that will be the catalyst for a sustained Aussie housing correction appear to have been shunted into the distant future.

A fifth idea is that as the US and UK jobless rates (5 and 5.3 per cent respectively) fall towards 3 per cent in 2016 and 2017, wage and consumer price inflation will gradually reanimate. While the Fed will hike in December, central banks will get behind the curve because of their desire to “look through” this reflation.

Fixed-rate bond prices to plummet

This prompts another idea, which is that fixed-rate bond prices will melt as long-term yields rise on the back of financial markets resisting the Fed’s dovish view of the world and acknowledging stubbornly strong inflation data. The existential moment for global central banks will arrive when the break-even inflation rates priced by the bond market begin breaching official inflation targets in a sign that investors no longer think that monetary policy (and so-called nominal growth targeting) is compatible with price stability. Asset allocators need to be short interest rate duration or, if you have to be exposed to this risk, hire a smart duration manager – they can be hard to find. Few people can consistently call rate changes right.

If this base case plays out, my seventh thought is that global equities will face tremendous headwinds as long-term risk-free rates (that is, government bond yields) mean-revert back to some semblance of normality, which means yields 50 per cent to 100 per cent higher than current marks. Recall that the 10-year government bond yield is an essential input as the underpinning for the discount rates in the valuation models for all listed and unlisted equity and real estate markets.

Sell ‘beta’ buy ‘alpha’

This insight furnishes an eighth idea, which is sell equity “beta” and buy “alpha”, as I advocated last year. Aussie shares (beta) have declined over the year to date while market-neutral and long-short hedge funds (alpha) have delivered terrific returns (at least the guys that I invest with have). This dynamic is unlikely to change.

In the more immediate term (over the next, say, one to two years), I like “spread” assets as the search for yield will remain a critical influence over investor behaviour as long as deposits do not offer any material “real” returns above inflation and equities continue to get hammered. One example is major bank subordinated bonds, which currently trade very cheaply on a global basis despite the majors being among the best capitalised banks globally, courtesy of $33 billion of equity origination over the last 12 months. The credit ratings on major banks’ subordinated debt are on par with the senior bonds issued by Goldman Sachs, Morgan Stanley or Citigroup, and I think there is a decent chance they will get upgraded to the “A” band next year if Standard & Poor’s lifts the majors’ stand-alone credit profiles from “a” to “a+”, as it has signalled it may do.

A final thought is that if the world is once again forced to choose between elevated interest rates and high and volatile inflation, there is a possibility that the value of paper money will atrophy as a credible medium of exchange. This could precipitate a flight to safety in the form of a resurgence in the demand for gold as a hedge against the debasement of money by governments using the printing press to finance their own deficits.





Big Data Event London Conference – Exhibition


KNOWLEDGE IS POWER. ARM YOURSELF TO EXCEL IN THE BRAVE NEW DATA-DRIVEN WORLD.

Big Data London will host leading, global data and analytics experts, ready to arm you with the tools to deliver your most effective data-driven strategy.

The conference and exhibition are open to all and free to attend. Discuss the big questions and share ideas with forward-thinking peers and leading members of the data community. Be in the vanguard of the data revolution: sign up to Big Data London and learn how to build a bright future for your business.

Don’t Miss the UK’s Largest Data & Analytics Event

Stephen Russell

EMEA Practice Lead, Information Governance

After writing a thesis on ‘Managing Tacit Knowledge in Business’ for his Master’s degree at the University of London in 2000, Steve has continued to work with enterprises across many disciplines and geographies to better manage risk and value in information and data. His career since then balances both large corporates and small consultancies. Steve remains passionate about the use of technology and teams to enable better business, recognizing that data is the currency which provides both value and risk to all businesses. For Veritas, Steve leads the EMEA Consultancy Advisory Practice, which develops and delivers the go-to-market services around our technology solutions. Steve is based with his family near London and has travelled and worked widely across EMEA and worldwide.

Andy Meikle

Chief People Officer

Andy has been the Chief People Officer of JustGiving for 4 years. JustGiving is the world’s largest platform for giving, where people connect with causes they care about. Prior to that he worked in Stockholm as HR Director for Cint, a Swedish SaaS business. Andy comes from a background of psychology, holding a PhD in Experimental Psychology and having worked as an Intelligence Analyst for the Serious and Organised Crime Agency. Andy is passionate about developing company cultures and unlocking employee, team and organisational potential.

Shirley Wills

BI Business Analyst at Retail Assist

Shirley has worked in the IT and finance industries for over 25 years. She is a PRINCE2 Practitioner and certified accountant with a wealth of experience in financial accounting/auditing, business intelligence, business risk/regulatory compliance and ERP/CRM. Shirley is currently BI Business Analyst at Retail Assist, supporting Karen Millen, COAST, Oasis and Warehouse.

Carl Wiper

Group Manager, Policy Delivery

Carl Wiper has worked at the Information Commissioner’s Office since 2010. He is currently a Group Manager in the Policy Delivery department. He is working on a number of policy issues in relation to data protection and freedom of information, including big data, profiling and outsourcing. He worked on the ICO’s discussion paper on Big data and data protection. Before joining the ICO, he was an information manager in local government, and his career has been spent working in information management and research in organisations in the public, private and third sectors.

Thomas Breach

Talent Acquisition Manager, Technology & Product

As Talent Acquisition Manager for Technology & Product, Thomas Breach is supporting a significant digital transformation at Elsevier. The company is harnessing big data, semantic web and cloud technologies to enhance the performance of global science and health professionals. With a focus on talent identification for London and Amsterdam, he supports a team across North America and Europe seeking to bring more than 300 new technologists on board in the next year.

One of a new breed of next-generation IT leaders, Omid Shiraji is currently the interim CIO at Camden Council, previously holding senior technology leadership roles in the private and third sectors as CIO for Working Links and Director of Service for City University London.

Paul Fisher is a Research Director with PAC UK. His research and analysis focuses on cyber security technologies and the market trends that affect the client and supplier communities. He works with global suppliers and the CIO and CISO communities at leading organisations, advising them on cyber strategies and market intelligence. Paul is an accomplished speaker and a regular host at industry events.

Guy Cohen

Strategic Relationships Manager

In addition to his work at Privitar developing research, policy and strategy, Guy is also a junior fellow at the University of Cambridge, Centre for Science and Policy. Before joining Privitar, Guy worked in various roles in the Civil Service; in the Cabinet Office, the Department of Health and HMRC.

Jeff Rothwell

Jeff Rothwell is a Sales Engineer at Cloudera, where he helps customers architect Big Data solutions that reveal insight from all types of data and all types of systems. He has a background in Business Intelligence and Big Data Analytics and over 20 years’ experience in the software industry across a diverse range of vertical markets.

Olivier is EMEA Solutions Lead at Trifacta. He has seven years’ experience in analytics, with prior roles as technical lead for business analytics at Splunk and as a quantitative analyst at Accenture and Aon.

Georges Gavelle, VP of Sales, EMEA, has more than 21 years of software industry experience. He specialises in the areas of Business Intelligence, Financial Services, Corporate Governance and Life Sciences. Prior to joining Datawatch, Georges worked for IBM, Cognos and Siebel.

VP Professional Services, EMEA

Martin Hapl has made a career out of bringing new technologies to the market. After starting out at Systinet, a company that helped to establish the idea of Web Services, he led SOA practice for Mercury Interactive and HP Software. In his current role as VP of Services (EMEA region) for GoodData, he is focused on helping customers build, implement, and succeed with Data Products that change the way that they do business.

Laurie Maclachlan

Regional Sales Director NEMEA

Laurie Maclachlan is the Regional Sales Director for Northern Europe and has an extensive 18-year career in IT, gained initially at Big Five management consultancies and more recently within disruptive high-tech companies. Laurie leads a team of 8 Enterprise Account Managers in the UK, Nordics and Benelux countries.

Steven is the COO of wejo, an innovator in connected car data and technologies. He runs all technology, data and analytics as well as the day-to-day company portfolio plans and operations. Prior to that Steven was CDO at online betting giant Betsson AB and previously at MoneySuperMarket as GM of Data.

Kostas Tzoumas is co-founder and CEO of data Artisans, the company founded by the original creators of Apache Flink. Kostas is PMC member of Apache Flink and earned a PhD in Computer Science from Aalborg University with postdoctoral experience at TU Berlin. He is author of a number of technical papers and blog articles on stream processing and other data science topics.

Ted Orme

EMEA VP of Technology

Ted is EMEA VP of Technology at Attunity and is responsible for Attunity’s technology alliances in EMEA including Cloudera, Hortonworks, HP, IBM, MapR, Microsoft and Oracle. Ted also plays a key role in shaping thought leadership at Attunity and presents regularly at leading industry events. Ted has been in the industry for over sixteen years and has a BSc in Economics from the University of Kent.

Ian Massingham

Chief Evangelist, EMEA

Ian Massingham is a Technical Evangelist at Amazon Web Services and has been working with cloud computing technologies since 2008. In this role he works to increase awareness of AWS cloud services and works with customers of all sizes, from start-ups to large enterprises, to help them benefit from the adoption of AWS.

With a background in software engineering and product management and a passion for the intersection between business and technology, Jeremy has been involved in the big data space, both in the utilities industry developing predictive models for customer activity and more recently in financial services with Valo, a real-time streaming analytics engine.

Seminar Schedule

Opening Keynote: The largest Big Data project in the Universe

When Big Data becomes Super Data – The largest Big Data project in the Universe. Bojan will discuss his role in the Square Kilometre Array Telescope, which will ingest 1 terabyte of data per second. You’ll learn how this team’s pioneering work is defining the future of Data Engineering.

For the SKA Telescope Project

4th November 2016

Opening Keynote: Data Culture

Brought to you by Jonathan Woodward of Microsoft and Gary Richardson of KPMG. How do you become a data-driven organisation? The answer surely lies in building a corporate data culture, breaking down silos and data fiefdoms, sweeping aside outdated thinking. Jonathan and Gary explain how Silicon Valley-style digital transformation is going mainstream and what this means for your use of data.







BIG RED DOG Named to the Fast 50 by Austin Business Journal

August 31, 2016 by Will Schnier, P.E.

BIG RED DOG Engineering and Consulting was again named one of the 50 Fastest Growing Private Companies in central Texas by Austin Business Journal for the 2015 fiscal year. This award is a testament to our amazing clients and team members.

To qualify, companies must have experienced dramatic revenue growth during the past three years. Financial data is submitted by the companies and verified by a third party, and the Journal then ranks the top 50 according to compounded revenue growth.

We were also honored with the same award in 2013 and 2014.

Read more on the Austin Business Journal website.

Related Posts

  • BIG RED DOG Named to ABJ’s Fast 50!
  • BIG RED DOG Enters MEP Engineering Business, Acquires Johnson Consulting Engineers
  • 2014 San Antonio Business Journal Best in Commercial Real Estate Awards
  • Amy Hageman Wins SMPS Member of the Year
  • BIG RED DOG Celebrates our First 5 Years
  • Award Winning BIG RED Blog Award From www.civilengineeringschools.org

Written by Will Schnier P.E.

Will Schnier is the Chief Executive Officer of BIG RED DOG Engineering | Consulting. Will received his BSCE from Purdue University and co-founded BIG RED DOG Engineering and Consulting in 2009. Since starting the firm in 2009, BIG RED DOG has grown to over 100 team members with offices in Austin, Dallas, Houston, and San Antonio. BIG RED DOG has garnered awards for being one of the 50 fastest growing companies in Texas (the Business Journal’s Fast 50 in 2012, 2013, 2014, 2015) and an ENR top 100 Design Firm in Texas and Louisiana (2012, 2013, 2014, 2015). Mr. Schnier is very well versed in the project review and development permitting process, having worked closely and very successfully with city and county review staff, neighborhood associations, environmental groups, and public boards and councils. He has been responsible for the project management, engineering design, and regulatory permitting of hundreds of single family subdivision projects, mixed use and multifamily residential developments, industrial facilities and oil and gas development projects throughout Texas. He is the author of two publications: “Land Subdivision – A Practical Guide for Central Texas” and “The Book on License Agreements in the City of Austin”. Will was appointed to the Board of Directors of the Real Estate Council of Austin (RECA) in 2014 and served as Mayor Lee Leffingwell’s appointment to the City of Austin Zoning Board of Adjustment from 2011 to 2015.






8 FREE Ways to Send Large Files Online

Gmail, Yahoo, MSN and other email providers do not allow users to send large files by email. The maximum limit in all of these email services is 20 MB, which isn’t a lot if you want to send movies, videos, or other large files. The following is a list of free ways to send large files online.

1. How to Send Multiple Large Files at Once

File Factory lets you upload files of up to 300 MB each. The convenient thing about File Factory is that you can upload multiple files (up to 25) at once. You can upload files from your computer or from a remote server. The premium account lets you upload files from FTP or using a torrent file.

2. How to Send Large Files Without Even Registering an Account

YouSendIt is another file sharing service that has become popular in recent times. It lets you upload and share files of up to 100 MB each. The site contains an email form, so you can send an email containing your file straight from the home page; your uploaded file is available for download for only seven days.

3. How to Send Large Files via Firefox Extension

Another file sharing service is Drop.io, which lets you upload files of up to 100 MB each. The files uploaded can be downloaded any number of times, and you can also password protect your files to retain privacy. If you are a Firefox fan you can use the Drop.io Firefox add-on.

4. How to Send and Password Protect Large Files for Security

TransferBigFiles lets you send large files of up to 1 GB each by email. The files can be password protected, and you can add a personal note to the email you send to the recipients. When your friends download the file, you will receive email notifications, which is a handy touch. You can send the uploaded file to multiple email recipients with a single click.

5. How to Send up to 5GB Files for Free

The service in this list that allows the biggest files is File Dropper. This site lets you send files as large as 5 GB. You can upload movies, videos, executables, photos and possibly any type of file with File Dropper. The only downside is that you cannot upload multiple files at once.

After the upload is complete, you are given a personal link to the uploaded file. Just copy that link and email it to your friends so that they can download the file. No sign-up or registration is required with File Dropper, though signing up has its benefits: you will be able to keep track of your files, add password protection, mark files as private and delete files later on.
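Under the hood, all of these sites follow the same upload-then-share pattern. A sketch of that pattern with Python’s requests library (the endpoint URL and response fields are hypothetical, for illustration only):

```python
import requests

with open("holiday_video.mp4", "rb") as f:
    resp = requests.post(
        "https://example-file-host.com/api/upload",  # hypothetical endpoint
        files={"file": f},
    )
resp.raise_for_status()

# The service responds with a personal download link to email around.
print("share this link:", resp.json()["download_url"])
```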

6. How to Email Large Files Without Having to Upload Them Anywhere

Tonsho is an interesting twist on the service. Rather than uploading files yourself, you can use a normal email client such as Microsoft Outlook or Thunderbird and send your email normally. Your email will then pass through Tonsho’s servers, and if the attachments are too large, it will automatically upload them to a server and provide the recipient with a link.

While this takes a lot of hassle away from the whole process, you have to be wary of email security. I’m not sure I’m comfortable passing my email through anyone else’s server.

7. How to Send Large Files Using a Desktop Client

Dropsend is a free service which allows you to send files up to 2 GB in size. While most of these web-upload interfaces work well, for me, nothing beats a desktop client. It just feels more robust, and it’s certainly more comforting to use when you’re spending a few hours sending a very large file.

8. How to Send Large Files and Tweet them from your Phone or Desktop

FileSocial is one of the easiest ways to share files on Twitter. Just attach your file and a link to it will be posted. Sharesend is another alternative, but I’m wary about its non-usage of OAuth for Twitter, so I’d recommend FileSocial if you want to send and tweet securely.

This post was written by Abhishek Palit, who also writes other useful tips at his blog Softtricks, and James Yeang. Do you want to write for Friedbeef’s Tech?

Which services do you prefer to send large files? Do let us know your views and suggestions through a comment.

This entry was posted on Friday, January 1st, 2010 at 3:41 pm and is filed under Productivity. You can follow any responses to this entry through the RSS 2.0 feed. You can leave a response or trackback from your own site.






Big Bucks Bracket Racing

