I just finished reading a good summary of 10 powerful free or low-cost analytics tools. Of the 10 tools mentioned, two (Excel and Tableau) are probably common in corporate environments, while the other eight are more specialized. I wonder how successful anyone has been at introducing a new specialized analytics tool into an environment where Excel is the lingua franca?
I’ve personally run into this situation a few times. Recently I ran a Monte Carlo simulation in R to generate a simple P&L forecast for a business proposal. I followed good practices by putting my code into R Markdown and generating a PDF with the code, assumptions, variables, and output. Then I had to share some of the conclusions with a colleague who was building pricing models in a spreadsheet; knowing that introducing R on short notice wouldn’t go over well, I just copied the key results into Excel and sent it off.
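For readers who haven’t tried this, the Monte Carlo part is only a few lines of R. This is a minimal sketch of the general idea, not my actual model — the distributions and numbers here are made-up illustrative assumptions:

```r
# Minimal Monte Carlo P&L sketch (illustrative assumptions, not the original model)
set.seed(42)
n <- 10000                                  # number of simulated scenarios
revenue <- rnorm(n, mean = 500, sd = 75)    # assumed revenue distribution ($k)
costs   <- rnorm(n, mean = 350, sd = 40)    # assumed cost distribution ($k)
pnl     <- revenue - costs                  # simulated P&L per scenario
quantile(pnl, c(0.05, 0.5, 0.95))           # downside, median, and upside cases
```

The quantiles of the simulated P&L are exactly the kind of summary numbers that end up being copied by hand into a spreadsheet.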
Similarly, a few months ago I was working on a group project to analyze some public data. I used R to understand the data and perform the analysis, but then had to reproduce the final analysis in Excel to share it with the team. That seems wasteful, but I couldn’t see another way to do it. It would have taken me quite a long time to do all the exploratory work in Excel (I’ve not yet figured out how to create several charts in a loop in Excel), and Excel just doesn’t have the same kinds of tools (like principal component analysis for dimensionality reduction).
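Both of those tasks — charting in a loop and running PCA — are one-liners in R. A rough sketch using base graphics and the built-in `mtcars` data (standing in for the public dataset, which I’m not reproducing here):

```r
# One chart per variable, generated in a loop (sketch on built-in data)
df <- as.data.frame(scale(mtcars))          # standardise the example data
for (col in names(df)) {
  pdf(paste0("hist_", col, ".pdf"))         # one file per variable
  hist(df[[col]], main = col, xlab = col)
  dev.off()
}

# Principal components for dimension reduction
pca <- prcomp(df)
summary(pca)$importance[2, 1:3]             # proportion of variance in PC1-PC3
```

In Excel the loop alone would mean building each chart by hand, and there is no built-in PCA at all.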
R does have some capabilities to talk to Excel, but these don’t seem particularly easy to use, and the usual advice is to fall back on CSV as an interchange format, which has obvious limitations, including the loss of formulas and formatting.
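The closest I’ve come is writing .xlsx files directly instead of CSV. A sketch, assuming the CRAN packages writexl and readxl are installed (the code guards for their absence); note that even this only round-trips values and column types, not formulas or formatting:

```r
# Writing results straight to .xlsx instead of CSV (sketch; the numbers are made up)
results <- data.frame(scenario = c("low", "base", "high"),
                      pnl = c(12.5, 48.0, 91.3))

if (requireNamespace("writexl", quietly = TRUE)) {
  writexl::write_xlsx(results, "forecast.xlsx")   # preserves column types
}
if (requireNamespace("readxl", quietly = TRUE)) {
  back <- readxl::read_excel("forecast.xlsx")     # read it back into R
}
```

That removes one manual copy step, but it still treats Excel as a dumb container rather than a live model, so the fundamental translation problem remains.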
So I’m stumped. As long as Excel remains the standard interchange format, I guess I’ll just have to keep doing some manual data translation. Has anyone solved this problem in a more elegant way?