[R] Application design.
538280 at gmail.com
Tue Jul 22 20:29:30 CEST 2014
Here are two possibilities to consider.
The shiny package provides a web-browser interface to R. It can run
from a server, but it can also run on a single computer. You could
set this up as you suggest: the user double-clicks an icon on the
desktop, which runs R, loads the package, and opens the browser. The
user can then make some selections in the browser (subset the data,
choose cutoffs, etc.), and the resulting graphs and summaries will
show up in the browser window. I don't know how hard it is to copy
and paste from the browser into PowerPoint, but viewing will be
nicely automatic (and possibly just printing or saving the resulting
web page may be enough to pass up the line).
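A minimal single-file sketch of that idea (the data frame, column
names, and controls here are placeholders, not from the actual setup):

```r
library(shiny)
library(ggplot2)

# Placeholder data; in practice this would come from the database query.
dat <- data.frame(x = rnorm(100), group = sample(c("A", "B"), 100, replace = TRUE))

ui <- fluidPage(
  selectInput("grp", "Group:", choices = c("A", "B")),
  sliderInput("cut", "Cutoff:", min = -3, max = 3, value = 0),
  plotOutput("hist")
)

server <- function(input, output) {
  output$hist <- renderPlot({
    # Subset by the user's selections, then plot.
    sub <- dat[dat$group == input$grp & dat$x > input$cut, ]
    ggplot(sub, aes(x)) + geom_histogram(bins = 20)
  })
}

# shinyApp(ui, server)  # launching this opens the app in the browser
```

The launch line is commented out so the script can be sourced without
blocking; the desktop icon would run a script that calls it.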
The knitr package, along with the pandoc program, provides a system
for creating a template file that is then processed to produce the
final result. Again, you could set up a desktop icon that runs R and
processes the template file. The template file would contain the code
for querying the data and producing the graphs and summaries, along
with markup describing how to display them. I don't think knitr/pandoc
currently has a way to create PowerPoint slides directly, but it does
have tools for creating PDF- or HTML-based slides (I make the PDF ones
quite often, yes, on Windows), or you can produce a Word document as
the output, which can then easily be copied and pasted into PowerPoint
(I do this as well when clients want to do the copy/paste rather than
having me create a single PDF report).
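As a sketch, the template could look like the file written below (the
title, query placeholder, and output format are illustrative only):

```r
# Write a minimal R Markdown template; its contents are illustrative,
# not taken from any real report.
fence <- strrep("`", 3)  # builds the chunk fence without literal backticks
template <- c(
  "---",
  'title: "Weekly Summary"',
  "output: word_document",
  "---",
  "",
  paste0(fence, "{r}"),
  "dat <- data.frame(x = 1:10, y = (1:10)^2)  # placeholder for the DB query",
  'plot(dat$x, dat$y, type = "b")',
  "summary(dat)",
  fence
)
writeLines(template, "report.Rmd")

# rmarkdown::render("report.Rmd")  # needs pandoc installed; would
#                                  # produce report.docx for copy/paste
```

The render call is commented out since it requires pandoc; the desktop
icon would run a script containing it.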
Also, if you have not already, read the help page ?Startup; it gives
details on what R does as it starts and describes the options for
automatically running code when R starts.
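For example, a .Rprofile file in the working directory is sourced at
startup (see ?Startup for the full search order); its contents here
are a hypothetical illustration:

```r
# .Rprofile -- run automatically when R starts in this directory.
# .First() is called after the base packages are loaded.
.First <- function() {
  message("Loading report packages...")
  # library(RODBC); library(ggplot2)  # packages the report would need
}
```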
On Mon, Jul 21, 2014 at 8:24 PM, John McKown
<john.archie.mckown at gmail.com> wrote:
> I'm designing an R based application for my boss. It's not much, but
> it might save him some time. What it will be doing is reading data
> from an MS-SQL database and creating a number of graphs. At present,
> he must log into one server to run a vendor application to display the
> data in a grid. He then cuts this data and pastes it into an Excel
> spreadsheet, where he generates some graphs. He then cuts and pastes
> those graphs into a PowerPoint presentation, which is the end result
> for distribution to others up the food chain.
> What I would like to do is read the MS-SQL database using RODBC and
> create the graphs using ggplot2 instead of using Excel. I may end up
> being told to create an Excel file as well.
> My real question is organizing the R programs to do this. Basically
> what I was thinking of was a "master" program. It does the ODBC work
> and fetches the data into one, or more, data.frames. I was then
> thinking that it would be better to have separate source files for
> each graph produced. I would use the source() function in the "master"
> R program to load & execute each one in order. Is this a decent
> organization? Or would it be better for each "create a graph" R file
> to just define a unique function which the "master" program would
> then invoke? I guess the latter would be a good way to keep the
> workspace "clean", since all the variables in the function would "go
> away" when the function ended.
> I guess what I'm asking is how others organize their R applications. Oh,
> I plan for this to be run by my boss by double clicking on the
> "master" R source file, which I will associate with the Rscript
> program in Windows. Yes, this is Windows based <sigh/>.
> Appreciate your thoughts. Especially if I'm really off track.
> There is nothing more pleasant than traveling and meeting new people!
> Genghis Khan
> Maranatha! <><
> John McKown
> R-help at r-project.org mailing list
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
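On the organization question above, a minimal sketch of the
function-per-file pattern (the file name, function name, and
placeholder data are illustrative, not from the actual application):

```r
# Each "create a graph" file defines a single function and nothing
# else; here we write one such file to show the pattern.
writeLines(c(
  "graph_scatter <- function(dat) {",
  '  plot(dat$x, dat$y, main = "Scatter")',
  "}"
), "graph_scatter.R")

# master script: fetch the data once, then source and call each
# graph function so temporary variables vanish when it returns.
source("graph_scatter.R")   # defines graph_scatter() only
dat <- data.frame(x = 1:10, y = (1:10)^2)  # placeholder for the RODBC query
pdf("graphs.pdf")
graph_scatter(dat)
dev.off()
```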
Gregory (Greg) L. Snow Ph.D.
538280 at gmail.com