AFHood Group Blog: The thoughtless yammerings of marketing junkies.


Exporting without double quotes

One of our junior programmers shot this one over after fighting with Proc Export for some time, trying to remove the quotes from a string.

His need: export a single column of data without headers or quotes.

Here's how he did it, quick and easy:


data _null_;
  set inputdataset;        /* the dataset to export */
  file "outputfilename";   /* the destination text file */
  put outputcolumnname;    /* writes the raw value: no header, no quotes */
run;
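If more than one column ever needs to go out the same way, the approach extends with a delimiter on the FILE statement. A quick sketch, with made-up dataset and column names:

```
data _null_;
  set inputdataset;                 /* same made-up dataset name as above */
  file "outputfilename" dlm=',';    /* comma between values, still no quotes */
  put column_one column_two;        /* hypothetical column names */
run;
```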


Sometimes the simple solutions are right in front of us.


SQL tip – Inner join shorthand with USING

We write a LOT of SQL here and although SQL is a powerful database language, it can be tedious. So here is one tip for shortening all that typing.

Typical join syntax:

select some_columns
from one_table
join another_table
  on one_table.column_1 = another_table.column_1
 and one_table.column_2 = another_table.column_2
where some_column > someother_column;

Not too bad, right?

For this tip to work, you must be joining on columns with the same name (i.e., column_1 and column_2 have the same name in both tables).

Here it is:

select some_columns
from one_table
join another_table using (column_1, column_2)
where some_column > someother_column;

Now, in our example it's not a drastic difference in coding, but as any programmer knows, this shorthand example doesn't accurately represent the real world. So give this syntax a try on your next project and let us know if it doesn't save you some typing.


SAS / Teradata Fastexport – dbsliceparm = all

Fastexport is the fastest way to get large amounts of data out of Teradata. It utilizes multiple connections to deliver data, speeding up the transfer between Teradata and SAS.

Here are a few examples of fastexport.

/* libname statement */

libname teradb teradata user=&un password=&pw dbsliceparm=all;

/* explicit sql */

proc sql;
  connect to teradata (user=&un password=&pw dbsliceparm=all);
  /* pass-through SQL goes here */
quit;
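Put together, a fuller pass-through sketch might look like the following; the server name and the table and column names are placeholders, not from a real system:

```
proc sql;
  connect to teradata (user=&un password=&pw server=myserv dbsliceparm=all);

  /* pull the result of the Teradata-side query into a SAS dataset */
  create table work.big_extract as
    select * from connection to teradata
      ( select customer_id, txn_dt, txn_amt
          from mydb.transactions );        /* hypothetical table */

  disconnect from teradata;
quit;
```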



How do you know if fastexport was used?

Use this option:

options sastrace=',,,d' sastraceloc=saslog;

If it is working, you should see something in your log like:

Select was processed with fastexport.

There are many other factors that come into play if fastexport doesn't work. Check the requirements on the SAS support page for troubleshooting.


4 tips for integrating CRM technology

If you have worked with marketing technologies for any length of time, you've encountered this problem: how can we integrate our CRM with the rest of our organization? How does CRM interact with our call center, website, applications, database, loyalty system, and so on?

We've been there.

Throughout our travels, we've picked up a few tips. Hopefully they'll help you.

  1. Tight integration is dangerous. Just don't do it. - Marketing technology changes faster than fashion. Today's latest and greatest is quickly replaced with tomorrow's new market maker.
  2. Integrate through marketing concepts, not technical application specs. - Abstract technologies through the use of marketing concepts like 'campaign', 'offer', 'contact history', 'registration', 'enrollment', etc. Define what these mean for the organization and build a data structure flexible enough for all applications to contribute to those concepts.
  3. Databases and services are the lifeline. - Long term success hinges on ease of use and flexibility. Simple services allow for plug and play scenarios. Well designed databases create environments that record interrelated transactions based on their role or concept. This makes business people happy. They can measure, analyze and predict without spending precious hours compiling, cleaning, and organizing data.
  4. Think replaceable. - Always remember that what you are adding to your technology stack could be replaced in a year and you will need to re-integrate something else. How much of your current work will be thrown away? Minimize the one-time development and maximize your efficiency and cost savings to the organization over the long haul.

We like to think we aren't integrating technology. We are simply making all of our applications play nicely together.

There is much more to this topic, but these 4 tips can minimize your stress for years to come.


SAS fastload to Teradata

NOTE - SAS changed the default settings for fastload in SAS9. This has not been an improvement. SAS has documented how to change the setting back to the previous default in a note on their support site.

The fastload facility for Teradata through SAS ACCESS is one that every analyst loading data into Teradata should be aware of. This tool allows you to pump data from SAS to Teradata in the quickest manner. The target table must be empty. There are other limitations as well, but most are uncommon problems.

To use the fastload option, follow the example below (SAS9 and above may require more manual settings to see max performance):

data td_lib.empty_table (bulkload=yes);
  set source_table;
run;


Another way...

proc append base=td_lib.empty_table (bulkload=yes bl_log=append_err_output)
            data=source_table;
run;


If your organization uses Teradata and could use some help utilizing the full power of SAS with Teradata, we are available to assist you.


Managing database connections with SAS

Here is one for the books.

When writing dynamic code or code that spawns other scripts, it may be important to think about how many database connections you are using. There are a couple of ways to ensure you are a nice consumer of database resources. One that doesn't require any management is to limit your connections with the connection=global option.

proc sql;
  connect to teradata (user=user1 pw=XXXX server=myserv database=mydb connection=global);

  /* some SQL statements here */

  disconnect from teradata;
quit;


If you use multiple pass-through statements, or pass-through and libname statements, that all have the same connection values and include the connection=global option, SAS will not open a new connection to the database on each call. However, if you are accessing a different database, schema, user, etc., SAS will need to make a new connection. There are other similar connection parameters worth exploring if this is important to you or your client.
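As a sketch of how the reuse works (the server, credentials, and table names below are placeholders): two pass-through steps with identical connection options and connection=global share a single database connection:

```
/* first step opens the shared global connection */
proc sql;
  connect to teradata (user=user1 pw=XXXX server=myserv database=mydb connection=global);
  create table work.first as
    select * from connection to teradata
      ( select id, amt from mydb.table_a );   /* hypothetical table */
quit;

/* identical options + connection=global: SAS reuses the open connection */
proc sql;
  connect to teradata (user=user1 pw=XXXX server=myserv database=mydb connection=global);
  create table work.second as
    select * from connection to teradata
      ( select id, amt from mydb.table_b );   /* hypothetical table */
  disconnect from teradata;                   /* closes the global connection */
quit;
```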

One worth knowing is DEFER=YES, which delays opening the connection until the first query actually needs it. This is another good one for limiting your database activity in large, long-running scripts.

We will also explore the world of checking for available database connections. This is important for large scripting efforts: it will prevent your code from dying simply because there are no connections available.