Frequently Asked Questions - Server Issues

There are several ways to test a DODS server. One of the simplest is to look at a dataset with any web browser; see our description of how to access DODS data with a browser.

You can also test a server with the 'geturl' tool, which should be in your $DODS_ROOT/bin directory. To use 'geturl', give it the -d, -a, or -D option and the URL of your dataset. For instance, to request the DDS of the example dataset used later in this FAQ:

        geturl -d http://dodsdev.gso.uri.edu/dods-test/nph-dods/data/nc/fnoc1.nc

Yet another way to test and debug a server is to connect directly to the HTTP server, bypassing *all* interpretation of the returned data stream by both the DAP and the libcurl (HTTP) library. Here's the procedure. Type the following, making the substitutions for the server/URL you want to debug:

  • telnet <host with server> 80 (assuming the server is running on port 80, substitute the correct port otherwise).
  • GET <URL fragment following the hostname> HTTP/1.1
  • Host: <hostname>
  • <blank line>

Following the blank line you will see the response from the server, including the response headers.

What's going on here? This technique uses telnet to talk directly to the HTTP daemon. The '80' tells telnet to use port 80 (the port normally used for HTTP). The first line instructs the HTTP daemon to GET the named document (the path given as the second argument is relative to the server's DocumentRoot) using HTTP/1.1. Following the GET line come one or more HTTP/1.1 headers, which supply additional information to the server; HTTP/1.1 requires only that the Host: header be present. There are other request headers you can use; consult the HTTP/1.1 specification (RFC 2616).

Here is an example. The matching URL is http://dodsdev.gso.uri.edu/dods-test/nph-dods/data/nc/fnoc1.nc.dds.

        [jimg@comet jimg]$ telnet dodsdev.gso.uri.edu 80
        Trying 198.116.10.229...
        Connected to dodsdev.gso.uri.edu.
        Escape character is '^]'.
        GET /dods-test/nph-dods/data/nc/fnoc1.nc.dds HTTP/1.1
        Host: dodsdev.gso.uri.edu

        HTTP/1.0 200 OK
        XDODS-Server: DAP/3.4.2
        Date: Thu, 08 Jul 2004 17:01:59 GMT
        Last-Modified: Mon, 15 Apr 2002 22:49:39 GMT
        Content-type: text/plain
        Content-Description: dods_dds

        Dataset {
            Int16 u[time_a = 16][lat = 17][lon = 21];
            Int16 v[time_a = 16][lat = 17][lon = 21];
            Float32 lat[lat = 17];
            Float32 lon[lon = 21];
            Float32 time[time = 16];
        } fnoc1.nc;
        Connection closed by foreign host.

A few things to think about if your server isn't working:

Append `.ver' to a DODS URL, or use `version' as the dataset name if you don't know any of the data file names. Either request returns an XML document that describes the server's software components and their versions, in addition to the DAP protocol versions supported by the server. (Older servers return a plain text document with less information.) For example:

        http://dodsdev.gso.uri.edu/dods-test/nph-dods/data/nc/fnoc1.nc.ver

OPeNDAP server software generally resides on the same computer as the data being served. The OPeNDAP server uses lower-level web server software (e.g., Apache or Tomcat) to make the actual connection to the Internet. Most web server software creates and maintains a log file of all incoming requests (e.g., HTTP requests), with some indication of the success or failure of each response. The log files are often in a standardized format but are individually configurable by the system administrator.

Most sites that are interested in the usage of their OPeNDAP server(s) have developed scripts they run, typically once per month, to derive "metrics" of the usage of their server. These "metrics" might include: number of hits, number of successful hits, number of repeat users, and volume of data delivered.
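As an illustration of what such a script might compute, here is a hypothetical Python sketch that derives a few of these metrics from Apache Common Log Format lines. (This is not an official OPeNDAP tool; adjust the parsing to match your server's actual log format.)

```python
import re
from collections import Counter

# Apache Common Log Format:
#   host ident authuser [date] "request" status bytes
CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def summarize(lines):
    """Return simple usage metrics from an iterable of CLF log lines."""
    hits = ok = volume = 0
    hosts = Counter()
    for line in lines:
        m = CLF.match(line)
        if not m:
            continue  # skip lines that don't parse
        hits += 1
        hosts[m.group("host")] += 1
        if m.group("status").startswith("2"):
            ok += 1
        if m.group("bytes") != "-":
            volume += int(m.group("bytes"))
    repeat = sum(1 for n in hosts.values() if n > 1)
    return {"hits": hits, "successful": ok,
            "repeat_users": repeat, "bytes_delivered": volume}
```

Run monthly over the access log, this yields the kinds of numbers mentioned above: total hits, successful hits, repeat users, and data volume delivered.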

A few details for Linux or Unix machines running an Apache web server: a typical place to look for the web server logs is /usr/local/apache/logs, but the actual location depends on how and where the server was installed. Check with your system administrator for the specifics on your machine.

You can find more information on Apache's server software at http://www.apache.org, including more information on log files.

There are also freeware log analysis packages available that could be useful. For instance, you can check out Analog logfile analysis at: http://www.analog.cx/

You are under no obligation to report your OPeNDAP server usage statistics to the OPeNDAP project. However, if you are willing to share this information, it will help us direct future software development efforts more effectively.

For the OPeNDAP C++ servers, you can enable the sharing of this information when you install your servers with the installServers script. When the install script asks about gathering access statistics, answer yes ("y") and provide the additional information requested. Your server's usage statistics will only be accessible by the local host (your machine) and the official OPeNDAP machine.

Your server has one or both of the following easy-to-fix problems: the cache directory is too small, or your server cannot find gzip. If the cache directory is too small, files will be purged from the cache too soon, causing a significant performance degradation. If your server cannot find gzip, it won't be able to decompress files before serving them.

When the OPeNDAP software is used to serve compressed files (e.g., files compressed with gzip), the files are first decompressed and then stored in a cache; the data served are read from those cached files. The cache directory is /usr/tmp by default; this can be changed by editing nph-dods and changing the value of $cache_dir.

By default the software limits the size of this directory to 50 MB. If you're serving large files, or are experiencing a large volume of traffic, you should increase this limit. To do so, edit the value of the second parameter to 'purge_cache()' on line 125 of nph-dods. The cache size is given in MB, so changing the 50 to 100 would increase the cache size from 50 MB to 100 MB.

Finally, the decompression software uses the gzip program to do its work. If your computer does not have gzip in its /bin directory, you'll need to edit DODS_Cache.pm so that the correct version of gzip is used. Look in that file for "/bin/gzip" and replace that text with the correct pathname. To find where gzip is on your computer, type 'which gzip' in a shell.

Here are a few suggestions for troubleshooting your FF server:

1) First, test that your format files are defined properly and that they fit your data files by running chkform on them. You can also check that your data is being interpreted as you intend by running newform. Both chkform and newform are FreeForm tools that come with the DODS FreeForm Server distributions. (They should be located in the bin/ directory of your distribution.)

So that newform can display your data, you will need an 'ASCII_output_data' section in your format description file. Like this:

ASCII_input_data "test"
Time 1 10 double 4
Test 12 33 enote 20

ASCII_output_data "test"
Time 1 10 double 4
Test 12 33 enote 16

2) If you are serving ASCII data, pay attention to the whitespace in your data files.

  • FreeForm will gag if whitespace extends beyond the line length determined by the format description. For instance, if the format descriptor fits
    • '34.523 1.45'
    FF will give an error on
    • '34.523 1.45 '
  • There needs to be whitespace filler if the data in a line doesn't cover the entire format. For instance, if the format descriptor fits this data
    • '34.456 234.456'
    FF will die on this data
    • '34.456 234.45'
    but will not die on
    • '34.456 234.45 '
  • The enote type doesn't seem to follow quite the same rules. For instance, if the format fits
    • '34.456 2.34e-2'
    FF will die on
    • '34.456 2.3e-2 '

As of September 2003

The Matlab OPeNDAP Server was written when Matlab 4 was current, and it supports all of the data types Matlab supported at the time. The server was never updated to handle the newer data types (e.g., structures and cell arrays) because there was no demand for that capability. So, when the current server encounters variables of the newer types in a file, it chokes.

The level of interest in adding these capabilities is currently unclear. If anyone is actively interested in these capabilities, please let us know at support@unidata.ucar.edu. The best way to move forward with this activity would be to find a champion for the Matlab server who can drive the development effort. We can provide them with as much help as they need, although experience with C++ would be required. If no one comes forward we can add it to our schedule but we're booked pretty tight for the next six months.

If you know of someone who might be interested in working to expand the capabilities of the Matlab server, please have them contact us at support@unidata.ucar.edu.

When I start my Aggregation Server, I get the following:

DEBUG: AggServer: CatalogServlet copy 
/usr/local/jakarta-tomcat/webapps/thredds/initialContent/dodsC/ to 
/usr/local/jakarta-tomcat-4.1.27-LE-jdk14/content/thredds/dodsC/ (04-04-19
10:57:08 )
DEBUG: AggServer: catalog config 
</usr/local/jakarta-tomcat-4.1.27-LE-jdk14/content/thredds/dodsC/catalogConfig.xml>
is not valid (04-04-19 10:57:09 )

From: Tony Jolibois <tjolibois at cls.fr>

The error didn't come from the catalog itself, but from the network configuration of my computer. The AS server's configuration catalog contains some HTTP URLs:

<!DOCTYPE catalog SYSTEM "http://www.unidata.ucar.edu/projects/THREDDS/xml/AggServerCatalog.dtd"> 
<catalog name="MERCATOR DODS Aggregation Server Catalog" version="0.6" 
      xmlns="http://www.unidata.ucar.edu/thredds" 
      xmlns:xlink="http://www.w3.org/1999/xlink"> 

My environment was this: I am behind a firewall, and my computer was not open to the Internet, so it could not connect to the two sites http://www.unidata.ucar.edu and http://www.w3.org. I tried local copies of AggServerCatalog.dtd and InvCatalog.0.6.dtd, but that didn't work.

After opening access to these two sites at the firewall, everything works fine.

Conclusion: if your computer cannot connect to these sites, you won't be able to run an Aggregation Server.

Thanks Tony for tracking this down and providing this FAQ!

The ASCII and HTML responses are generated by accessing the server using the URL (this is true for version 3.4 and may be true for some later versions; a future version of the server may use a different design). If you are serving data from behind a firewall that uses NAT for address translation, the DNS lookup for the host name can fail.

Greg Miller from the USGS tracked down this problem and came up with a solution:

I am behind a firewall that uses NAT, so if the server was relying on DNS to find the address, it would fail. I checked my hosts file and discovered that Red Hat maps the server name to the loopback address and not to the IP address of the ethernet interface. I corrected the hosts file, and everything works fine.
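For reference, the fix amounts to correcting the machine's entry in the hosts file (typically /etc/hosts). The hostname and addresses below are made up; the point is that the server's name must map to the real interface address, not to 127.0.0.1:

```
# Before: the machine's own name resolves to loopback
127.0.0.1    localhost myserver.example.gov myserver

# After: the name resolves to the ethernet interface's address
127.0.0.1    localhost
192.168.1.10 myserver.example.gov myserver
```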

Thanks Greg!