stuartsierra's clojure-hadoop at master - GitHub
Description: Library to aid writing Hadoop jobs in Clojure.

Public Clone URL (give this clone URL to anyone):
    git clone git://github.com/stuartsierra/clojure-hadoop.git

Your Clone URL (use this clone URL yourself):
    git clone git@github.com:stuartsierra/clojure-hadoop.git
name | age | message
---|---|---
.gitignore | Tue Jul 28 10:03:56 -0700 2009 | .gitignore: added lib, classes, & my REPL script [stuartsierra]
README.txt | Wed Jul 29 09:38:58 -0700 2009 | README.txt: added note that requires Java 6 [stuartsierra]
build.xml | Wed Jul 29 09:02:13 -0700 2009 | build.xml: added wordcount3 [stuartsierra]
epl-v10.html | Mon Jul 27 09:35:45 -0700 2009 | Initial commit; README.txt and license [stuartsierra]
examples/ | Wed Jul 29 10:03:37 -0700 2009 | wordcount3.clj: added documentation [stuartsierra]
src/ | Wed Jul 29 09:06:51 -0700 2009 | config.clj and job.clj: move replace output fn ... [stuartsierra]
test/ | Mon Jul 27 10:14:04 -0700 2009 | imports.clj: added convenience fns to import Ha... [stuartsierra]
README.txt
clojure-hadoop

A library to assist in writing Hadoop MapReduce jobs in Clojure.

by Stuart Sierra, https://stuartsierra.com/

For more information on Clojure, see https://clojure.org/; on Hadoop, see https://hadoop.apache.org/

Copyright (c) Stuart Sierra, 2009. All rights reserved. The use and distribution terms for this software are covered by the Eclipse Public License 1.0 (https://opensource.org/licenses/eclipse-1.0.php), which can be found in the file epl-v10.html at the root of this distribution. By using this software in any fashion, you are agreeing to be bound by the terms of this license. You must not remove this notice, or any other, from this software.

DEPENDENCIES

This library requires Java 6. In order to compile and use this library, you will need the following JAR files in your classpath:

1. clojure.jar
2. hadoop-0.18.3-core.jar
3. Dependent JARs included with Hadoop (such as commons-logging)

This code was developed with Hadoop MapReduce 0.18.3, although it should work with any later version. You can download the Hadoop distribution by visiting this web page:

    https://www.apache.org/dyn/closer.cgi/hadoop/core/hadoop-0.18.3/

and selecting a mirror close to you.

COMPILING (optional for layers 1-3, required for layer 4)

1. Create a "lib" directory (in the same directory as this README) and copy the JARs listed above into it.
2. Run "ant"

USING THE LIBRARY

This library provides different layers of abstraction away from the raw Hadoop API.

Layer 1: clojure-hadoop.imports
Provides convenience functions for importing the many classes and interfaces in the Hadoop API.

Layer 2: clojure-hadoop.gen
Provides gen-class macros to generate the multiple classes needed for a MapReduce job. See the file "examples/wordcount1.clj" for a demonstration of these macros.

Layer 3: clojure-hadoop.wrap
Provides wrapper functions that automatically convert between Hadoop Text objects and Clojure data structures.
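To give a feel for the "wrap" style, here is a hypothetical wordcount written as plain Clojure functions. The exact argument and return conventions are defined by clojure-hadoop.wrap in src/; this sketch assumes the wrappers hand the map function a plain string and expect a sequence of [key value] pairs back. The names wordcount-map and wordcount-reduce are illustrative, not part of the library.

```clojure
;; Hypothetical map/reduce functions in the "wrap" style: pure Clojure,
;; no Hadoop Text or OutputCollector objects in sight.  Assumes the
;; wrapper passes strings in and collects [key value] pairs out.
(defn wordcount-map
  "Split a line of text into [word 1] pairs."
  [_key line]
  (map (fn [word] [word 1])
       (re-seq #"\w+" line)))

(defn wordcount-reduce
  "Sum the counts collected for one word."
  [word counts]
  [[word (reduce + counts)]])
```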
See the file "examples/wordcount2.clj" for a demonstration of these wrappers.

Layer 4: clojure-hadoop.job
Provides a complete implementation of a Hadoop MapReduce job that can be dynamically configured to use any Clojure functions in the map and reduce phases. See the file "examples/wordcount3.clj" for a demonstration of this usage.
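For contrast, here is roughly what a mapper looks like when written by hand against the raw Hadoop 0.18 API, the boilerplate that the gen-class macros in clojure-hadoop.gen generate for you. This is a sketch, not code from this repository: the namespace and class names are invented, and it must be AOT-compiled with the Hadoop JARs on the classpath.

```clojure
;; A hand-written Hadoop 0.18 mapper in Clojure (hypothetical example).
;; clojure-hadoop.gen exists so you do not have to write this by hand.
(ns example.wordcount-mapper
  (:import (org.apache.hadoop.io Text LongWritable IntWritable)
           (org.apache.hadoop.mapred OutputCollector Reporter))
  (:gen-class
   :name example.WordCountMapper
   :extends org.apache.hadoop.mapred.MapReduceBase
   :implements [org.apache.hadoop.mapred.Mapper]))

(def one (IntWritable. 1))

;; Mapper.map receives Writable key/value objects and emits pairs
;; through the OutputCollector.
(defn -map [this ^LongWritable key ^Text value
            ^OutputCollector collector ^Reporter reporter]
  (doseq [word (re-seq #"\w+" (str value))]
    (.collect collector (Text. word) one)))
```

The three higher layers progressively hide this machinery: imports shortens the `:import` lists, gen generates the classes, and wrap removes the Writable conversions.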