No, I am not running as root.
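Since this is a non-root tarball install, /var/log/flume likely will not exist. A minimal sketch of the places worth checking; the FLUME_HOME path below is an assumption, so point it at wherever you extracted the tarball:

```shell
# Sketch: candidate log locations for a non-root tarball install of Flume 0.9.x.
# FLUME_HOME is an assumption -- set it to your extracted tarball directory.
FLUME_HOME=${FLUME_HOME:-$HOME/flume-0.9.4-cdh3u2}
echo "check: $FLUME_HOME/logs"             # usual log dir for the daemon scripts
echo "check: /tmp/flume-${USER:-$(id -un)}" # some setups write scratch data under /tmp
# When started in the foreground with 'flume node_nowatch', the log also goes
# to the console, so re-running there and watching stdout for HDFS errors helps.
```

If neither directory has anything, the foreground console output is the most reliable place to see write errors.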


2012/1/22 Zijad Purkovic <zijadpurkovic@gmail.com>
If you're running it as root, they should be in /var/log/flume.

On Sun, Jan 22, 2012 at 11:35 AM, DIPESH KUMAR SINGH
<dipesh.tech@gmail.com> wrote:
> I am using flume-0.9.4-cdh3u2 and hadoop-0.20.2-cdh3u2.
>
> Where are the Flume logs usually located? I installed using the tarball.
> I checked the /tmp directory, where Flume has created a few directories, but
> they have no content.
>
> Thanks.
>
> 2012/1/22 Zijad Purkovic <zijadpurkovic@gmail.com>
>>
>> Try looking at the Flume logs on the agent and the collector. Also, what is
>> your Hadoop version? Because if you use a version other than CDH3, you
>> will need to add hadoop-core-"version".jar to your Flume lib directory
>> to be able to write to HDFS.
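For the non-CDH3 case described above, the jar swap can be sketched as below. Both paths are assumptions for a typical tarball layout, and the copy is only echoed (a dry run) so it can be reviewed before being executed:

```shell
# Dry-run sketch: staging the matching hadoop-core jar into Flume's lib dir.
# Both paths below are assumptions for a tarball install -- adjust to your own
# Flume and Hadoop versions before running the printed command.
FLUME_HOME=${FLUME_HOME:-$HOME/flume-0.9.4-cdh3u2}
HADOOP_CORE=${HADOOP_CORE:-$HOME/hadoop-0.20.2-cdh3u2/hadoop-core-0.20.2-cdh3u2.jar}
# Echoed rather than executed so the paths can be verified first:
echo cp "$HADOOP_CORE" "$FLUME_HOME/lib/"
```

With flume-0.9.4-cdh3u2 against hadoop-0.20.2-cdh3u2 the versions already match, so this step should not be needed in that setup.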
>>
>> On Sun, Jan 22, 2012 at 4:26 AM, DIPESH KUMAR SINGH
>> <dipesh.tech@gmail.com> wrote:
>> > Thanks Zijad.
>> >
>> > I tried this:
>> >
>> > localhost: text("/etc/services") | agentSink("localhost",35853);
>> > localhost: collectorSource(35853) |
>> > collectorSink("hdfs://localhost:8020/home","dservices11.txt");
>> >
>> > But still, I cannot find my file written in HDFS, and the web interface
>> > shows the command state "SUCCEEDED".
>> > Since the web interface reports "SUCCEEDED", I am unable to figure out
>> > where things went wrong.
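One thing worth checking, hedged since this is from memory of the 0.9.x behaviour: collectorSink("dir", "prefix") does not write a single file named exactly "prefix". It rolls its output on an interval (30 seconds by default) and writes files named with the prefix plus a timestamp/sequence suffix into the directory. So with the config above, look under /home for files *starting with* dservices11.txt rather than for that exact name. If the 0.9.x syntax allows it, an optional third argument sets the roll interval in milliseconds, e.g.:

```
localhost: collectorSource(35853) |
collectorSink("hdfs://localhost:8020/home", "dservices11-", 15000);
```

Then list the directory with hadoop fs -ls /home after the interval has passed.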
>> >
>> > --
>> > Dipesh
>> >
>> >
>> > On Sun, Jan 22, 2012 at 5:16 AM, Zijad Purkovic
>> > <zijadpurkovic@gmail.com>
>> > wrote:
>> >>
>> >> Instead of the dfs sink, try using:
>> >> collectorSink("hdfs://localhost:8020/home","some_filename")
>> >>
>> >> On Sat, Jan 21, 2012 at 6:23 PM, DIPESH KUMAR SINGH
>> >> <dipesh.tech@gmail.com> wrote:
>> >> > Hi All,
>> >> >
>> >> > I am new to Flume. I tried writing to HDFS with the following command
>> >> > from the command line, and it wrote to HDFS successfully:
>> >> >
>> >> > flume node_nowatch -1 -s -n dump -c 'dump: text("/etc/services") |
>> >> > dfs("hdfs://localhost:8020/home/dservices.txt");'
>> >> >
>> >> > To try out the collector, I ran the following commands from the
>> >> > multiconfig area of the web interface:
>> >> >
>> >> > localhost: text("/etc/services") | agentSink("localhost",35853);
>> >> > localhost: collectorSource(35853) |
>> >> > dfs("hdfs://localhost:8020/home/dservices11.txt");
>> >> >
>> >> >
>> >> > I might be missing something basic in between.
>> >> > But I am unable to write into HDFS, though the web interface shows the
>> >> > command state as "SUCCEEDED".
>> >> >
>> >> >
>> >> >
>> >> > Please guide me; I wish to move data [like plain text, PDF] from a
>> >> > node to HDFS.
>> >> >
>> >> > Thanks!
>> >> >
>> >> > --
>> >> > Dipesh Kr. Singh
>> >> >
>> >> >
>> >> >
>> >> >
>> >>
>> >>
>> >>
>> >> --
>> >> Zijad Purković
>> >> Dobrovoljnih davalaca krvi 3/19, Zavidovići
>> >> 061/ 690 - 241
>> >
>> >
>> >
>> >
>> > --
>> > Dipesh Kr. Singh
>> >
>> >
>> >
>> >
>>
>>
>>
>> --
>> Zijad Purković
>> Dobrovoljnih davalaca krvi 3/19, Zavidovići
>> 061/ 690 - 241
>
>
>
>
> --
> Dipesh Kr. Singh
>
>
>
>



--
Zijad Purković
Dobrovoljnih davalaca krvi 3/19, Zavidovići
061/ 690 - 241



--
Dipesh Kr. Singh