1. First, assign a static IP address to your network interface. If the machine has multiple interfaces, at least the one that clients will connect to should get a static IP.
2. Then write an /etc/dhcpd.conf file describing your subnets, the routers for those subnets, and various other configuration options.
This one's a good howto:
http://tldp.org/HOWTO/DHCP/x369.html
3. Start the DHCP server with /usr/sbin/dhcpd
For debugging, you can start it in the foreground with /usr/sbin/dhcpd -f -d (-f keeps it in the foreground, -d sends debug logging to standard error)
Another good resource:
http://www.chinalinuxpub.com/doc/www.siliconvalleyccie.com/linux-hn/dchp.htm
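To make step 2 concrete, a minimal dhcpd.conf might look like the sketch below. All addresses, ranges, and lease times here are made-up examples; adjust them to your own network.

```
# /etc/dhcpd.conf - minimal sketch (all values are illustrative)
ddns-update-style none;

subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;       # pool of addresses handed to clients
  option routers 192.168.1.1;              # default gateway for this subnet
  option domain-name-servers 192.168.1.1;  # DNS server(s) to advertise
  default-lease-time 600;
  max-lease-time 7200;
}
```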
Friday, April 16, 2010
Sunday, April 11, 2010
dbus
dbus is a message bus system: an 'interprocess communication' (IPC) mechanism.
why dbus? :-
1. It provides both method-call and 'signal' based IPC.
2. It allows communication between 'desktop applications' within the same session, as well as between the desktop session and the operating system.
3. Any application can register with the 'dbus daemon' at any time to publish 'services', and any application can subscribe to a given service at any time. Hence it provides great IPC flexibility to applications.
Mechanism? :-
On the bus, each application is represented as an object (e.g. a GObject when using the glib binding). For addressing, each object is given a name, called an 'object path'. An object path looks like '/abc/def', but you can design it however you like. Dbus has a heavily tested API, but the authors recommend using wrappers (e.g. the glib binding) rather than calling it directly. Internally, the dbus daemon and the applications communicate over sockets, and it is estimated that communication between 2 applications through the bus is about twice as slow as direct communication between them. (Makes sense, since it involves 2 socket hops as opposed to 1 in the direct case.)
Example:
I am outlining a hello-world example of asynchronous, signal-based communication using the glib binding.
Steps:
1. Describe your object interface (the services it provides: methods that can be called by other apps, and signals that this application will emit) in an XML file:
<node>
  <interface name="a.b.ciface">
    <signal name="sig1">
      <arg type="i" name="x" direction="out"/>
      <arg type="i" name="y" direction="out"/>
      <arg type="y" name="z" direction="out"/>
      <arg type="s" name="w" direction="out"/>
    </signal>
  </interface>
</node>
Here, 'a.b.ciface' is the interface name and 'sig1' is the name of the signal that your app will emit. x, y, z, w are the 'arguments' of the signal. Notice the 'type' field on the arguments: 'i' is a 32-bit integer, 'y' is a byte, 's' is a string, etc. The complete list can be found in the dbus specification [1]. Methods can be described similarly.
2. Generate server and client side header files using 'dbus-binding-tool'
$dbus-binding-tool --prefix=pqr --mode=glib-server app_iface.xml --output=server-stub.h --force
$dbus-binding-tool --prefix=pqr --mode=glib-client app_iface.xml --output=client-stub.h --force
Note that the prefix is important: it will be prefixed to your method (stub) names, and they have to be referred to accordingly in the server/client code.
3. Most of the time, you will have to generate a marshaller 'stub' for your type signature (for some basic type signatures, built-in functionality is available; check Devhelp [2] before you proceed). E.g., in the above interface description, I am sending {int, int, byte, string}. The stub (.c and .h) can be generated using a tool called 'glib-genmarshal'. The tool's man page, along with its examples, is a good resource for getting this done.
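As a sketch of what step 3 involves: glib-genmarshal reads a one-line signature file. Assuming the {int, int, byte, string} signature above maps to INT,INT,UCHAR,STRING in glib-genmarshal's notation (check its man page), the input file and the generation commands would look like:

```shell
# Write the marshaller signature file for our sig1(i, i, y, s) signal.
# (UCHAR for the dbus byte type is an assumption; verify in glib-genmarshal(1).)
cat > marshal.list <<'EOF'
VOID:INT,INT,UCHAR,STRING
EOF

# Generate the stubs (requires glib-genmarshal, so shown commented out here):
#   glib-genmarshal --prefix=pqr_marshal --header marshal.list > marshal.h
#   glib-genmarshal --prefix=pqr_marshal --body   marshal.list > marshal.c
cat marshal.list
```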
4. Now you are ready to write the server and client code. I will just provide a basic template here:
-------------------------------------------------------------
Server:
1. initialize g_type - g_type_init()
2. get connection to bus - dbus_g_bus_get()
3. create a proxy object to talk to bus itself - dbus_g_proxy_new_for_name()
4. register your service with dbus bus - dbus_g_proxy_call()
5. create a new g_object - g_object_new()
6. Associate dbus object path with this object - dbus_g_connection_register_g_object()
7. continue your code. whenever you want to fire a signal, call - g_signal_emit()
8. One important step: you have to register the marshaller stub that you created using glib-genmarshal. This has to be done in the 'class_init()' function using 'dbus_g_object_register_marshaller()'. Refer to the example code for this.
-------------------------------------------------------------
There is actually more work to be done, but it's better to look at sample code than to explain it here. So, refer to [3] for a detailed example.
-------------------------------------------------------------
Client:
1. Get a connection to bus - dbus_g_bus_get()
2. create a proxy object for server object - dbus_g_proxy_new_for_name()
3. Register your interest in particular signals - dbus_g_proxy_add_signal()
4. Connect a callback function for receiving/processing received signal - dbus_g_proxy_connect_signal()
--------------------------------------------------------------
References:
1. http://dbus.freedesktop.org/doc/dbus-specification.html#message-protocol-signatures
2. http://live.gnome.org/devhelp
3. http://maemo.org/development/training/maemo_platform_development_content/plain_html/ (This one is a very good and most helpful resource)
Thursday, April 8, 2010
compilation process
Yes, we have learned the standard compilation phases since childhood. Still, when $gcc hello.c (hmm.. considering a Linux platform) produces an executable, some questions may confuse you. E.g., a standard question asked of a newbie is 'what is there in stdio.h?'. Now, a dumb one would say 'definitions of functions like printf()', which is wrong!! But a smart guy will tell you that it contains 'declarations', not 'definitions'! Correct.. but then.. where are the definitions? If I call a function func() in my code which is 'declared' in a hello.h that I have included, it still gives an error.. the definition cannot be found. Then what about printf?
The answer is: there are some predefined libraries that are linked, and predefined paths that gcc searches for header files and libraries. The function definitions are found in such places. By default, gcc searches for header files at:
/usr/local/include/
/usr/include/
And for libraries at:
/usr/local/lib/
/usr/lib/
The standard C library is called 'libc.a'. On a Unix system, it is usually located at '/usr/lib/libc.a'. The '.a' means it is an archive file, which can be created using the 'ar' utility (haven't heard of it? try cross-compilation once and you will get to know it). Linking of the default libraries can be suppressed using the '-nodefaultlibs' switch to gcc.
Well, that clarifies standard header files, standard functions and pre-included libraries. But things get more and more complicated as you move from 'all code and headers in a single file' to 'code and headers in multiple files in the same directory' to 'code and headers scattered across multiple directories'. First you get 'file not found', then 'no declaration', then 'no definition' and finally 'multiple definitions!!!!' errors!!
For successful compilation, GCC needs
1. to find all the files mentioned as arguments.
2. to find all the header files that you have included.
3. to find the declaration of every function that you have used, exactly once.
4. to find the definition of every function that you have used, exactly once.
5. to find all the libraries needed (yeah.. would be covered in above points.. still)
How do we achieve this?
1. To find header files: the '-I' option is used to provide additional search paths for headers. Hence, if you have header files stored in '/x/y/z', you can say 'gcc -I/x/y/z ...' (the '...' here doesn't have any special meaning; it's just a placeholder) to make gcc look in '/x/y/z' for the needed header files.
2. To find source files: as far as I know, '.c' file paths are only resolved relative to the current directory. Hence, if you have a source file at '/a/b/c/src2.c', you have to use the complete path, e.g. 'gcc -I/x/y/z /a/b/c/src2.c ...' for compilation. (I don't know of any gcc switch for searching for .c files.)
3. To find libraries: first you have to link those libraries with the '-l' option. E.g., if your library is named 'libpqr.a' (the convention is to name a library file 'libYOURNAME.a'), use '-lpqr' to link it: 'gcc -I/x/y/z /a/b/c/src2.c -lpqr ...'. As for the search path, if the library is located at '/l/m/n/libpqr.a', use the '-L' switch to specify that path: 'gcc -I/x/y/z /a/b/c/src2.c -L/l/m/n -lpqr ...'
4. For multiple declarations (very important - this gave me a lot of pain a few years ago): use #ifndef, #define and #endif in your header files. E.g., if 'header1.h' ends up included more than once in the same source file (say directly and again via another header), you will get redefinition errors unless these three are used.
Hence, in your header file, use:
#ifndef HEADER1_H
#define HEADER1_H
// your declarations
#endif
to avoid redeclaration error.
5. To find all function definitions: each must exist either in your source files or in a linked library. To find each only once.. well, write it only once!!
Yup.. that's all I have to say right now!
Wednesday, April 7, 2010
autotools
I currently work with a large build system that has recursive makefiles, configure scripts, Makefile.am's, Makefile.in's, etc. So my work involves spending 1 unit of time writing the code and 2 to 3 units of time trying to make it compile amidst all that jargon.
Hence I decided to look a bit deep into it. Here is what I thought should be noted down.
Autotools are mainly useful when you are doing cross-platform development. It is a set of 3 tools:
1. Autoconf: creates a configure script that analyzes the system at compile time, e.g. whether 'cc' or 'gcc' is to be used.
2. Automake: generates the makefile that will compile your code.
3. Libtool: used to create shared libraries in a platform-independent way.
Summary of the build process:
./configure -> make -> make install :: is the standard build process.
1. "configure" script has to be generated. It is generated by 'autoconf'.
-- 'autoconf' needs a file called 'configure.ac' for this. Writing configure.ac by hand is difficult, hence 'autoscan' is used.
-- autoscan (run without arguments on the shell - $autoscan) generates 'configure.scan'. You will usually rename it to configure.ac.
-- Running 'autoconf' in a directory where 'configure.ac' is present creates the 'configure' script. (Other files are generated in the process; ignore them for the time being.)
-- Now we have the 'configure' script and can run './configure'. BUT the 'configure' script uses a file called 'Makefile.in' to generate the Makefile, hence this 'Makefile.in' needs to be present.
-- You will also need a 'config.h.in' if you want to make your program portable. It is generated using 'autoheader' ($autoheader). Then you can modify config.h as you want.
2. Makefile has to be generated. As mentioned above it is generated by 'configure' script by using 'Makefile.in'
-- Makefile.in is generated by the 'automake' tool from a file called 'Makefile.am'. 'Makefile.am' has to be written manually :-) (Writing 'Makefile.am' needs a bigger explanation, so let's assume here that it's written.)
-- In the same directory, run $aclocal followed by $automake (aclocal generates some macros that automake and autoconf need).
-- If automake generates errors, you will have to modify 'configure.ac' and rerun 'aclocal' and 'autoconf'.
3. Finally: 1. ./configure to generate the 'Makefile' -> 2. make -> 3. make install
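For a single hello-world program, the hand-written inputs can be as small as the sketch below. The project and file names are made up for illustration, and real configure.ac files usually start life as autoscan's configure.scan rather than being typed from scratch:

```
# configure.ac - input to autoconf (and aclocal)
AC_INIT([hello], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_CONFIG_HEADERS([config.h])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am - input to automake
bin_PROGRAMS = hello
hello_SOURCES = hello.c
```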
Resources:
1. http://markuskimius.wikidot.com/programming:tut:autotools
2. http://sources.redhat.com/autobook/autobook/autobook_toc.html