C++ "proper" include file structure

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,605
4,525
75
Also, I'm not sure what the reasoning is that you shouldn't just set things up so a single g++ command can compile the program. cmake and all these things just seem to overcomplicate the entire process; I'm just trying to figure out what the advantage actually is.
Theoretically, if you set things up right, when you change just one part of your program, you can compile just that part to a new .obj file. Then you can link it together with the old .obj files for other parts of your program without recompiling them. On large projects this can save a lot of time.
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
Theoretically, if you set things up right, when you change just one part of your program, you can compile just that part to a new .obj file. Then you can link it together with the old .obj files for other parts of your program without recompiling them. On large projects this can save a lot of time.

I mean, I wouldn't say "theoretically," as though this were some sort of black magic that one can only get to work by sacrificing an appropriate number of chickens. It is default behavior. Make builds only the things that are out of date, and it sorts and analyzes the dependencies so that the out-of-date things are built with their correct dependencies, and built only once. This is not trivial for complicated programs, and it really is pretty cool, and as you pointed out, it really can save LOTS of time.
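As a sketch of that default behavior, here is about the smallest Makefile that demonstrates it (hypothetical file names; note that recipe lines must start with a tab character):

```makefile
# Hypothetical layout: main.cpp and util.cpp share util.h.
# make rebuilds a target only when a prerequisite has a newer timestamp.
app: main.o util.o
	g++ main.o util.o -o app

main.o: main.cpp util.h
	g++ -c main.cpp -o main.o

util.o: util.cpp util.h
	g++ -c util.cpp -o util.o
```

Touch util.cpp and run make: only util.o is recompiled and app relinked. Touch util.h and both objects rebuild, because both list it as a prerequisite.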

Also, I'm not sure what the reasoning is that you shouldn't just set things up so a single g++ command can compile the program. cmake and all these things just seem to overcomplicate the entire process; I'm just trying to figure out what the advantage actually is.

I'm very confused about how you've come to this conclusion from the discussion we've had so far. Maybe it would be helpful to explain what, specifically, about make and makefiles you think is overcomplicating your workflow?
 

Crusty

Lifer
Sep 30, 2001
12,684
2
81
Theoretically, if you set things up right, when you change just one part of your program, you can compile just that part to a new .obj file. Then you can link it together with the old .obj files for other parts of your program without recompiling them. On large projects this can save a lot of time.

This x1000. Using a proper build system gives you automated dependency management. Change a .cpp file and it will only compile that translation unit, then re-link the new object file to any exes/libraries that depend on it. The entire point is to reduce the amount of compiling done when code is changed.

For projects where all the source code is in one directory and we're talking about < 10k lines of code, I could see getting by with a very minimal set of compile commands by hand...

I've worked on a project where the shared library we built was over 100MB and we had over 200 exes to link (lots and lots of test files). There was no way we would have survived if we didn't have a very robust and complex make system. You could do a clean optimized build, including a full test run, in about 20-30 minutes depending on how busy the machine was. For reference, the dev machine was a 12-core, 48GB RAM beast with a huge SAS array attached.

However, if you just wanted to work on the library code and run a single test file you could compile the thing in under 30s after your initial build of the library. All because it was smart enough to only compile the affected translation units and re-use the majority of the already compiled object files during linking.

Incremental building is a tremendous timesaver; there is a reason every single IDE out there implements it.
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
Also, I'm not sure what the reasoning is that you shouldn't just set things up so a single g++ command can compile the program. cmake and all these things just seem to overcomplicate the entire process; I'm just trying to figure out what the advantage actually is.

To be frank, that's because you don't really understand how C/C++ compilation and linking work. That's OK; most people out there writing C and C++ probably don't have a solid understanding of it either... but they were taught how to compile correctly, even if they don't know why.

Others have covered the incremental compilation part. That's a big deal, and you don't necessarily need a large project to see huge advantages from it. I did a project a couple of years back on the Raspberry Pi. The project was small (~5 KLOC of C++) but the Pi is slow. A full rebuild of the project easily took 7-10 minutes.

Another important issue that I've mentioned in previous posts is that your approach prevents standards-compliant C or C++ code from compiling correctly, and has the potential to introduce subtle, hard-to-find bugs.

This is really a case where you should just trust the fact that we're all telling you to switch and do it, especially if you have any desire to ever work on a team with other C/C++ devs or share your projects. C and C++ are designed to use separate compilation; the fact that your approach works most of the time should be regarded as a happy coincidence, at best.
 

Red Squirrel

No Lifer
May 24, 2003
70,075
13,528
126
www.anyf.ca
Ok well this veered off topic, as it was originally about file structure more than what tools are used to compile. But I figured I'd update: what I ended up doing is going with the original recommended structure I found online, which is to have every class, function, etc. in its own .h and .cpp file. The .h has the prototype of the class/function and the .cpp file has the actual code that goes with it. In some cases I might cheat a bit and have a "functions.h" and "functions.cpp" that just have a bunch of basic functions instead of splitting them up. Each .cpp file includes its own .h file, and potentially any other .h that may be required, but typically forward declaration can solve that, so I expect the number of extra includes to be small. Now, I'm not sure if I should also include all the stuff like iostream in those, or leave that to the main program. I think I will leave those out, as it's a given they will be included in the main program. Though if I were to include all that stuff, it would make it possible to compile any .cpp file on its own.

As far as bringing all the .cpp files together so the main program sees them when I compile, I will use my inventory app, which simply goes through my "includes" folder and generates a sources.h file that includes all the .cpp files. I used to also have an includes.h that included all the .h files, but I won't need that anymore. The program still has an option to do it, but it's off by default. I then include the .h file in my main program below the system includes.

Eventually I will still look into make scripts that have recursive compiling, but I use my inventory program anyway, as it takes care of tagging each file with a header with author and other basic stats like lines of code, so either way I'd probably be using my program. This program just kills two birds with one stone, basically. It creates headers similar to this:

Code:
/**********************************************************
PostSort C++ source file
No copyright - feel free to distribute, modify etc...
Last modified by Red Squirrel on Jan-23-2016 01:05:45am
Checksum: 87651C4061468B921D49A163597C6F2
Filepath: includes/functions.cpp
Lines of code: 5

Description: 

***********************************************************/

The checksum is a new addition I just added. It's basically used to determine if the file changed, so the program doesn't touch it if it didn't. It used to go by lines of code before, but that wasn't a very reliable way to tell.

I do want to look into make and all that stuff, but the more I read on it the more involved it sounds. It just looks like a whole other complex skillset to learn, when I could be spending more time learning actual coding instead. I'll want to learn it when I get more serious about releasing programs to the public, though, as the "./configure, make, make install" way is pretty much universally used.
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
Ok well this veered off topic, as it was originally about file structure more than what tools are used to compile.

The two are directly connected. To quote your first reply on the thread:

I'm talking more about how to structure it so that when you compile the main program all the files needed are also included. Ex: having a master include file that includes all the .h and .cpp files etc.

You're having to deal with this precisely because you aren't using a proper build system.
 

Crusty

Lifer
Sep 30, 2001
12,684
2
81
Ok well this veered off topic, as it was originally about file structure more than what tools are used to compile. But I figured I'd update: what I ended up doing is going with the original recommended structure I found online, which is to have every class, function, etc. in its own .h and .cpp file. The .h has the prototype of the class/function and the .cpp file has the actual code that goes with it. In some cases I might cheat a bit and have a "functions.h" and "functions.cpp" that just have a bunch of basic functions instead of splitting them up. Each .cpp file includes its own .h file, and potentially any other .h that may be required, but typically forward declaration can solve that, so I expect the number of extra includes to be small. Now, I'm not sure if I should also include all the stuff like iostream in those, or leave that to the main program. I think I will leave those out, as it's a given they will be included in the main program. Though if I were to include all that stuff, it would make it possible to compile any .cpp file on its own.

As far as bringing all the .cpp files together so the main program sees them when I compile, I will use my inventory app, which simply goes through my "includes" folder and generates a sources.h file that includes all the .cpp files. I used to also have an includes.h that included all the .h files, but I won't need that anymore. The program still has an option to do it, but it's off by default. I then include the .h file in my main program below the system includes.

Eventually I will still look into make scripts that have recursive compiling, but I use my inventory program anyway, as it takes care of tagging each file with a header with author and other basic stats like lines of code, so either way I'd probably be using my program. This program just kills two birds with one stone, basically. It creates headers similar to this:

Code:
/**********************************************************
PostSort C++ source file
No copyright - feel free to distribute, modify etc...
Last modified by Red Squirrel on Jan-23-2016 01:05:45am
Checksum: 87651C4061468B921D49A163597C6F2
Filepath: includes/functions.cpp
Lines of code: 5

Description: 

***********************************************************/

The checksum is a new addition I just added. It's basically used to determine if the file changed, so the program doesn't touch it if it didn't. It used to go by lines of code before, but that wasn't a very reliable way to tell.

I do want to look into make and all that stuff, but the more I read on it the more involved it sounds. It just looks like a whole other complex skillset to learn, when I could be spending more time learning actual coding instead. I'll want to learn it when I get more serious about releasing programs to the public, though, as the "./configure, make, make install" way is pretty much universally used.

I'm sorry, but you are still doing this wrong. Not only are you doing it wrong, but you won't listen to anyone, and now you are starting to re-implement features of Make.

I guess if you want to create a new build system, be my guest, but that's not what this thread was about. You asked for help on compiling a C++ program, and everyone here has given you great advice, but you ignored it all.
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
You're having to deal with this precisely because you aren't using a proper build system.

100% agree.

now you are starting to re-implement features of Make.

"Those that do not understand UNIX are doomed to reinvent it poorly"

You're trying to implement the functionality of make by hand while simultaneously claiming that make is too time-consuming to learn.

Part of programming is having a build system. I understand that if you're a person with limited free time to spend on this, you don't want to get sidetracked, but in your effort to avoid wasting time, you're making some penny-wise, pound-foolish decisions.
 

postmortemIA

Diamond Member
Jul 11, 2006
7,721
40
91
Not understanding how make works = you won't be appreciated on team projects that use open-source dev environments.

There's a decent online manual from the authors. Just reading it for a few hours would make a lot of difference. The authors recommend starting with just the first portions of every chapter.
 

Red Squirrel

No Lifer
May 24, 2003
70,075
13,528
126
www.anyf.ca
Like I said, I'm still going to eventually learn it, and the advice is not ignored; I just don't feel like spending too much time on it now, as it seems quite involved and most of the projects I'll be working on are rather small, so it's overkill to learn a whole new scripting language just so I can compile a program that's under 10k lines. Once I get into projects that are in the hundreds of thousands of lines of code, and it gets to a point where compiles are taking a minute or so, then maybe I'll look into make and actually use it to a good extent, such as creating separate targets so I can compile only what I changed, and so on. Learning Make for a small program is like using an excavator and reading through the whole operations and safety manual, just to plant a garden.
 

Crusty

Lifer
Sep 30, 2001
12,684
2
81
You're completely missing the point. It's not about complexity or length of code.

If you can't type the command to compile your entire program from memory into your shell, you need to be using Make. If your inclination is to write a shell script to compile your program, you need to be using Make. If you resort to writing a tool that generates a hacked-together include file to make sure everything gets compiled, you need to be using Make.
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
You're completely missing the point. It's not about complexity or length of code.

If you can't type the command to compile your entire program from memory into your shell, you need to be using Make. If your inclination is to write a shell script to compile your program, you need to be using Make. If you resort to writing a tool that generates a hacked-together include file to make sure everything gets compiled, you need to be using Make.

Thisssss exactly this.

It feels like you're not actually reading the replies here. Make is easy, and it scales both up and down. If you don't want to use it, don't use it, but don't pretend you're saving yourself time and energy by not using it. Especially when your alternative seems to be self-penned shell scripts recursing through directories and compiling based on checksums in comments at the top of your source files.
 
Sep 29, 2004
18,656
67
91
Red,

The way you are thinking about things (in the first half of the discussion anyway) is wrong.

But, if you want to do it that way, you have to be diligent about file protectors (include guards, header guards, whatever you want to call them) in your header files. Then you would need one master .cpp file that references every .cpp file. Additionally, each .cpp file is to reference only the .h files it needs. That's the secret juice.

The compiler will essentially treat your code as one giant text file at that point. Think of #include "" as a copy/paste of code, because that is what it is to the compiler. And if you include the header files multiple times, the compiler will skip every inclusion after the first because of the file protectors.

FWIW: There are also ANT extensions for compiling C code; they basically wrap make. make is not hard. You should spend the 4 hours it would take to get the basics down. You probably only need the basics. It is literally along the lines of telling it what files to compile, then telling it what compiled files to link. You can tell it to search for all *.cpp files too, so it's not that difficult to whip together a makefile that is 30 lines long and probably does everything that you want.

And if bored, you can even read up on compiler options! That's where the real fun is.

Honestly, a new college grad who knew how to do this and could prove so in an interview would probably impress me more than the 4.0 student who can't do anything outside of code to save their ass.
 
Last edited:

Red Squirrel

No Lifer
May 24, 2003
70,075
13,528
126
www.anyf.ca
Red,

But, if you want to do it that way, you have to be diligent about file protectors (include guards, header guards, whatever you want to call them) in your header files. Then you would need one master .cpp file that references every .cpp file. Additionally, each .cpp file is to reference only the .h files it needs. That's the secret juice.

The compiler will essentially treat your code as one giant text file at that point. Think of #include "" as a copy/paste of code, because that is what it is to the compiler. And if you include the header files multiple times, the compiler will skip every inclusion after the first because of the file protectors.

Yep, that's pretty much how I've always understood it to work. Though I did learn about "#pragma once" in this thread, instead of ifdefs, so from now on I'll use that instead; it seems like a much cleaner way of doing it.

Typically my command string is

g++ app.cpp -o app

So I could easily type it each time, but I usually just make a script called rebuild.sh. A makefile would essentially be doing the same thing. Though I still want to learn it eventually; it just wasn't my goal right now. For cross-platform programs I'll still want to use my way, though, as I'll want to be able to compile on Windows too.
 

Crusty

Lifer
Sep 30, 2001
12,684
2
81
Yep, that's pretty much how I've always understood it to work. Though I did learn about "#pragma once" in this thread, instead of ifdefs, so from now on I'll use that instead; it seems like a much cleaner way of doing it.

Typically my command string is

g++ app.cpp -o app

So I could easily type it each time, but I usually just make a script called rebuild.sh. A makefile would essentially be doing the same thing. Though I still want to learn it eventually; it just wasn't my goal right now. For cross-platform programs I'll still want to use my way, though, as I'll want to be able to compile on Windows too.

No, a make file is not essentially doing the same thing.

How do you add a new .h/.cpp file to your project? With a proper Makefile you just type 'make' and everything compiles and works.

When you edit one file do you have to compile your entire project? With a proper Makefile issuing the command 'make' will only compile what needs to be compiled.

How do you run your test files? With a proper Makefile you can just run 'make test' and have all your test files run.

How do you build your project in debug mode so you can use a debugger properly? With a proper Makefile you would just run 'make debug' and all of a sudden you've got a debug build.

How do you install your binaries once you've got a successful build? With a proper Makefile you would just run 'make install' and all the appropriate files will be copied to their installation location.
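A sketch of what those targets could look like in one makefile. All names here are hypothetical (in particular, run_tests.sh stands in for whatever test driver a project has), and recipe lines must start with a tab character:

```makefile
# Hypothetical single-program project.
CXX      = g++
CXXFLAGS = -O2 -Wall
PREFIX   = /usr/local
OBJECTS  = main.o util.o

app: $(OBJECTS)
	$(CXX) $(OBJECTS) -o app

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $< -o $@

.PHONY: debug test install clean
debug: CXXFLAGS = -g -O0 -Wall    # target-specific variable override
debug: clean app

test: app
	./run_tests.sh                # hypothetical test driver script

install: app
	install -m 755 app $(PREFIX)/bin/app

clean:
	rm -f app $(OBJECTS)
```

The 'debug' target uses a GNU make target-specific variable so the same rules compile with debug flags; forcing 'clean' first keeps the optimized and debug objects from mixing.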

If cross-platform is your priority, then I would suggest learning CMake in addition to Make. First you use CMake to generate your platform-specific build environment, then you use said environment to build on the local platform. You can generate Visual Studio projects, GNU-style Makefiles, and a whole host of other types of output with CMake.
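A minimal CMakeLists.txt sketch for that workflow, with hypothetical project and source names:

```cmake
# Hypothetical single-target project.
cmake_minimum_required(VERSION 3.10)
project(app CXX)

# One executable built from explicit sources. CMake discovers header
# dependencies itself and generates the native build files: Makefiles
# on Linux, a Visual Studio solution on Windows, etc.
add_executable(app main.cpp util.cpp)
```

From an empty build directory, `cmake path/to/source` generates the native build files, and `cmake --build .` compiles with whichever tool was generated.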
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
Learning Make for a small program is like using an excavator and reading through the whole operations and safety manual, just to plant a garden.

True.
The trick is to start small.
Get a Makefile that does just what you want. Nothing fancy. Soon you'll learn to appreciate it. Then later you will find a little thing you'd like Make to do as well. Then you learn one more little detail about Make and implement your one small addition. And over the years you'll learn more and more.

To get started, all you need is a simple Makefile that compiles your code.
Here is a basic version of the Makefile I always use.

Code:
#
# Makefile for proggy
#
# Proggy is the name of this project.
# The name of the executable will be proggy too.
#
#
CC = gcc -g -Wall -Werror
RM = rm -f
DEPEND = gccmakedep -fMakefile

COMPONENT = proggy

HEADERS = proggy.h proggy_private.h proggy_config.h proggy_externdefs.h

DEPENDENCIES = $(HEADERS)

SOURCES = proggy_main.c proggy_util.c proggy_identifiers.c proggy_packet.c proggy_show.c proggy_config.c

OBJECTS = proggy_main.o proggy_util.o proggy_identifiers.o proggy_packet.o proggy_show.o proggy_config.o

all: $(COMPONENT)

$(COMPONENT): $(OBJECTS) $(DEPENDENCIES)
        @echo "Making $@"
        $(CC) $(LDFLAGS) -o $@ $(OBJECTS)


#
#       Handy stuff.
#
#
lint:
        lint -I$(INCLUDE) $(SOURCES)

tags:   $(SOURCES) $(HEADERS)
        etags *.c *.h
        mkid *.c *.h

clean:
        $(RM) $(COMPONENT) *.o core a.out a.exe Makefile.bak *.stackdump

tidy:
        $(RM) *.o core a.out a.exe Makefile.bak *.stackdump *~

depend:
        $(DEPEND) -I$(INCLUDE) $(SOURCES)

#
# Here are the automatically generated dependencies.
#
# DO NOT DELETE
What you need to do is:
1) Change the line "COMPONENT = proggy" and give it the name of the executable (or library) you are trying to build.
2) Change the lines for SOURCES, HEADERS and OBJECTS. Include your .cpp, .h and .o files.
3) Start with a "make clean".
4) Do a "make depend".
5) Look inside the Makefile. "make depend" should have added lines to the bottom of the Makefile with the dependencies
6) Do a "make". It should compile and link everything.
7) Do a "touch something.c" to update the timestamp of the file. Do a "make" again. See what needs to be done to create a new end result.

You might need to change a few things, e.g. if you use a C++ compiler instead of regular gcc. If you want other flags for your compile, change the CC= line. I'm sure some details of my Makefile could be done differently or better. But this is a minimal Makefile that is applicable to most programs.

Now all you need is a simple Makefile for your main directory. Something with a loop that goes through all your sub-directories and does a make in each. Maybe we'll have another volunteer?
 
Last edited:

Merad

Platinum Member
May 31, 2010
2,586
19
81
Semi-OT, but you don't need any additional tools to handle compilation dependencies. Add '-MP -MMD' to the compiler flags when generating object files. For every object file Foo.o, gcc will also output a dependency file Foo.d containing make rules for its dependencies. Then your makefile needs the line '-include $(OBJECTS:.o=.d)', instructing make to look for the *.d files and dynamically merge them with the rules in your makefile.
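Putting that together, a sketch of the auto-dependency pattern with hypothetical file names (recipe lines tab-indented):

```makefile
CXX      = g++
CXXFLAGS = -Wall -MMD -MP     # emit Foo.d alongside every Foo.o
OBJECTS  = main.o util.o

app: $(OBJECTS)
	$(CXX) $(OBJECTS) -o app

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $< -o $@

# Pull in the generated rules; the leading '-' suppresses errors on the
# very first build, before any .d files exist yet.
-include $(OBJECTS:.o=.d)
```

After the first build, editing a header listed in main.d causes make to rebuild main.o automatically, with no hand-maintained dependency lines.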
 