In this article Jeff Atwood talks about how to become better at designing and writing software by designing and writing a lot of software. He states it more elegantly, but it is a valid point: the more you do something, the better you will get at it.
Any modern PC/Mac hardware should suffice. If you plan to run Windows or Linux, I would use a PC over a Mac. There is a lot of clamor over which is better, but use the one you like best.
It should be a moot point in this day and age, but make sure you have some sort of reliable internet connectivity (cable, DSL, whatever...). Then you will have access to Google and Stack Overflow, both good resources for programmers.
Make sure you have a keyboard and mouse that are comfortable for you. This includes setting up your desk and chair to accommodate your height and hand position. You will be at the computer for long stretches of time, and you want to be comfortable.
Choose an editor: Vim, Emacs, Kate, Eclipse, whatever. It doesn't really matter which one, but whichever one you pick, learn it well. The editor is your main tool, and you want to be comfortable and knowledgeable when using it. The better you know your editor, the faster you can create and edit code.
It helps to have an editor that runs on all platforms you may be developing on, but it is not necessary.
At some point you will find yourself face to face with having to create or fix a build system. Make is pretty standard for *nix and C/C++, but for your own personal projects find the one that suits you best. There are a lot to choose from: SCons, Ant, Make, Jam, ...
I personally use SCons, since it is Python based, and I like Python.
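To give a feel for it, here is a minimal SConstruct (the Python file SCons reads to drive a build); the source file name hello.c is just an assumption for illustration:

```python
# SConstruct -- SCons reads this Python file to drive the build.
# Assumes a single C source file, hello.c, in the same directory.
Program('hello', ['hello.c'])   # builds the 'hello' executable from hello.c
```

Running `scons` in the directory containing this file compiles and links the program; `scons -c` cleans up.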
When learning a new topic I would recommend getting a good book on it. This will generally give you a good overview of what you are getting into, and give you a good foundation to learn from. Google and Stackoverflow are good for specific questions, but a general overview of a topic is harder to get.
This of course assumes you have the luxury of time and money. For the monetarily constrained you can often find free versions of electronic books online.
I used to have strong feelings about which languages to learn, but now I realize you should write in the language you enjoy most. However, don't be afraid to try new languages. I personally like C++, Python, and C#, in no particular order.
Since you are just starting out, pick languages you can get for free, which these days is most of them.
In the business world the language of choice tends to fluctuate on about a 5-7 year cycle. However, you can find a job (at least currently) in all the "big" languages (C++, Java, C#, VB.NET, Python, Ruby, Perl, ...). If you learn one modern language well, it is typically not a problem to transition to another quickly; the libraries tend to take longer to learn than the language itself. So pick a language you like learning in and learn it well.
As Marc Charbonneau said, set up source control. There are plenty of free source control offerings, so pick the one you like best. Personally I use Perforce, which is free for two or fewer users. I have heard good things about Subversion and Git as well. The specific one is not as important; just choose one of them.
If you want a deeper knowledge of computing, I would recommend Sipser's book and Knuth.
Whichever language you choose I would spend time learning the debugger for it as well.
If you are doing web development, make sure you know how to minimally set up and run Apache (or IIS).
Avoid holy wars if you can. They are a waste of time, and you don't learn anything from them except that people are stubborn. Some holy war topics: bracket style, editors, endianness, "best" language, "best" OS, ...
My personal setup:
Standard PC (Windows XP Pro)
Standard PC (FreeBSD runs headless: no GUI)
Your first recurrence relation, T(n) = a*T(n/b) + f(n), is normally used to describe the running time of divide-and-conquer algorithms. Here a shows how many parts you are dividing your data into, 1/b shows what fraction of the original data is used in each part, and f(n) shows how much time you need on each "level". For example, in the QuickSort algorithm you divide your data (array or list) into 2 parts, each of which is exactly half (1/2) of the original data, and on each "level" of dividing you need to go through all n elements once. So the recurrence relation is
T(n) = 2T(n/2) + n
(which evaluates to O(n * lg(n)))
And in binary search you divide the data into 2 parts but take only 1 of them, and the time on each "level" is constant, so the relation is:
T(n) = T(n/2) + 1
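To make that concrete, here is a sketch of a recursive binary search in C (the function and variable names are illustrative, not from the original question). Each call does a constant amount of work and then recurses on only one half of the range, which is exactly T(n) = T(n/2) + 1:

```c
/* Recursive binary search over a sorted int array.
   Each call does constant work, then keeps only one half
   of the range, so the recurrence is T(n) = T(n/2) + 1. */
int bsearch_rec(const int *a, int lo, int hi, int key)
{
    if (lo > hi)
        return -1;                    /* empty range: key not found */
    int mid = lo + (hi - lo) / 2;     /* constant work per "level"  */
    if (a[mid] == key)
        return mid;
    if (a[mid] < key)
        return bsearch_rec(a, mid + 1, hi, key);  /* keep right half */
    return bsearch_rec(a, lo, mid - 1, key);      /* keep left half  */
}
```

For example, on the sorted array {1, 3, 5, 7, 9, 11}, `bsearch_rec(a, 0, 5, 7)` returns index 3 after halving the range twice.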
However, your function in C doesn't follow the divide-and-conquer strategy. Instead it demonstrates mathematical induction. You define the start conditions: if l equals NULL, then the length is 0; and if l->next equals NULL, then the length is 1 (the condition for one element in the list). Then you define a kind of inductive step: you define how to compute the function for the nth element if you know its value for the (n - 1)th element. So, knowing the value of the function for the 1st element, you can apply this rule and get the value for the 2nd element, then the 3rd, and so on.
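The original question's code is not reproduced here, but a function of the shape described — two base cases plus an inductive step — might look like this sketch (the struct node type and all names are assumptions):

```c
#include <stddef.h>   /* for NULL */

/* A singly linked list node; the type name and fields are
   assumptions, since the original question's code is not shown. */
struct node {
    int value;
    struct node *next;
};

/* Recursive length, structured exactly like induction:
   two base cases, then the inductive step. */
int length(struct node *l)
{
    if (l == NULL)
        return 0;                 /* base case: empty list            */
    if (l->next == NULL)
        return 1;                 /* base case: exactly one element   */
    return 1 + length(l->next);   /* step: value for n elements from  */
}                                 /* the value for n - 1 elements     */
```

Each base case returns in constant time, and each inductive step adds constant work on top of one recursive call on a list that is one node shorter.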
So you can see that the induction method is recursive by nature. Here we have 2 times to consider. The first is the time to compute the value at the start point (or end point; it depends on the order in which you traverse the list). In your function you just return this value, so it is constant (C1). The second is the time to make 1 step. In your function it is also constant (C2). Intuitively you should see that the execution time of this algorithm will be:
T(n) = C1, when n = 1
T(n) = T(n - 1) + C2, when n > 1
Thus, except when n = 1, you define the time to execute the algorithm on n elements through the time to execute it on n - 1 elements. In big-O notation every constant is treated as 1 and the base case of 1 element is dropped, so your final recurrence relation is:
T(n) = 1, when n = 1
T(n) = T(n - 1) + 1, when n > 1
(which evaluates to T(n) = T(n - 1) + 1 = T(n - 2) + 2 = ... = T(1) + (n - 1) = n, or O(n))
Introduction to Algorithms - by Cormen, Leiserson, Rivest, and Stein
The Art of Computer Programming - by Donald Knuth (a hard read, but well worth it; not recommended as a first algorithms book)
Concrete Mathematics - by Graham, Knuth, and Patashnik (understanding the math behind algorithms)
I don't know if e-book versions are available for these, but if they are, these books will definitely give you the theory behind worst-case and asymptotic analysis of algorithms.