I started a job as an app developer for Android. After a week they figured that I'm not fast enough at getting the apps ready, so they fired me and offered me an internship, for free.
I'm wondering how I can get better fast. Should I read more (I'm missing architecture knowledge, I think)? What should I read? Should I just start coding stuff, or should I do a course of some sort?
I'm reading Head First Design Patterns. Will Java design patterns help me with Android, or are they not applicable to Android? Basically I'm at home reading Android stuff. The company said they would take me back when I'm good enough, so I need to get better quickly.
The best way of learning (IMHO) is by looking into already-written projects and dissecting them. Try to understand how the program is put together. Then work on some tutorials and expand them!
Resources:
Java: http://download.oracle.com/javase/tutorial/java/index.html
Android tutorials: http://p-xr.com
I assume you are the only developer in a very small company. If I'm wrong and there are multiple developers then they should be doing design and architecture and you shouldn't need to worry about that--just code the sections they give you to code.
So the rest of this is based on the assumption that you are the sole developer.
Don't worry about architecture or design or anything--Get a GUI together as fast as you possibly can--it's all about what you can show. Even if it has nothing at all behind it, it's visible progress.
A well-architected solution will save you bunches of time in the future, but from the sound of it you don't have the experience, and a well-architected job is also front-loaded: a lot more time is spent before you can show anything. Doesn't sound like that will make your employers happy.
I'd actually spend 2 or 3 hours some evening researching the project management and design portions of XP (Extreme Programming). You may have to do this on your own time at first--but then if you're getting paid $0, all your time is your own time right now.
XP is all about your process being transparent to your customer (in this case your customer is your employer). This allows them to understand why something might take time and it actually allows them to make decisions on the fly to correct problems or speed the process by eliminating features.
I'd start by writing down each task on a piece of paper. Each of these tasks should take a couple of hours to a day or two, so make them pretty fine-grained, but don't write down a time estimate. Instead, when you are done with the cards, go through them and rank them all in difficulty from 1 to 5 (the purpose becomes clear when you see how XP does time estimation).
Then you start interacting with your employer--they choose which tasks they want done for the day/week. You do them.
As you are executing tasks you may encounter new ones or have to break some up. This is expected--make more sheets of paper. If your employers want to know when it will be ready, look into how to do burndown charts and time prediction. In XP this practice does a good job of accounting for the fact that programmers are bad at estimating time (it uses your history and the 1-5 difficulty ranking to determine how long a "2" actually takes you to do).
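As a rough sketch of what that history-based estimation could look like in code (all task names, difficulties, and hours are invented for illustration):

```java
// Sketch only: record how long finished tasks of each difficulty actually took,
// then use those averages to predict the remaining work.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BurndownEstimate {

    static class Task {
        final String name;
        final int difficulty;      // your 1-5 ranking
        final double actualHours;  // filled in once the task is done
        final boolean done;

        Task(String name, int difficulty, double actualHours, boolean done) {
            this.name = name;
            this.difficulty = difficulty;
            this.actualHours = actualHours;
            this.done = done;
        }
    }

    public static void main(String[] args) {
        List<Task> tasks = new ArrayList<>();
        tasks.add(new Task("Login screen", 2, 5.0, true));
        tasks.add(new Task("Save settings", 2, 4.0, true));
        tasks.add(new Task("Sync with server", 4, 16.0, true));
        tasks.add(new Task("Offline mode", 4, 0.0, false));
        tasks.add(new Task("About dialog", 1, 0.0, false));

        // Average actual hours per difficulty rank, based only on finished tasks.
        Map<Integer, Double> totalHours = new HashMap<>();
        Map<Integer, Integer> count = new HashMap<>();
        for (Task t : tasks) {
            if (t.done) {
                totalHours.merge(t.difficulty, t.actualHours, Double::sum);
                count.merge(t.difficulty, 1, Integer::sum);
            }
        }

        // Predict remaining work from your own history; fall back to a guess
        // (difficulty * 4 hours) for ranks with no data yet.
        double remaining = 0.0;
        for (Task t : tasks) {
            if (!t.done) {
                Double total = totalHours.get(t.difficulty);
                Integer n = count.get(t.difficulty);
                remaining += (total != null) ? total / n : t.difficulty * 4.0;
            }
        }
        System.out.printf("Estimated remaining work: %.1f hours%n", remaining);
    }
}
```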
Soon after implementing this practice, decide on what metrics they want you to maintain in order to get paid, or see if they want to pay you per task.
On the other hand, they are only treating you like that because you let them. You're fired now; start looking for another job. If you find one, just walk away.
The best way to learn is by doing. All the reading will get you nowhere if you don't practice.
And the design patterns may help, but much of the classical wisdom may not directly apply - if only because the devices have limited memory.
Take the JavaBeans approach, where a bean has to have a getter and setter. On Android (before 2.3, IIRC), this had a negative impact on performance, so making fields public and using them directly is preferred.
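For illustration, the difference is simply this (class and field names are invented); the old Android performance guidance recommended the first style for hot code paths:

```java
// Plain public fields: on older Android versions, before trivial accessors could
// be inlined, reading these directly was measurably faster than calling a getter.
public class Position {
    public int x;
    public int y;
}

// The classic JavaBeans style: every read and write goes through a method call.
class PositionBean {
    private int x;
    private int y;

    public int getX() { return x; }
    public void setX(int x) { this.x = x; }
    public int getY() { return y; }
    public void setY(int y) { this.y = y; }
}
```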
To learn, take an open source project (e.g. my Zwitscher app), use it, try to find out what you want to improve and improve it.
Firstly, make sure that your employer was justified in his reasons for firing you. Some of the questions you should be asking them are:
- Were the deadlines not respected? In this case, if you did give them timelines for completion of phases of your work, then you may need to look at improving your effort estimation. If they gave you timelines, see if they were reasonable
- Were features added to your work on an ad-hoc basis? Many times I've found that a freeze on requirements is consistently broken
- If you are slow and haven't picked up some of the basics, then come up with a sample application that closely resembles your employer's application, and DO it. Use SO, developer docs and books to clear doubts and hone your skills.
Best of Luck with Programming :)
Some background.
I am currently working on a distributed system in Java. The code is mostly all hand-rolled, and consists of a client-server framework and an application to be served. The application is for running a turn-based system.
The design issues.
The application is supposed to be almost completely configurable. The base of the program is a processor that will process objects in a linked-list style, moving from one object in the chain to the next until the processing is done. I have already written a simple blackjack game with it, nothing fancy, just something to implement the API. The problem I am having is that I am unsure about the granularity of the objects. I know I do not want to write a programming-language-style architecture (for example, classes to represent add, subtract, and other language constructs), but I want the architecture to be flexible enough that a user can specify a program, within its capabilities, in a file and have this construct instantiate the proper classes, link them, and then run the pseudo-program.
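For concreteness, the processor I have so far looks roughly like this (heavily simplified; interface and class names are just placeholders, and the configured class names would really come from a file):

```java
// Sketch of the chain-style processor: each step works on a shared context and
// hands control to the next one; the chain is assembled from configured names.
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

interface Step {
    void process(GameContext context);
}

class GameContext {
    // Whatever shared state the steps need: players, deck, scores, ...
    public int round = 0;
}

class Processor {
    private final List<Step> chain;

    Processor(List<Step> chain) {
        this.chain = chain;
    }

    void run(GameContext context) {
        // Walk the chain in order; a fancier version could let a step
        // redirect to another step instead of always moving forward.
        for (Step step : chain) {
            step.process(context);
        }
    }
}

class DealCards implements Step {
    public void process(GameContext context) {
        System.out.println("Dealing cards for round " + context.round);
    }
}

class ScoreRound implements Step {
    public void process(GameContext context) {
        context.round++;
        System.out.println("Scoring; next round is " + context.round);
    }
}

class Main {
    public static void main(String[] args) throws Exception {
        // In the real thing these names would be read from a config file.
        List<String> configured = Arrays.asList("DealCards", "ScoreRound");
        List<Step> chain = new ArrayList<>();
        for (String name : configured) {
            chain.add((Step) Class.forName(name).getDeclaredConstructor().newInstance());
        }
        new Processor(chain).run(new GameContext());
    }
}
```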
Any suggestions and/or ideas would be helpful.
Thanks in advance,
Zix
This kind of thing is pretty tough, mostly because the requirements are so broad. When we look at successful, real-world platforms that are highly flexible and configurable, it's too easy to think that they started that way, or that you could design them from the beginning to be that way.
This is not the case.
The reality is more about picking a subset of problems. In your case, not turn-based games in general, or even turn-based card games, but blackjack, followed by some other game (bridge, poker, whatever), and then you build upon that.
Let your application (and its features) grow based on actual needs, and not the set of problems you think you will need to solve. Focus on the problems that are in front of you as you try to build actual games. Over time, you'll figure out which parts to abstract (and which parts not to abstract), and in the meantime you'll actually have something to show for it.
Go the other way, and you'll have an interesting idea, an API no one uses to solve actual problems, and a bunch of code but no games. Don't fool yourself into thinking that no one has tried to build the platform for turn-based games. It's been tried 1,000 times. The reason it doesn't really exist is because building platforms without clear goals (i.e. a set of requirements leading to a finished game) leads you to software without a clear use.
That means they don't get used, games don't get built, you never hear about them, and therefore they aren't successful. Successful, flexible platforms have a long list of successful applications built using them, at many stages throughout their lifetime.
Is it wise to develop a prototype GUI before designing other part of the system?
I am using Java for this small project. It will be a program with GUI and database connection. Say the database has table A and B, the user can choose which table to interact with. The program then display the contents of, say, table A in the GUI, and allows the user to change the content and submit the changes, or delete, or insert.
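Roughly, the kind of thing I have in mind looks like this (a sketch only; the JDBC URL, table name and columns are placeholders, and the real program would let the user pick table A or B):

```java
// Sketch: load one table into a Swing JTable so the user can see its rows.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;
import javax.swing.JFrame;
import javax.swing.JScrollPane;
import javax.swing.JTable;
import javax.swing.table.DefaultTableModel;

public class TableViewer {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string - substitute the real database here.
        Connection con = DriverManager.getConnection("jdbc:h2:mem:demo", "sa", "");

        DefaultTableModel model = new DefaultTableModel();
        try (Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM TABLE_A")) {
            ResultSetMetaData meta = rs.getMetaData();
            // Column headers come straight from the result set metadata.
            for (int i = 1; i <= meta.getColumnCount(); i++) {
                model.addColumn(meta.getColumnName(i));
            }
            while (rs.next()) {
                Object[] row = new Object[meta.getColumnCount()];
                for (int i = 0; i < row.length; i++) {
                    row[i] = rs.getObject(i + 1);
                }
                model.addRow(row);
            }
        }

        JFrame frame = new JFrame("Table A");
        frame.add(new JScrollPane(new JTable(model)));
        frame.setSize(600, 400);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }
}
```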
I think the GUI should be developed before any back-end development starts. There are a couple of reasons to do this:
You gain clarity on how model objects should interact.
Usability places a lot of constraints on the way you will want to pull data. You will probably want to develop and architect only after you're 100% sure what those constraints are.
From a business point of view, managers like to have a dumb but functioning UI before any real development starts. Many times, the feedback leads to major changes in back-end assumptions, which is a lot less painful than getting a change request after the back-end development is over.
My personal experience is that simultaneous development of the GUI and back-end gets a bit messy. Plus, the GUI provides a solid expectation of behavior from the back-end. Moreover, this approach makes sure all the developers, your client, and your manager are on the same page.
I agree with Joel Spolsky that it is a great idea to write a functional spec before writing code. Part of that spec should include a collection of screen mockups. #O.D. is right, Balsamiq is a great tool. It has saved me a lot of time in the past.
Once you have a functional spec in place that the business users are happy with, you will then have a better idea of how to design your system to meet the requirements, e.g. whether high performance is a requirement, domain model vs. simple CRUD, etc.
Then you should start by taking a single use case and building a vertical slice of your application. Build a GUI, service layer, persistence layer, database schema in one iteration. This will hopefully point out any problems with your design and give you the chance to modify it before you start building out the horizontal functionality.
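For illustration, one such vertical slice might look roughly like this (all class names, the table, and the SQL are invented for the example; the GUI layer would sit on top of the service):

```java
// Minimal sketch of one vertical slice for a single use case ("list customers").
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

// Domain object
class Customer {
    final long id;
    final String name;
    Customer(long id, String name) { this.id = id; this.name = name; }
}

// Persistence layer: the only place that knows about SQL.
interface CustomerDao {
    List<Customer> findAll() throws Exception;
}

class JdbcCustomerDao implements CustomerDao {
    private final Connection connection;
    JdbcCustomerDao(Connection connection) { this.connection = connection; }

    public List<Customer> findAll() throws Exception {
        List<Customer> result = new ArrayList<>();
        try (Statement st = connection.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, name FROM customer")) {
            while (rs.next()) {
                result.add(new Customer(rs.getLong("id"), rs.getString("name")));
            }
        }
        return result;
    }
}

// Service layer: business rules live here; the GUI only talks to this.
class CustomerService {
    private final CustomerDao dao;
    CustomerService(CustomerDao dao) { this.dao = dao; }

    List<Customer> listCustomers() throws Exception {
        return dao.findAll();
    }
}
```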
I'd say yes and no.
No, because you should design your application to be modularized enough that your logic and data do not depend on the UI design.
Yes, because it is always smart to design everything before you actually start implementing it.
So what I mean is that you should make a concept, but not let your UI concept 'tie your hands' when you implement your logic. That way, if your managers or clients don't like your conceptual UI, you can always change it without actually changing your application logic.
Well, showing your GUI before starting to program is a very good idea, especially since you enable the end user (customer) to check whether the UI is up to their expectations, which can save you lots of time.
In order to do that you don't necessarily need to develop a "real" prototype; you can use programs which let you quickly design the UI of your app, including a minimal workflow simulation instead of full functionality.
I had a very good experience with Balsamiq and can really recommend it.
Writing a spec before your code is always a good idea, because it makes you think. But most specs I have seen are not that good. And if the spec is too technical, users will in the end sign off on your spec without really understanding what they are going to get.
I have seen best results when either presenting the User Manual to the client, or by discussing mockups of the system one scenario at a time.
Note that half-baked mockups won't do the trick. You need your mockups to be fully populated with relevant data (Ever tried to discuss some screens with accounting while the numbers on the screen don't match? There's no way at all you could explain to them these are only dummy numbers...)
And the caveat of using mockups is that users will more often than not believe the app is "almost finished", whatever you do or say. It must be some subconscious thing, I'm not sure. To avoid that, most of the specialized tools have either a "black & white" look and feel only, or multiple skins you can switch between.
There is a pretty complete list of mockup tools here. Many of them are free:
http://c2.com/cgi/wiki?GuiPrototypingTools
My own tool is pretty popular: http://MockupScreens.com. I created it a long time ago, exactly because of my own frustration with the above-mentioned problems.
Obviously, it's easier to do with some developers, but I'm sure many of us are on teams that prefer the status quo.
You know the type. You see some benefit in a piece of new technology and they prefer the tried and true methods.
Try, for example, convincing a DBA or C# programmer of the advantages of using LINQ (not necessarily LINQ to SQL, just LINQ in general).
For example, when a project requirement is to be cross-platform... instead of thinking about how one can run Windows on a Mac through a VM, introduce the idea of using the relatively new Silverlight, or of creating it in Java (as an option to look into).
I know most people don't like to be out of their comfort level, so it takes a bit of convincing, and not ALL new technology makes business sense... but how have you convinced your team to look at a new technology?
What technologies have you successfully introduced to your workplace?
What technologies do you think are hardest to introduce? ( I'm thinking paradigm-shifting ones, like MVC from WebForms... or new languages )
What strategies do you employ to make these new technologies appealing?
Know the technology well before pitching it. You're going to get questions like "but how can we make it do X?", and you want to be able to give at least a general answer.
Try not to be a religious zealot. Acknowledging that the new technology is not perfect, that it's just another tool in the toolbox, goes a long way towards credibility.
Give a well-prepared live demo to show what it can do. For example, a friend of mine built a simple blog in Ruby on Rails in half an hour, in front of a live audience. I want to stress the word "well-prepared"; if things keep breaking along the way, or you don't fully understand what you're doing, or you are unable to answer basic questions, you'll hurt your cause rather than help it.
When it comes to coding practices my favorite is to quite simply use examples. I will take a few hours and edit our code base to use the new technique in place of the previous pattern. Then send a shelveset or changelist around to the rest of the developer list displaying the difference. Or just have a meeting to talk about the difference.
Showing examples in real production code really helps other developers see the advantages.
I've successfully introduced LINQ to my company and it has helped quite a bit.
What worked for me? Show and tell. Our previous technology was database programming with C, which is quite the mess. Our lead developer did about 3000 lines of code to fill a dataset, and I did it in a 10th of that with LINQ/C#.
Once I broke down what I did and he saw how powerful it was, he was convinced it was time to upgrade.
The advice from people who convinced the management to consider using F# goes something like this:
Implement the most important bits of the next key project of the company in F# in your free time and then show others what benefits it has, how quickly you were able to implement it and how easy it is to adapt the solution to changing requirements.
I think this is quite an effective way: when people actually see the productivity (of any new technology), it is much easier to convince them that it is worth learning.
It's best to lead by example. Complete a successful project using the new tool and wait for developers to ask how you did it.
I managed to convince the team I'm part of to switch from CVS to Mercurial. Can you believe we were still using CVS? Neither could I when I started.
I became almost somewhat of a preacher, a royal pain in the butt. Every time CVS screwed up or caused some sort of discomfort (being brutally slow, for example), I held a little speech about how much better it could be.
Soon enough they accepted the possibility that there were alternatives (none of them actually knew there were alternatives to CVS!) and began to say things like "if there indeed are alternatives, anything must be better than this".
That's when I made my move: I simply ran some scripts converting the CVS repository into a Mercurial one and uploaded it to the company server. Once they saw it in action, they were sold.
Not that I planned anything during this little migration black ops, but in retrospect I would give the following advice to anyone attempting something similar:
Let people know there are (better) alternatives, it is entirely possible to work outside your comfort zone.
Lead by example, if you want something done, do it yourself. Show the alternative in action. No one is going to make the jump unless you jump first, especially not if they are already hesitating.
Show them how it solves a common problem. Pick out some problem that appears regularly for them and show them the solution. That will usually at least get them thinking about it.
Stand the two technologies next to one another; it's safe to assume that advancements have been made and that what you are bringing to the table is better suited to the job at hand.
Put the raw results in front of them and let them decide for themselves!
I work for a data bureau, and until recently the company was hooked on MS Access, which was cumbersome and unfit for the job. After some serious convincing, and showing the power of SQL in comparison to Access, it's now the weapon of choice.
And it took standing the two technologies side by side and allowing the guys up top to see for themselves; the time saved did make business sense!
You'll need to show why it's a BETTER technology (or at least better at something) than the current tool/method in use, and probably significantly so. Otherwise, why go through the effort of learning something new?
Otherwise, convince the boss and then get a mandate ... (though I don't really recommend that if you can't get at least half the team on board).
My question is broader than merely expanding language specific skills. I'm wondering what sort of advice anyone might offer to someone with a desire for knowledge, but a lack of direction. Feel free to skip down to the question.
Background: I've taken a single computer science course in Java at my university, but I'm planning on majoring in Computer Science and Electrical and Computer Engineering (ECE). My Java class was quite rudimentary. We were provided with a framework and merely edited/created a few methods and classes to perform a simple task. We made a version of Breakout, and created an AI for a simple Connect Four game. Although I'm somewhat familiar with big O notation, I haven't actually studied it in class (my next CS class covers this). In my introductory ECE course we also programmed BOE-Bots in PBASIC to compete in an obstacle course of sorts. As an engineering student, I was also introduced to MATLAB, although I've only used it for linear algebra homework. In summary, I don't have much programming background, but I've loved pretty much everything I've done so far, and I'm looking for ways to build a more valuable skill set.
Steps Taken: After paying more attention to the programming section of reddit, I found a link to Project Euler. This summer I've been hacking away at those problems (finished my 42nd solution yesterday), but I'm looking for some more direction. I've been reading blog entries and SO heavily. I'd like to learn something about php, and perhaps create a dynamic webpage, but fundamentally I want to do anything in my power do to improve myself and prepare for the working world.
Question: What direction would you recommend for me? Should I learn a new language? Keep attacking Project Euler? Read some books on programming? Start a coding project (I wouldn't even know where to begin)? Wait until school? Learn about more fundamental programming principles?
I guess with all the paths available, I'm just a little overwhelmed, and I don't want to fall into a path that might be detrimental to my career opportunities. I'm also really bad at making decisions. Your advice would be greatly appreciated!
The word quickly worries me... I suggest you read Teach Yourself Programming in Ten Years - Why is everyone in such a rush?
~~ Peter Norvig
Forgive yourself if you're not setting the world on fire three weeks after sitting down at the keyboard... maybe you're destined to be a late bloomer?
;-)
Most everyone is suggesting doing more programming. Whilst that's obviously important, I think it's important to understand the environments that your programs work in.
I would investigate how operating systems work (how they allocate resources for you, how they schedule programs and threads), how networks work (routing, TCP/UDP behaviours and limitations, etc.), and some database theory.
I see a lot of junior programmers with great language knowledge, but beyond the sandbox of their program they don't have a clue how the computer/network enables their stuff to run.
Knowing something of the above will make you a better programmer in terms of how you write your programs, and how you understand how they'll work (and indeed, how to debug them or analyse their failures).
There is only one thing that can improve your skills as a programmer: Program.
Reading books about programming is akin to reading books about cycling. It's not going to turn you into a cyclist.
Program something that you'll use and have a vested interest in. Then just put your head down and do it - reading any supporting information as you go.
Programming skills are just a small part of what makes a successful programmer, IMHO. Being able to read specifications, ask questions and interact with other people to extract the information you need to program are very important too. Pay attention to your 'soft' skills, they will help you a lot in being a successful programmer in a commercial setting.
I'd recommend you take a multi-pronged approach. Reading books can only take you so far, since they don't provide you with practical experience in developing an application from scratch. Programming is about much more than writing code; you have to be able to design applications, read documentation, and know how to solve the needs of a customer who doesn't know how to tell you what to write.
If you're still in college, try to find an internship with a development firm. You'll get access to people and resources who already have this experience. If you for some reason can't do an internship, find and make a friend who is a developer. The impact of having someone who knows what they're doing to bounce ideas off of is immeasurable.
In addition to surrounding yourself with people with knowledge, you should also take on writing an application all your own. Pick something that interests you, but at the same time try not to make it too complicated; you don't want to frustrate yourself by writing a boring application or choosing something that has too steep a learning curve. Remember that the application is, at its core, a tool to assist you in learning, so it's OK if it turns out to be less complex than you imagined, or if the code you're practicing doesn't apply within the scope of what you're trying to write.
You definitely need to write code in order to become a better programmer. If you don't have a particular idea for a program to write, pick an open-source program (ideally one that you use), contact them and find out what they need done; contribute something. Fix a bug, implement a feature... start small and work your way up. This will help you learn about working on a team, writing maintainable code, preventing problems, and working on useful tasks. Much of what people write while learning to program is just exercise code, and not actually useful or representative of anything realistic. That is why a real, useful program is the best thing to do.
I saw an interesting link for a site called CodeKata that might help you. The important thing to get your skills going quickly is to actually code for practical problems that you will be likely to encounter in any real job.
Besides picking an interesting project and implementing it from scratch, reading others' code can also be valuable in improving your skills. Scott Hanselman has a good series on reading code: Weekly Source Code
Since you already know the fundamentals of programming, and obviously have the desire to learn I would start a pet project.
Nothing has forced me to learn great deals about a technology, language or idea more than having a fairly challenging pet project to throw my self into.
I would recommend, like rein mentioned, picking a project you have a true interest in. Otherwise it will get boring pretty fast and you'll have no idea why you are working on it. I think it would be wise to make the project fairly non-trivial. I've always found that the less trivial a project I was working on, the more I learned and took away from it.
Also If you live near a college/university research labs on campus are pretty much always looking for summer interns to work on code in one shape or form. Some will even pay you. I did this for 4-5 years through high school and some university and it was a great learning experience.
Obviously participating in an open source project is a great way to gain experience at working with other developers. Check out stuff on github, sourceforge, bitbucket.
I'd only caution you to ask for help from and bother the core developers of a project only if you are very interested in it and are going to stick with it. Generally, open source developers dedicated to their project are few and far between. Their time is really valuable to the community, and wasting it on someone who is just going to fly by night and never be seen again is kind of a shame.
It sounds like you are writing programs happily, and are a decent java programmer. So writing more might not be your most productive way of learning new things.
As a result, I'd say learn a new language - knowing more than one always exposes you to new concepts. As a Java programmer I'd suggest C/C++: they're immensely popular, but they fill areas that the 'managed high-level' languages don't reach, the low-level, high-performance, to-the-metal applications. If you learn C# (for example) you'll find you're just doing a different dialect of Java (roughly) and you won't learn too much.
If you don't like the idea of C/C++, get into a scripting language instead - bash, VBScript/WMI, Perl, or Python. They'll be sufficiently different to teach you some good stuff.
Or do both!
You already know how to program; in order to prepare yourself for a real life job, the most important thing is probably not how to program better, but to learn the things you'll have to do that aren't programming.
In your workplace, you most likely won't be asked to write "fire and forget" programs, but to work in a team on large applications. In order to do that effectively, you need to learn:
How to communicate effectively with analysts, customers, managers and your fellow programmers
How to plan and coordinate work: using source control systems and bug tracking systems, working with written specifications, following a development process - "software engineering"
Perhaps most importantly: get used to working with other people's code - third party libraries, frameworks, and of course colleagues' code.
This, too, can most effectively be learned through hands-on experience in a real project. For me, the big eye-opener was working in a group of about 10 students in an experimental project organized by university with some industry support. Perhaps you can find or organize something similar at your university? If it has a chair dealing with software engineering, they should be happy to do something like that if they aren't already. Failing that, joining an open source project is good too.
Write programs to automate things that you do manually on a daily basis.
Like, todo lists, keeping accounts, checking RSS feeds.. the list could be endless. If you do some business on the side, write some simple code to assist you in that.
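For example, a tiny first automation could be something like this (the feed URL is just a placeholder): fetch an RSS feed and print the item titles.

```java
// Sketch: print the <title> of every <item> in an RSS feed.
import java.net.URL;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class FeedTitles {
    public static void main(String[] args) throws Exception {
        URL feed = new URL("https://example.com/feed.rss"); // placeholder URL
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(feed.openStream());

        // RSS items are <item> elements with a <title> child.
        NodeList items = doc.getElementsByTagName("item");
        for (int i = 0; i < items.getLength(); i++) {
            NodeList children = items.item(i).getChildNodes();
            for (int j = 0; j < children.getLength(); j++) {
                if ("title".equals(children.item(j).getNodeName())) {
                    System.out.println(children.item(j).getTextContent());
                }
            }
        }
    }
}
```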
The essence is: to improve your programming, do programming! :D
Contributing to open source projects has a surprisingly high impact on your ability to develop software, as opposed to the ability of writing code:
you get to work on code written by others, which is sometimes brilliant and sometimes abysmal;
collaborate within a geographically distributed team;
get introduced to team politics ( yay! );
understand what it means to have actual users.
to quote just a few. Some good places to look:
Codehaus
Sourceforge
Google code
Find out about best practices (for example, the training videos on windowsclient.net). Pick a problem and solve it using a database as input and a database as output. Start over several times. That's as close as you can get to a real-world scenario. Read the books recommended by Joel.
Here's my multi-prong suggestion:
1) Big picture work - Look at various software development methodologies and see how, when you program, you follow one of these approaches. You did some Java work; was that a web application, Windows application, console application, or something else? Learning a little about the other types may also prove useful.
2) Medium picture work - Have you looked at Refactoring and Design Patterns? These can be very useful but may require a bit more coding experience to see how they can be useful.
3) Small picture work - Study various algorithms and understand the trade-offs one can make in choosing different implementations of a data structure, e.g. linked lists vs. arrays. There is a big white book about algorithms that some use when studying them.
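As a quick, hedged illustration of that kind of trade-off (not a rigorous benchmark; timings will vary by machine and JVM):

```java
// ArrayList gives O(1) access by index but O(n) inserts at the front;
// LinkedList is the reverse.
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class ListTradeoffs {
    public static void main(String[] args) {
        int n = 100_000;
        List<Integer> arrayList = new ArrayList<>();
        List<Integer> linkedList = new LinkedList<>();

        long t = System.nanoTime();
        for (int i = 0; i < n; i++) arrayList.add(0, i);   // shifts every element each time
        System.out.println("ArrayList  add-at-front: " + (System.nanoTime() - t) / 1_000_000 + " ms");

        t = System.nanoTime();
        for (int i = 0; i < n; i++) linkedList.add(0, i);  // just relinks the head
        System.out.println("LinkedList add-at-front: " + (System.nanoTime() - t) / 1_000_000 + " ms");

        t = System.nanoTime();
        long sum = 0;
        for (int i = 0; i < n; i++) sum += arrayList.get(i);   // direct index into the backing array
        System.out.println("ArrayList  get-by-index: " + (System.nanoTime() - t) / 1_000_000 + " ms, sum=" + sum);

        t = System.nanoTime();
        sum = 0;
        for (int i = 0; i < n; i++) sum += linkedList.get(i);  // walks the chain on every call
        System.out.println("LinkedList get-by-index: " + (System.nanoTime() - t) / 1_000_000 + " ms, sum=" + sum);
    }
}
```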
My multi-part advice:
Keep training and writing code. Participate in a small open source project.
Read the standard books (here is Jeff's list).
Learn from your mistakes, or better, from the mistakes of others, by reading sites like www.badprogramming.com.
From 99 ways to become a better developer:
Program! And try to diversify your work as much as possible.
Find a mentor
Be a mentor
Learn about the different aspects of software development other than code (the business of software, etc.)
It's been mentioned to me that I'll be the sole developer behind a large new system. Among other things I'll be designing a UI and database schema.
I'm sure I'll receive some guidance, but I'd like to be able to knock their socks off. What can I do in the meantime to prepare, and what will I need to keep in mind when I sit down at my computer with the spec?
A few things to keep in mind: I'm a college student at my first real programming job. I'll be using Java. We already have SCM set up with automated testing, etc...so tools are not an issue.
Do you know much about OOP? If so, look into Spring and Hibernate to keep your implementation clean and orthogonal. If you get that, you should find TDD a good way to keep your design compact and lean, especially since you have "automated testing" up and running.
UPDATE:
Looking at the first slew of answers, I couldn't disagree more. Particularly in the Java space, you should find plenty of mentors/resources on working out your application with objects, not a database-centric approach. Database design is typically the first step for Microsoft folks (which I do daily, but am in a recovery program, er, Alt.Net). If you keep the focus on what you need to deliver to a customer and let your ORM figure out how to persist your objects, your design should be better.
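For instance, in the Java space the object-first style usually means writing a plain domain class and letting Hibernate/JPA derive the table from it. A minimal sketch (the entity name and fields are invented for illustration):

```java
// Sketch of the object-first approach: model the domain class and let the ORM
// work out the persistence details.
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class Customer {
    @Id
    @GeneratedValue
    private Long id;

    private String name;
    private String email;

    protected Customer() { }          // JPA requires a no-arg constructor

    public Customer(String name, String email) {
        this.name = name;
        this.email = email;
    }

    public Long getId() { return id; }
    public String getName() { return name; }
    public String getEmail() { return email; }
}
```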
This sounds very much like my first job. Straight out of university, I was asked to design the database and business logic layer, while other people would take care of the UI. Meanwhile the boss was looking over my shoulder, unwilling to let go of what used to be his baby and was now mine, and poking his finger in it. Three years later, developers were fleeing the company and we were still X months away from actually selling anything.
The big mistake was in being too ambitious. If this is your first job, you will make mistakes and you will need to change how things work long after you've written them. We had all sorts of features that made the system more complicated than it needed to be, both on the database level and in the API that it presented to other developers. In the end, the whole thing was just far too complicated to support all at once and just died.
So my advice:
If you're not sure about taking on such a big job single-handed, don't. Tell your employers, and get them to find or hire somebody for you to work with who can help you out. If people need to be added to the project, then it should be done near the start rather than after stuff starts going wrong.
Think very carefully about what the product is for, and boil it down to the simplest set of requirements you can think of. If the people giving you the spec aren't technical, try to see past what they've written to what will actually work and make money. Talk to customers and salespeople, and understand the market.
There's no shame in admitting you're wrong. If it turns out that the entire system needs to be rewritten, because you made some mistake in your first version, then it's better to admit this as soon as possible so you can get to it. Correspondingly, don't try to make an architecture that can anticipate every possible contingency in your first version, because you don't know what every contingency is and will just get it wrong. Write once with an eye to throwing away and starting again - you may not have to, the first version may be fine, but admit it if you do.
I also disagree about starting with the database. The DB is simply an artifact of how your business objects are persisted. I don't know of an equivalent in Java, but .NET has stellar tools such as SubSonic that allow your DB design to stay fluid as you iterate on your business object design. I'd say first and foremost (even before deciding on what technologies to introduce) focus on the process and identify your nouns and verbs ... then build out from those abstractions. Hey, it really does work in the "real world", just like OOP 101 taught you!
Before you start coding, plan out your database schema - everything else will flow from that. Getting the database reasonably correct early on will save you time and headaches later.
The main thing is being able to abstract the complexity of the system so that you don't get bogged down by it as soon as you start off.
First read the spec like a story (skimming through it). Don't stop at every requirement to analyze it right there and then. This will allow you to get an overall picture of the system without too many details. At this point you would start identifying the major functional components of the system. Start putting these down (use a mindmap tool if you like).
Then take each component and start exploding it (and tying each detail with requirements in the spec document). Do this for all components, till you have covered all requirements.
Now, you should start looking at relationships between the components, and whether there are repetitions of features or functions across the various components (which you can then pull out to create utility components, or such). Around now, you would have a good detailed map of your requirements in your mind.
NOW, you should think of designing the database, ER diagrams, Class Design, DFDs, deployment, etc.
The problem with doing the last step first is that you can get bogged down in the complexity of your system without really gaining an overall understanding in the first place.
I do it the other way around. I find that doing it database-schema-first gets the system stuck in a data-driven-design that is difficult to abstract from persistence. We try to do domain model designs first and then base the database schema on those.
And then there's the infrastructure design: the team should settle on conventions on how to structure the program first and foremost. And then we work together to agree first on a design for the common functionality of the system (e.g., things everyone needs like persistence, logging, etc.). This becomes the framework of the system.
We all work on that together first before we split the rest of the functionalities amongst ourselves.
It has been my experience that Java applications (.NET also) that consider the database last are highly likely to perform poorly when placed into a corporate environment. You need to really think about your audience. You didn't say if it was a web app or not. Either way the infrastructure that you are implementing on is important when considering how you handle your data.
No matter what methodology you consider, how you get and save your data and its impact on performance should be right up there as one of your #1 priorities.
I'd suggest thinking about how this application will be used. How will future users work with it? I'm sure you know at least a few things about what this application needs to handle, but my first advice is "think of the user and what he or she needs".
Draw it up on plain paper, thinking about where to section off the code. Remember not to mix logic with GUI code (a common error). This way you will be set to extend your application's reach in the future to servlets and/or applets or whatever platform comes along. Section it into layers so that you can respond to large changes faster without rebuilding everything. Layers should not see any layers other than their closest neighbouring layers.
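A tiny sketch of that separation (names invented): the logic class has no Swing imports at all, so it could later be reused behind a servlet or any other front end.

```java
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;

// Pure logic layer - no GUI imports here.
class Counter {
    private int value = 0;
    int increment() { return ++value; }
}

// Thin GUI layer that only displays state and forwards user actions.
public class CounterFrame {
    public static void main(String[] args) {
        Counter counter = new Counter();

        JFrame frame = new JFrame("Counter");
        JLabel label = new JLabel("0");
        JButton button = new JButton("Increment");
        button.addActionListener(e -> label.setText(String.valueOf(counter.increment())));

        frame.setLayout(new java.awt.FlowLayout());
        frame.add(label);
        frame.add(button);
        frame.pack();
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }
}
```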
Begin with the true core functionality. All that time-consuming fluff (that will make your project 4 weeks late) won't matter much to the vast majority of users. It can be added later once you are sure you can deliver on time.
Btw, even though this has nothing to do with design, I'd just like to say that you won't deliver on time. Make a realistic estimate of time consumption and then double it :-) I assume here that you will not be alone in this project and that people will come and go as the project progresses. You may need to train people midway through the project; people go on holiday / need surgery, etc.
Split the big system into smaller pieces.
And don't think of it as so complex, because it usually isn't. Thinking of it as too complex just ruins your thoughts and eventually the design. At some point you just realize that you could do the same thing more easily, and then you redesign it.
At least, this has been my major mistake in designing.
Keep it simple!
I found very insightful ideas about starting a new large project, based on
common good practices
Test Driven Development
and a pragmatic approach
in the book Growing Object-Oriented Software, Guided by Tests.
It is still under development, but the first 3 chapters may be what you are looking for and are IMHO worth reading.
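To give a flavour of the test-first style the book advocates, here is a minimal, invented example (JUnit 4; the class and the pricing rule are made up): write the failing test first, then add just enough production code to make it pass.

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class PriceCalculatorTest {

    @Test
    public void appliesTenPercentDiscountToLargeOrders() {
        PriceCalculator calculator = new PriceCalculator();
        assertEquals(180.0, calculator.total(200.0), 0.001);
    }

    @Test
    public void leavesSmallOrdersAlone() {
        PriceCalculator calculator = new PriceCalculator();
        assertEquals(50.0, calculator.total(50.0), 0.001);
    }
}

// The production code grows only as far as the tests demand.
class PriceCalculator {
    double total(double amount) {
        return amount >= 100.0 ? amount * 0.9 : amount;
    }
}
```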