Input data errors matter to fitting because the individual measurements are normally not all of the same quality, so they should not all carry the same weight in determining the result. That is one major reason for dividing the differences between data and function by the input errors, a procedure known as 'weighting', when computing chisquared.
With weighting, deviations from your function at points where the data have large errors contribute less to chisquared, because the division shrinks them relative to the better measurements. A second reason for the division is that, mathematically, chisquared has to be a dimensionless quantity: it should be something like '15.3', not '15.3 square seconds'.
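As a minimal sketch (the file name 'data.dat', its column layout, and the fit function are only assumptions), per-point y-errors are supplied as a third column of the using specification:

      f(x) = a*x + b
      # columns: 1 = x, 2 = y, 3 = y-error; depending on your gnuplot
      # version you may have to add the keyword 'yerrors' after the
      # using spec to mark the third column explicitly
      fit f(x) 'data.dat' using 1:2:3 via a, b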
If no input data errors are given, all points are weighted equally, and the resulting parameter errors have little real meaning. You should therefore always try to supply a sensible set of y-errors for your data. An important example is data representing a histogram: there, the square root of the y value (the count in each bin) is often the appropriate input error.
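For example (the file name and binning are hypothetical), the square root can be computed on the fly in the using specification:

      # histogram.dat: column 1 = bin centre, column 2 = count in that bin
      # use the square root of the count as the statistical error per bin
      fit f(x) 'histogram.dat' using 1:2:(sqrt($2)) via a, b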
Once the fit iteration has stopped, fit displays a wealth of information which you have to learn to interpret before you can use it. The 'sum of squares of residuals', called 'chisquared' for short, is a measure of the overall distance between the data and your fit function; this is what fit tries to minimize. As a quick test of whether the fit went well, check that chisquared is roughly equal to the number of data points minus the number of fitted parameters, i.e. the number of degrees of freedom. (This check is only valid if you supplied y-errors and the number of data points is large enough.) For details, look up the 'chi-squared distribution' in your favourite statistics textbook.
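If your gnuplot version provides the fit result variables FIT_WSSR and FIT_NDF (recent versions do), this quick check can be done directly on the command line; the following lines are only a sketch:

      # after the fit has converged:
      print "chisquared         : ", FIT_WSSR
      print "degrees of freedom : ", FIT_NDF
      # the ratio should be roughly 1 for a good fit with correct y-errors
      print "reduced chisquared : ", FIT_WSSR / FIT_NDF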
If chisquared is much larger than that, your function did not fit the data well. Try another, more general function, or allow fit to adjust more of the parameters. Another possible cause is that the y-errors you supplied were too optimistic, i.e. too small.
If, on the other hand, chisquared is much smaller than that, the function fits the data suspiciously well. Either the given y-errors were too large, or the function is too general. Try to restrict it, either by fixing some parameters or by making it simpler in some other way.
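One simple way to fix a parameter, assuming the same hypothetical function and data file as above, is to set its value explicitly and leave it out of the via list:

      # keep b fixed at 1.0 and let fit adjust only a
      b = 1.0
      fit f(x) 'data.dat' using 1:2:3 via a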
If all went well, you will see a list of the resulting parameter values, together with error estimates for these values. Always keep in mind that they are _estimates_, nothing more. You will have to gain some experience with both fit and the kind of problems you usually apply it to before you can use these errors for anything serious. To start with, the errors reported by fit are insensitive to the global scale of the y-errors: if you multiply all y-errors by a constant, the resulting parameter errors do not change.
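A quick way to convince yourself of this, again using the hypothetical 'data.dat' from above, is to rescale the error column on the fly and compare the reported parameter errors with those of the original fit:

      # all y-errors multiplied by 10: the fitted parameters and the
      # reported parameter error estimates come out the same as before
      fit f(x) 'data.dat' using 1:2:($3*10) via a, b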
And, to repeat this once more: if you didn't supply y-errors, the parameter errors will normally be meaningless.