ArticleS. MicahMartin.
AyeCarumba

IList, IComparable, ICarumba!


This IConvention is an interesting one. "Let's prefix all interfaces with a capital I." Who thought of this? Why did they think it was a good idea?

Let's say I want to implement the Command pattern in C#. I create the following interface... and, to be true to the fathers of C#, I use the IConvention.
public interface ICommand
{
    void Execute();
}


In my application I can now write code like this.

public void ExecuteCommands(IList commands)
{
    foreach (ICommand command in commands)
        command.Execute();
}


Clearly, every object in the IList of commands is an Implementation of ICommand. And ICommand is an Interface because It starts with an I. That's Important to know because If It didn't have the I, It might be an abstract class or even a concrete class. And If that were the case then..... well.... hmmm.... it doesn't really matter. I could call the class ICommand or I could call it Command. From the point of view of the client code, maybe Command is an interface or maybe it's not. So if it's all the same, I might as well submit to the convention and call it ICommand. Right?

Wrong! Prefixing interfaces with I is a mistake and this is why....

Assume that I really do need a Command abstraction. Should it be an interface or an abstract class? I can't think of any logic to put in the base class, and the Dependency Inversion Principle says interfaces are preferred. So I'll make it an interface and follow the IConvention, naming it ICommand. After a while there are a dozen or so implementations of ICommand in the application. New implementations are popping up all the time. One day I realize that the application needs to know whether an ICommand has executed or not. Since this affects all implementations of ICommand, I can add another member to the interface:
public interface ICommand
{
    void Execute();
    bool Executed { get; }
}

However, it's not long before I realize that all the derived classes implement the Executed property with the exact same code. They also need a boolean field. So to avoid duplicate code, I'll use Template Method like so:
public abstract class ICommand
{
    private bool hasExecuted = false;

    public virtual void Execute()
    {
        PerformExecution();
        hasExecuted = true;
    }

    protected abstract void PerformExecution();

    public bool Executed
    {
        get { return hasExecuted; }
    }
}

But wait! ICommand is no longer an interface. I can't just leave that I sitting there. Clients will think it's still an interface. So now I need to rename the class to Command without the I. Fortunately, ReSharper will help me there. But now I've got the Command class in a file named ICommand.cs. Renaming the file is a bit more challenging since it has to be changed in source control as well. So I rename the file in the Subversion repository, then remove the ICommand.cs file from the Visual Studio project, and finally add the new Command.cs file to the project... *whew*. That silly I sure causes a good deal of hurt.

I've stumbled over this scenario more times than I care to mention. Experience has taught me that prefixing interfaces with an I is a choice that will come back to haunt me. So I don't do it.

Here's my dilemma. Maybe you can help me. I've been translating Unclebob's Agile Software Development book into C#. In the code examples I have heeded my experience and created interfaces without the I prefix. Reviewers don't like this. They keep telling me that I need to add the I prefix. I'm torn. I feel in my heart of hearts that using the IConvention does more harm than good. Yet, readers of the book will be familiar with the IConvention and may be confused by examples that don't use it. What's the right thing to do?




 Tue, 18 Oct 2005 12:04:30, Monica, The beauty of word processors
Go without the 'I' and tell your readers/reviewers to embrace change.

As a compromise, have your code examples highlight the interface with a format option that clearly communicates it's the interface you're referring to. How often do you see the 'I' anyway? Only on declarations/assignments, correct? IMO what matters is that the context the interface is used in communicates the reason and logic for using an abstracted or interfaced element.
 Mon, 17 Oct 2005 16:37:14, Runar, IConvention and Hungarian Notation are Evil
Micah, sometimes you must compromise, but never on principle. Your gut feeling and your experience are telling you the right thing. Our guide here is a general form of Codd's Information Principle:

"Data should be stored in relations and in no other way." More generally: Encode information in semantics, not in names.

To clarify: the indication that something is an interface should be codified as the relation "is an" between "something" and "Interface". Calling it ISomething only causes you to have to maintain this relation in two places (which, incidentally, violates the OnceAndOnlyOnce principle). And maintenance is more difficult because these semantics cannot be discovered without parsing the type names. Besides, this parser breaks the moment somebody creates a class that violates naming conventions.
 Mon, 17 Oct 2005 14:11:45, Phlip, reviewers
There's always a difference between what reviewers say and what a thought leader (or us;) should do about it. ;-)
 Sat, 8 Oct 2005 07:53:37, Jed, Hungarian Notation
I've always felt that Hungarian notation was nothing more than a "helper" for those languages with loose typing (e.g. Visual Basic, C/C++, and so forth) to help us work around the limitations of the language to prevent subtle (and not so subtle) conversion errors. When using a strongly typed language like C#, Hungarian notation serves no useful purpose. The only instance I see Hungarian notation used in C# is as a visual cue that pure abstract classes (interfaces) support multiple inheritance.

I realize this is a recursive argument, but I go back to my original assertion that the only reason Hungarian notation is used here is to work around the limitations of multiple inheritance. As a potential point of contention, how many implementation classes that derive from IEnumerable have nearly duplicate code that iterates through a simple array? I tend to agree that multiple inheritance is rarely needed, but the only choice currently available is to duplicate the code with interfaces in those instances where it is needed.

However, I think the more important argument is that the I prefix is so entrenched at this point that breaking away from "tradition" is likely to cause more confusion. I feel your pain, but I think the real argument is that Visual Studio .NET and Subversion do not play nicely together yet. In other words, the more appropriate answer is better integration between Visual Studio .NET and Subversion to support "painless" renaming of files - whether that is a new version of Ankh that isn't quite so "broken" or a replacement product that works as well as Tortoise within Visual Studio .NET.
 Sun, 25 Sep 2005 09:02:16, Barry, It's just old IDEs vs. new IDEs
Most modern IDEs (Eclipse, VS.NET) will assist you (class browser, syntax checking) and tell you whether or not you're using an interface or a class. Older IDEs did not. Hence the naming conventions we have now (Hungarian et al.). However given the size of C# libraries and intended audience, it is probably better to stick with convention.
 Wed, 21 Sep 2005 10:44:11, Tom Rossen, Syntax vs. convention
> Using or not using IConvention is just a convention.
> When speaking English I (try to :) ) use articles.
> When writing C# programs I use IConvention.

Let's not confuse C# with a natural language. Articles are part of the syntax of a natural language. Hungarian notation is a convention, one that moves code further away from natural language and therefore from readability.

BTW, "Hungarian notation" strikes me as an ethnic slur. How about "kitchenSinkNotation"?
 Fri, 16 Sep 2005 19:27:12, Alex, AConvention in English.
Answering uncleBob regarding nNouns etc.
There are some languages out there that do not use articles.
Your suggestion
>Let's prefix every noun with an (n), and every nadjective with an n(j), and jevery
>nadverb with a n(y), and jevery nverb with a (v).
could look funny to them, just because of the repeated articles a and an, which do not add a single bit of extra information to the message.

However it seems to look natural to you.

Using or not using IConvention is just a convention.
When speaking English I (try to :) ) use articles.
When writing C# programs I use IConvention.

Anything that lets people better understand each other is OK.
Sacred wars don't.
 Tue, 13 Sep 2005 22:05:58, Chaz Haws,
> The "interface" is a kludge.

>> It's not that we "need some multiple inheritance" - what we need is to combine two previously independent sets of behaviors.

Of course you're right, vendors and standards should definitely go with pure abstractions. Doesn't mean that's appropriate for my code. Remember, I'm not questioning the presence of interfaces, I'm questioning the absence of full implementation multiple inheritance.

I've got nothing against an "interface". But I'd like them a lot better if I could migrate them to abstract base classes without breaking multiple inheritance. That's often critical to reduce duplication.

If I could do that, arguably there would be no perceived need to visually distinguish them with the I. And suddenly we're back on topic!
 Tue, 13 Sep 2005 12:41:12, ,
> The "interface" is a kludge.

Whoa! I resemble that remark. An interface is a pure contract/specification. Its value may be more obvious in Java than in C#, because Sun encourages multiple vendors to provide implementations. An abstract class is a pragmatic partial generalization. It's not that we "need some multiple inheritance" - what we need is to combine two previously independent sets of behaviors.

> I know how to build ugly structures with interfaces and composition around the lack of something basic like inheritance, but the problem is I shouldn't need to.

They don't need to be ugly. The same desired behaviors and properties can be encapsulated in the same number of classes - they just won't be base classes.
 Mon, 12 Sep 2005 17:48:40, Chaz Haws, RE: Single-inheritance rant
Tom, my own take on multiple inheritance is that it is rarely needed but shouldn't have been removed from either Java or .NET.

The "interface" is a kludge. "We need some multiple inheritance, but that's hard and you folks can't be trusted with it, so we're going to make up a new special kind of abstract class and let you use it with that."

Well, to be honest that's worked pretty well. But it's also true that this whole thread is caused by the artificial distinction between an interface and an abstract base class. When artificial distinctions start getting in the way, it makes me want *real* multiple inheritance back.
 Mon, 12 Sep 2005 17:21:32, Willem Bogaerts, abstractness
> If I'm a derived class, I need to know what I can use from my base classes
> and what I have to define, don't I? This seems fundamentally different from
> "outsiders" that don't care whether method X is implemented by me or an
> ancestor. I'm NOT an outsider. I'm very much in the family.

No, you don't. If you are a derived class, you ARE a form of the superclass, so you're anything but an outsider. Like Constable John Doe IS a policeman. And if you do need to know, you still don't know unless you mark the abstract methods, not the class itself, with an 'I'. An interface is just a purely abstract class.

I know how to build ugly structures with interfaces and composition around the lack of something basic like inheritance, but the problem is I shouldn't need to.

As for prefixing, I'm torn between two sides. In languages where multiple implementation inheritance is forbidden, the prefix warns you in case you want to extend two "normal" abstract classes, even before you write the "extends" keyword. On the other hand, you get a situation where "all abstract classes are equal, but some are more equal than others". Classes that are not part of the interface-or-class struggle don't need to know whether a type is an interface, an abstract class, or an implementation class.
 Mon, 12 Sep 2005 10:36:17, James Grenning, Ditch the I, and th m_, and the DAO
Get rid of the "I". Also you C++ programmers should get rid of the "m_" and you DAO fans should get rid of the DAO suffix. All these annotations put up a barrier to being able to "read" the code. When I read ICommand I feel like there is something missing, like I Command you to take out the garbage. When I see the C++ programmers "m_" I find myself humming while I read the code mmmmm counter, mmmmm index, mmmmm transmitter, mmmm window. Was the programmer really happy and humming, or were they confused and not sure what variable name to use? Probably the latter. Ohhh. a member variable, that is something special.

DAO? I'd like to be reading code in the problem domain if possible. DAO is a solution-domain suffix. High-level design should try to speak to the reader in the problem domain. Which reads better:
Employee.pay() or EmployeeDAO.pay()?

I guess I went a little off topic, but I feel better now.

Micah, please leave off the "I" and explain your reasoning. Maybe this will help Microsoft programmers to break themselves away from hungarian notation.

James
 Mon, 12 Sep 2005 09:54:52, Tom Rossen, RE: Single-inheritance rant
Ignoring for the moment the issues of thread and type safety (generics are now available in Java 5)....

When I ... graduated ... from C++ to Java, I was pissed about the lack of multiple inheritance. But eventually all the OO propaganda about "composition" started to make sense. I don't know PHP, but I've created collections in Java that were similar to the one you describe.

Essentially you're building multiple indexes over a single collection, and it should be possible to do this by composition in any reasonable OO language. I suspect that a combination of abstraction (polymorphism and interfaces) and composition could solve just about any problem that presents itself as requiring multiple inheritance.
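
Tom's multiple-indexes idea might be sketched in 2005-era C# roughly like this. This is only an illustration; INamed, NamedThing, and NamedCollection are names invented here, not taken from the thread:

```csharp
using System.Collections;

// One underlying collection, two indexes over it: by position and by name.
public interface INamed
{
    string Name { get; }
}

public class NamedThing : INamed
{
    private string name;
    public NamedThing(string name) { this.name = name; }
    public string Name { get { return name; } }
}

public class NamedCollection
{
    private ArrayList items = new ArrayList();   // positional index
    private Hashtable byName = new Hashtable();  // name index

    public void Add(INamed item)
    {
        items.Add(item);
        byName[item.Name] = item;
    }

    public INamed this[int position]
    {
        get { return (INamed)items[position]; }
    }

    public INamed this[string name]
    {
        get { return (INamed)byName[name]; }
    }
}
```

A Test or DatabaseQuery only has to implement INamed to participate; its single base-class slot stays free.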
 Fri, 9 Sep 2005 18:31:17, David Chelimsky, super-abstractness
Willem Bogaerts wrote "the 'abstractness' level of the super class is surely an implementation detail. And implementation (or the lack of it) should not be visible to outsiders!".

If I'm a derived class, I need to know what I can use from my base classes and what I have to define, don't I? This seems fundamentally different from "outsiders" that don't care whether method X is implemented by me or an ancestor. I'm NOT an outsider. I'm very much in the family.
 Fri, 9 Sep 2005 14:12:50, Willem Bogaerts, Single-inheritance rant
>But now we have strongly typed languages.

If you mean Java, we have not. In the XP Yahoo group I held a little rant about the claim that multiple inheritance would be difficult. On the contrary.

In my example, I wanted to write a collection class for PHP (it is not built-in). As I wanted to select the members by position or by name, they should inherit the NamedItem interface. After I did that, I noticed that all the subclasses shared the same code for setting, keeping and retrieving the name, so I changed the interface to an abstract class and put the code there. But that prevented me from storing Tests, DatabaseQueries, and lots of other useful classes in a collection. Now, what is difficult about a class being able to have two superclasses? On the contrary, try to explain what is wrong with it.
Single-inheritance-with-interfaces is a defect in languages that claim to be object-oriented. I think the prefix was introduced because it facilitates the programmer in typing the right keyword (extends or implements). But that keyword should not be necessary, as the "abstractness" level of the super class is surely an implementation detail. And implementation (or the lack of it) should not be visible to outsiders!
Back to the collection: the language prohibits the use of a general abstract member class. Now I understand why, in a so-called "strongly typed" language like Java, every internal storage has to be done with general "object"s and a lot of casting. I don't see any strongly typed language here.

In short, this interface thingy is a language bug that you keep fighting day by day.
 Wed, 7 Sep 2005 09:17:23, Tom Rossen, Awkwardness...
I'm imagining Copernicus' reviewers saying "Hey - where are all the epicycles?"

Can't do anything about the FCL - untouchable legacy. But I trust Micah can handle this issue in the introduction. It shouldn't be any more awkward than explaining how agile techniques differ from traditional approaches.
 Tue, 6 Sep 2005 22:35:39, Naim Ru, Recanting...

OK, I can buy the arguments for going against the MS standard, and promote a new standard in C# for naming interfaces (in the hope of someday getting rid of the last remnants of Hungarian notation). Meanwhile we're still stuck with the IPrefixedInterfaceNames in the .NET FCL; perhaps being inconsistent with this is not such a big deal.

However, remember that Micah posed this issue in the context of writing a book, where he may need to deal with the awkwardness of having two different naming conventions for interfaces in print (his own + the FCL's), with no IDE or compiler to help out the faithful reader... yeah, that could confuse some people.
 Tue, 6 Sep 2005 16:02:18, Liz, ICommand
I was translating a ton of java into C# and discovered that I had to add the I. Otherwise property conventions didn't work correctly.

I did it, I hate it, and now I wonder if I have to change the java to mirror it.

 Tue, 6 Sep 2005 14:56:28, Marco A, can't stop the huns
For whatever reason, people get addicted to the Hungarian notation and can't let it go.
 Tue, 6 Sep 2005 09:45:10, Tom Rossen, As IF....
Having fought the battle on multiple projects against pre- or post-pending "IF" in Java, I would urge you to take a stand against "tradition" (~= "antipattern") here.

An interface has two audiences: users and extenders. If you're using it, you shouldn't care whether it's an interface or a class, unless you try to instantiate it, and the compiler will tell you soon enough. Most of the time you're not instantiating, just referencing.

If you're extending it, you need to look at the doc or the source anyway.

Well-written code (small classes, small methods, clear separation of responsibilities) has no need of Hungarian notation. What makes code readable is its approximation of natural language - Hungarian notation degrades that.
 Tue, 6 Sep 2005 09:20:20, Casey, Ummmm - hardly an issue
If you don't use I.... then you are creating a different standard to the one almost all MS/C# programmers adhere to. So the first problem is confusion.

The second and bigger one is that the example you give is flawed. If you chose to turn your ICommand interface into a base class, then renaming isn't going to be the issue (as it can all be done by ReSharper or equivalent in a few seconds at most). The problem is that many classes implementing your ICommand interface could now be very broken due to C# not supporting MI - anything that already has a base class isn't going to work... and now you have a massive architectural nightmare.

Think first.... Agile doesn't mean 'hack it together quick and fix it later'
 Sun, 4 Sep 2005 02:32:59, Naim Ru, Interfaith Interfaces?

The .NET framework is filled with IPrefixedInterface names.

Looking at the examples in printed form, which would be preferable: the following one, with inconsistent naming and perhaps a comment?

class Unforgettable : Fire, IList  // yucky comment: Fire is an interface
{
    // ... implementation here
}


... or the following one?

class Joshua : ITree, IList
{
    // ... implementation here
}


So, on the one hand you have potential confusion due to inconsistent naming conventions and you violate typical C# developer expectations. In printed code, you may have to add comments to help sort things out.

On the other hand, you have a notation (I*) that gives client code knowledge about a type it references which it does not strictly need (that it is an interface), which results in some small but not insignificant renaming pain that can happen during refactoring.

This is an interesting dilemma and a very close call. I prefer the consistency, and until I see better reasons to abandon the C# interface naming standard, I will continue to use IPrefixedInterface names in .NET code.
 Sat, 3 Sep 2005 10:07:16, keithray, about I-prefixed interfaces
See if you can put this story ("how I started with an I-prefixed interface, then changed it to an abstract class, and all the pain the 'I' caused") early in the book, and say that for the remainder of this book, you will not use the I prefix and that you recommend your readers don't do that either.

 Thu, 1 Sep 2005 01:03:01, Chaz Haws, When to use an interface
Okay, Micah, one more comment:
In my work, the behavior that I'm looking for would almost always allow either an interface or an abstract class. I have so far tended to use the interface notation, due to brevity. But my most likely line of evolution is just as you describe. I make those kinds of changes all the time.

Come to think of it, I probably shouldn't be using interfaces at all. I almost never need the multiple interface inheritance.

Thanks for making me think about it. I think I'm going to write a lot fewer interfaces.
 Thu, 1 Sep 2005 00:44:31, Chaz Haws, Interface notation
I don't care for it myself.
But I can see it being useful because it instantly communicates that multiple inheritance is an option for this type (as others have mentioned).
If you can't personally run all the tests that would demonstrate that converting from an interface to an abstract class is safe, then that conversion isn't really an ideal option for you anyway (as others have mentioned).
If you've got those tests available, then this is a real issue, and you've got a good solution. People will nitpick it, though. Especially people that don't think about source control all the time, like I do.
How many reviewers reported that they were confused, as opposed to guessing that other people would be confused? I'm guessing *zero*.
I hate having to make such choices. I feel your pain. Two parting thoughts: Multiple implementation inheritance would make this (rather artificial) distinction completely moot; I still miss that from C++. And we need more advanced source control to deal with just such refactorings - names change as classes evolve, and source control needs to deal with that.
 Wed, 31 Aug 2005 10:14:13, David Chelimsky,
I think that a very interesting part of this discussion is the notion of where (in a design) to use an interface and when (in process) to introduce one.

As for the "when" question, the notion that we should introduce them earlier rather than later denies (defies?) TDD as a design activity. We don't know that we need an interface until we need one, right? Experience suggests that in many cases this will be very early on - perhaps even during domain model discussions (if that's your approach), but it also suggests that we won't catch all the cases up front, at which point we need to be able to easily introduce interfaces later in the process.

As for the notion of the correct way to use interfaces, I find it limiting to suggest that they are only there to describe behavior. Agreed that interfaces serve us well as definitions of behavior, but I think that they also serve us well as definitions of structure. Expressing high level domain concepts like People and Accounts in interfaces allows us to begin to decouple implementations of one from the other, allowing us to choose where to start development based on business value rather than a preconceived notion of architecture. Not to mention dependency injection frameworks that prefer interfaces.
 Wed, 31 Aug 2005 05:15:20, Thomas Eyde, Published code and Java vs C# syntax
Some additional points: If this is a published library, then a rename will break existing code outside of our control.

Java has "extends" and "implements", which communicate very well what the class inherits. C# doesn't have that; instead it uses the convention that if you inherit from more than one thing, the first must be a class, and there can only be one class.

I am not sure that is a valid argument. I used to think Hungarian notation was a good thing, too. I guess I have to try out my interfaces without the "I" before I make up my mind.
 Wed, 31 Aug 2005 05:27:14, E Charignon, Make your book easy to read.
As was said before, modern tools tend to make this 'I' prefix convention useless. Unfortunately, you don't find this ease of navigation when reading a book, and there a naming convention helps a lot. I have had the same reflections about comments. It might be welcome to over-comment code examples in books, because you are explaining something (you want to say in plain English what the code is doing) and the code is not going to change once it has been printed. On the contrary, you would not put so many comments in your own code, since they would make bugs difficult to find, and comment maintenance is a nightmare.
 Wed, 31 Aug 2005 03:14:17, J Drakes, Use of interfaces
I see. You suggest using the I* like the *able in Java.
Interfaces that are root to "structural" objects rather than "behaviour" objects (pardon the artificial distinction) would drop the I. Behaviour tags would keep the I.
 Tue, 30 Aug 2005 23:47:40, Eddie de Bear, Time to learn about correct use of Interfaces.
I think you may actually be using interfaces incorrectly if this is the problem you have with them... Interfaces should be used for secondary behaviour, such as IDisposable, ISerializable, IBendable, IKickable, etc. - i.e. where actually deriving from a base type is not going to work. You can still treat each object as its interface type, even though it's not... Maybe the I should be changed to B for Behaviour!!!

The real question is, do you have a problem with the I prefix, or do you have a problem correctly selecting between an interface and class in early stages of development???

Just my 2cents...
 Tue, 30 Aug 2005 21:09:27, Keith Nicholas, Aye Aye Cap'n
I think you should keep the I, but explain your reasoning for why you don't like it. However, your justification (so far) sucks. "I don't like 'I's because my tools make it slightly painful to change later" doesn't seem like a really good reason not to have 'I's. If that were your reasoning, then you'd be happy if ReSharper were made to automatically do a rename in Subversion. Also, by that logic, you should do some design up front so you get the names of classes right before you start work, because it's "painful" to change later... hmmm...

Five years of Extreme Programming with C# using I's, and I've never thought, "bugger, I wish we hadn't prefixed an 'I' onto this". When we have needed to, we have simply taken the I off and made it abstract. No big deal. I did have a bit of hesitation in the beginning about pushing type information into the name, but it really hasn't hurt us. If you do think it does major harm, I'd be interested in the reasoning that adds up to a "good deal of hurt" if you had a tool that renamed ICommand to something else in < 1 second...

 Tue, 30 Aug 2005 13:33:14, Ivan Vaghi, Hungarian Notation
Contrary to popular belief there seem to be some valid uses for Hungarian Notation, and it looks like its actual use is quite different from what it was originally intended for.

Spolsky here explains how to use it to create a 'conventional' algebra
<http://www.joelonsoftware.com/articles/Wrong.html>

eg: daysTime1 + daysTime2 is correct
daysTime1 + hoursTime2 is not correct
daysTime1 + daysFromHours(hoursTime2) is correct

where both days and hours are plain numbers

Of course, you could use typed objects, but sometimes it's just not worth it.
Also note that the prefix indicates the semantic value of the number, rather than just pointing out the type of the number.

Hungarian-like naming can also be useful in dynamic languages for readability.

eg: def enroll(a_person) #a_person is (usually :-) of type Person
def distance(start_point,end_point) #it's getting 2 Point instances

The Hungarian-like style I like gives semantics, not just syntax.
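
Ivan's aside that "you could use typed objects" can be sketched as follows. Days and Hours are names invented here for illustration; the point is that the compiler, rather than a name prefix, enforces the algebra:

```csharp
// Wrap the plain numbers in small value types so mixing units
// fails to compile, instead of relying on daysXxx/hoursXxx prefixes.
public struct Hours
{
    public readonly double Value;
    public Hours(double value) { Value = value; }
}

public struct Days
{
    public readonly double Value;
    public Days(double value) { Value = value; }

    // Only days + days is defined.
    public static Days operator +(Days a, Days b)
    {
        return new Days(a.Value + b.Value);
    }

    // Explicit conversion, mirroring Spolsky's daysFromHours().
    public static Days FromHours(Hours h)
    {
        return new Days(h.Value / 24.0);
    }
}
```

With these, `d1 + Days.FromHours(h2)` compiles and `d1 + h2` does not - the same rule the prefixes encode, but checked by the compiler.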
 Tue, 30 Aug 2005 13:01:44, Ben, IThinkYouMayBeOnToSomething
It's funny - I've always hated Hungarian notation. I'm not sure if this convention arose from that hideous pain of a standard, but for one reason or another it's a standard that I happened to adopt, probably for no other reason than seeing it used in other people's code. What can I say: monkey see, monkey do.

What do you think of Hungarian notation? The developers at my new company love it, and really don't want to abandon it, despite all my pleas. From what I've seen on the web, there are valid arguments on both sides of the coin.
 Tue, 30 Aug 2005 11:35:33, Bill de hOra,
Your reviewers are guilty of foolish consistency. Java went through this way back when with people appending IF to interface names, before it was dropped as a bad idea. Interfaces in static languages have enough problems without this cruft.

Put it this way. If you don't drop the convention, I won't buy the book.
 Tue, 30 Aug 2005 10:06:18, ,
Ni vagree!
 Tue, 30 Aug 2005 08:40:08, UncleBob, Hungarian Notation in the 21st Century.
There was a time, long long ago in the 80s, when C compilers did not check types and programmers could pass ints to functions that took doubles, that Hungarian Notation (HN) might just possibly have been a sort of OK idea maybe. Maybe.

But now we have strongly typed languages. Now we have IDEs that tell us the type of an identifier simply by hovering over them with the mouse. Now we have Object Oriented languages which soften the notions of "type" and nudges us to hide type rather than expose it.

So, Captain, we are no longer using stone knives and bearskins to get our jobs done. And we don't need the trappings that, just possibly, maybe, were a sort-of good-idea in the stone age.

I don't miss the I's (or the C's, or the pszqt's)

Hey! I've got a great idea! Let's prefix every noun with an (n), and every nadjective with an n(j), and jevery nadverb with a n(y), and jevery nverb with a (v). nPeople vwill yquickly vfind nEnglish vto vbe jmuch jmore jreadable.
 Tue, 30 Aug 2005 08:19:45, Michael Feathers,
Micah wrote "Here's my dilemma. Maybe you can help me. I've been translating Unclebob's Agile Software Development book into C#. In the code examples I have heeded my experience and created interfaces without the I prefix. Reviewers don't like this. They keep telling me that I need to add the I prefix. I'm torn. I feel in my heart of hearts that using the IConvention does more harm than good. Yet, readers of the book will be familiar with the IConvention and may be confused by examples that don't use it. What's the right thing to do?"

I think you should use the convention after you tell the reader just how stupid it is. I think I railed against it in WELC twice. Couldn't help myself.
 Tue, 30 Aug 2005 08:10:58, Mark Howard, drop the I
Drop the I; it's a bad convention that should never have been there. It smells of Hungarian notation.
 Tue, 30 Aug 2005 06:46:21, Dave Hoover, I Agree with Thomas and Tanton
Don't rename the interface, implement it with an abstract class. Stick with the conventions of the language, readable code is the first step toward adaptable code.
 Tue, 30 Aug 2005 04:46:50, Thomas Eyde, IConvention
I don't really see the problem. Why don't you just do:

abstract class Command : ICommand {}

And you could have both artifacts in the same file, without the "I": Command.yourfavouritedotnetlanguagesuffix
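
Spelled out a little further (a sketch only; the body just mirrors the Template Method example from the article), the compromise keeps the published ICommand interface and hangs the shared code on an abstract Command class:

```csharp
public interface ICommand
{
    void Execute();
    bool Executed { get; }
}

// Shared Executed bookkeeping lives here; the interface stays pure.
public abstract class Command : ICommand
{
    private bool hasExecuted = false;

    public virtual void Execute()
    {
        PerformExecution();
        hasExecuted = true;
    }

    protected abstract void PerformExecution();

    public bool Executed
    {
        get { return hasExecuted; }
    }
}

// A minimal concrete command, to show the shape of a subclass.
public class NullCommand : Command
{
    protected override void PerformExecution() { }
}
```

No rename, no broken clients: implementations may extend Command for the free bookkeeping, or implement ICommand directly when they already have a base class.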
 Tue, 30 Aug 2005 03:14:07, J Drakes, Different audiences
This is not the first time I've met this problem. The way I see it, if you are a 'user' of the classes, then you shouldn't care at all about the implementation details of what you call; you should just focus on the readability of the code. You are just using some 'words' whose use and semantics are expressed by the test cases. However, the very moment you decide that you want to modify that code, you get swamped in all kinds of syntax-oriented technicalities, and then you have to care about the nature of the 'words' you are using - and those 'I's that you use to colour your code can be handy. I don't have one single answer for you... it depends. I usually solve it by giving my expert users a set of classes that are easy on their eyes, while those classes are based on rougher code with the 'I's still sticking out.
 Mon, 29 Aug 2005 23:51:02, Tanton Gibbs, Additional Problems
I think there are more problems than just renaming files. If ICommand were an interface, then it could participate in Java and C#'s multiple inheritance of interfaces. However, if ICommand were suddenly changed to a class, many classes could be unduly broken. Determining if an entity should be an interface or a class should be something you strive to get right at the beginning. If you choose the interface prematurely, then I would think that you could add a helper class CommonCommandImpl[?] that instantiates the interface and other classes could choose to inherit from if it made sense for them. Therefore, I think the least of your worries is renaming and if you chose to put the I in front of interfaces it would be neither a good nor a bad decision. I think if it is an "idiom" for that language, then you should follow it.