Child Window Message Capture...

Sephiroth, Fayetteville, NC, USA
OK, I need to figure out how to tell when Enter is pressed in a child edit control. I can easily find out when the child sends a WM_PARENTNOTIFY message, but how do I tell what kind of message it is and parse it out? The edit control is a simple, single-line edit control. I want Enter to trigger the function that grabs whatever has been typed into that edit control, if anything. Grabbing the text is no problem, but making Enter do anything other than cause a "ding" is getting on my nerves. If this can be done without assigning a child window procedure, could somebody paste something explaining or demoing how? Thanks.

-[italic][b][red]S[/red][purple]e[/purple][blue]p[/blue][green]h[/green][red]i[/red][purple]r[/purple][blue]o[/blue][green]t[/green][red]h[/red][/b][/italic]

Comments

  • You should try to catch WM_COMMAND messages with EN_KILLFOCUS in
    HIWORD(wParam) and your edit control's ID in LOWORD(wParam).
    This will certainly work when you click elsewhere on the screen (so the edit loses focus), and I think it should also work with Enter.
    Note that if the edit control is in a dialog and you have a default button, pressing Enter will send a WM_COMMAND message with the default button's ID in LOWORD(wParam).
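
    A minimal sketch of that WM_COMMAND handling (IDC_MY_EDIT is a made-up control ID for illustration):
    [code]
    #include <windows.h>

    #define IDC_MY_EDIT 1001   /* hypothetical edit-control ID */

    LRESULT CALLBACK WndProc (HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg)
        {
        case WM_COMMAND:
            /* Edit controls report notifications through WM_COMMAND:
               the notification code is in HIWORD(wParam), the control ID
               in LOWORD(wParam), and the control handle in lParam. */
            if (LOWORD(wParam) == IDC_MY_EDIT && HIWORD(wParam) == EN_KILLFOCUS)
            {
                /* The edit control just lost the focus - grab its text here. */
            }
            return 0;
        }
        return DefWindowProc (hWnd, msg, wParam, lParam);
    }
    [/code]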

    nICO

    [hr]
    [italic]How beautiful, if sorrow had not made Sorrow more beautiful than Beauty itself.[/italic]
    JOHN KEATS


  • : Note that if the edit control is in a dialog and you have a default button, pressing Enter will send a WM_COMMAND message with the default button's ID in LOWORD(wParam).

    This is the easiest way to do what you want, I believe. Since you want to mess with the Enter key, I gather you don't care about the default button, so:

    1. Add a button to the dialog, make it the default (add BS_DEFAULT, or set it in the resource editor), and make it invisible.
    2. In your WM_COMMAND handler, do this:
    [code]
    if (LOWORD(wParam) == ID_MY_DEFAULT_BUTTON)
    {
        /* Enter "clicked" the invisible default button: read the edit box */
        char buffer[800];
        GetDlgItemText(hWnd, ID_MY_EDIT_BOX, buffer, sizeof(buffer));
        MessageBox(hWnd, buffer, "test", 0);
    }
    [/code]
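
    One more tip: if these controls are children of a plain window created with CreateWindow() rather than a real dialog box, Enter only reaches the default button when the message loop routes messages through IsDialogMessage(). A rough sketch, assuming hMainWnd is your top-level window:
    [code]
    MSG msg;
    while (GetMessage (&msg, NULL, 0, 0) > 0)
    {
        /* IsDialogMessage() gives the window dialog-style keyboard
           handling: Tab moves between controls and Enter "clicks"
           the default push button instead of beeping. */
        if (!IsDialogMessage (hMainWnd, &msg))
        {
            TranslateMessage (&msg);
            DispatchMessage (&msg);
        }
    }
    [/code]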


  • Sephiroth, Fayetteville, NC, USA
    Hmm, shouldn't that "sizeof(buffer)" be "800" :p ?

    Anyway, I have already created a button that way with the "BS_DEFPUSHBUTTON" style (BS_DEFAULT was undefined at compile time), but I can still type a string in the text box, press Enter, and hear that sadistic "ding". I'm going to change that sound to something better when I am done here because I am so sick of hearing it :p!

    Oh, I also have an image of the application running (although not connected to the server app) so you can get a better idea of what I am trying to do here. The top window is gray because it is an OpenGL viewport that would be displaying a scene if I were connected to the server. The middle window is the backlog of player chat and such, and the button is the one with the style mentioned above. The last edit control (the bottom one) is the one that dings whenever I press Enter, no matter what I do.

    http://dhta.oesm.org/Client.jpg

    -[italic][b][red]S[/red][purple]e[/purple][blue]p[/blue][green]h[/green][red]i[/red][purple]r[/purple][blue]o[/blue][green]t[/green][red]h[/red][/b][/italic]

  • : Hmm, shouldn't that "sizeof(buffer)" be "800" :p ?
    :
    [blue]Yep. Most likely, it will be 800. The [b]sizeof()[/b] is there so the compiler keeps track of the size for you. Say I have this:[code]
    char str [800];

    foo (..., 800);
    [/code]When you want to change 800 to, say, 1024, you need to modify two (or more) places instead of one. With [b]sizeof()[/b], you only need to modify the array size in the declaration.

    Here is another tip: usually, when you pass the buffer size to a function that fills the buffer, you should pass the number of characters. What if you want to use [b]WCHAR[/b] to be (a little) faster on XP/NT/2000? [b]sizeof()[/b] will give the wrong impression that the buffer is 800x2=1600 characters long, when in fact there are only 800 [b]WCHAR[/b] characters, because [b]sizeof()[/b] returns the size in bytes. So, this little macro will take care of that:[code]
    #define ARRITEMS(arr) (sizeof (arr) / sizeof (arr [0]))
    [/code]Now it will look like this:[code]
    char str [800];

    foo (..., ARRITEMS (str));
    [/code]Now, if you want WCHARs, just change the type from 'char' to 'WCHAR' and you are done.[/blue]
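
    For example, reusing the ID_MY_EDIT_BOX name from the snippet above (still just an illustrative ID), the macro passes the right count for either character type:
    [code]
    #include <windows.h>

    #define ARRITEMS(arr) (sizeof (arr) / sizeof (arr [0]))
    #define ID_MY_EDIT_BOX 1002   /* illustrative control ID */

    void GrabText (HWND hDlg)
    {
        char  narrow [800];
        WCHAR wide [800];

        /* Both calls pass 800 - the element count - even though
           sizeof (wide) is 1600 bytes. */
        GetDlgItemTextA (hDlg, ID_MY_EDIT_BOX, narrow, ARRITEMS (narrow));
        GetDlgItemTextW (hDlg, ID_MY_EDIT_BOX, wide,   ARRITEMS (wide));
    }
    [/code]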
  • Sephiroth, Fayetteville, NC, USA
    I only modify one place. A simple define in the defines header.
    [code]
    //Defines.h
    #define MAX_STR_LEN 1024

    //Source files
    char blah[MAX_STR_LEN];

    GetDlgItemText(..., MAX_STR_LEN);
    [/code]
    I thought it was supposed to be set to the length of the string, and I was under the impression that "sizeof()" returned the amount of memory (in bytes) that the array used, not the length of a string.

    I still can't get it to do what I want it to with enter though. However, I am spending more time on my server app than the client for now, since I am only testing the stuff out and don't mind having to click the send button manually. Oh and let me tell you, do NOT write a Windoze C app, send it over to a Linux box, remove the winsock BS and expect it to compile! I found out that I had to add a TON of includes, and for some reason I had to change the extension to CPP to get one of my typedefs to work.

    -[italic][b][red]S[/red][purple]e[/purple][blue]p[/blue][green]h[/green][red]i[/red][purple]r[/purple][blue]o[/blue][green]t[/green][red]h[/red][/b][/italic]

  • : I only modify one place. A simple define in the defines header.

    [blue][b]... I was under the impression that "sizeof()" returned the amount of memory (in bytes) that the array used, not the length of a string...[/b]

    That is exactly correct.

    [code]
    char  str [800];    // sizeof (str) returns 800
    WCHAR wstr [800];   // sizeof (wstr) returns 1600
    [/code]

    If you will always be using [b]char[/b], then, of course, a [b]#define[/b] is a perfectly good way to go. If you later switch to [b]TCHAR[/b] or [b]WCHAR[/b], [b]ARRITEMS[/b] is the safer choice: it always yields the element count, while a hand-maintained [b]#define[/b] can drift out of sync with the actual declaration.[/blue]
  • Sephiroth, Fayetteville, NC, USA
    Ah, gotcha! I don't plan on switching, though. I also thought each "char" was 8 bits, so I figured sizeof() on char[800] would be something like "800x8=6400". At least that is what I thought. Now I know better :p!

    -[italic][b][red]S[/red][purple]e[/purple][blue]p[/blue][green]h[/green][red]i[/red][purple]r[/purple][blue]o[/blue][green]t[/green][red]h[/red][/b][/italic]
