How to convert an integer to an ASCII character?

  • akarui.tomodachi@gmail.com

    How to convert an integer to an ASCII character?

    What is the easiest way to convert an integer value to ASCII
    character format?
    I tried with sprintf(). It works.
    Is there any other way to do that?

    Objective:
    I would like to convert the integer value 3 and write it into a string buffer.

    What I did:
    .....
    .....
    char myStr[16];   /* buffer must be given a size */
    int myInt = 3;
    sprintf(myStr, "%d", myInt);
    .....
    .....

    Please comment.
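
    (For reference, the snippet rolled into a complete program. A minimal
    sketch; the buffer size of 16 is just an assumed ample size for one int:)

    #include <stdio.h>

    int main(void)
    {
        char myStr[16];               /* assumed ample size for one int */
        int myInt = 3;

        sprintf(myStr, "%d", myInt);  /* "%d" writes the decimal digits of myInt */
        puts(myStr);                  /* prints: 3 */
        return 0;
    }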

  • serrand

    #2
    Re: How to convert an integer to an ASCII character?

    akarui.tomodachi@gmail.com wrote:
    > What is the easiest way to convert an integer value to ASCII
    > character format?
    > I tried with sprintf(). It works.
    > Is there any other way to do that?
    >
    > Objective:
    > I would like to convert the integer value 3 and write it into a string buffer.
    >
    > Please comment.

    #include <limits.h>  /* for INT_MAX */
    #include <math.h>    /* for log10 */
    #include <stdio.h>   /* for sprintf */

    #define DIGILEN log10(INT_MAX) + 2

    char buf[DIGILEN];
    sprintf(buf, "%d", int_var);

    Xavier


    • serrand

      #3
      Re: How to convert an integer to an ASCII character?

      serrand wrote:
      > #define DIGILEN log10(INT_MAX) + 2
      >
      > char buf[DIGILEN];
      > sprintf(buf, "%d", int_var);

      oops... sorry

      #define DIGILEN (int)(log10(INT_MAX) + 3)

      Your way seems to be the simplest...

      sprintf does the same job as printf: whereas printf writes to stdout,
      sprintf writes into its first argument, which has to be an allocated buffer.

      Xavier
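
      (Even with the cast, log10(INT_MAX) is not an integer constant
      expression, so char buf[DIGILEN] is not valid C89. A minimal
      compile-time sketch of the same bound, assuming only <limits.h>:)

      #include <limits.h>
      #include <stdio.h>

      /* Conservative compile-time bound on the decimal digits of an int:
         roughly one digit per 3 bits, plus room for '-', '\0' and rounding. */
      #define DIGILEN (sizeof(int) * CHAR_BIT / 3 + 3)

      int main(void)
      {
          char buf[DIGILEN];
          sprintf(buf, "%d", INT_MIN);  /* the worst case still fits */
          puts(buf);
          return 0;
      }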


      • OJ

        #4
        Re: How to convert an integer to an ASCII character?

        Maybe itoa() could be used, but it's not a standard function.
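
        (A portable stand-in is easy to write. A minimal sketch, using the
        hypothetical name my_itoa to avoid clashing with any vendor itoa():)

        #include <limits.h>
        #include <stdio.h>

        /* Portable itoa-style helper: writes the decimal form of value
           into buf (which must be large enough) and returns buf. */
        static char *my_itoa(int value, char *buf)
        {
            char tmp[sizeof(int) * CHAR_BIT];   /* digits, least significant first */
            int i = 0, n = 0;
            unsigned int u = (value < 0) ? 0u - (unsigned int)value
                                         : (unsigned int)value;

            do {                                /* always emits at least "0" */
                tmp[n++] = (char)('0' + u % 10);
                u /= 10;
            } while (u != 0);

            if (value < 0)
                buf[i++] = '-';
            while (n > 0)                       /* reverse the digits into buf */
                buf[i++] = tmp[--n];
            buf[i] = '\0';
            return buf;
        }

        int main(void)
        {
            char buf[32];
            puts(my_itoa(-12345, buf));         /* prints: -12345 */
            puts(my_itoa(0, buf));              /* prints: 0 */
            return 0;
        }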


        • shichongdong

          #5
          Re: How to convert an integer to an ASCII character?

          int i = 3;
          char c = i + '0';


          • Lew Pitcher

            #6
            Re: How to convert an integer to an ASCII character?


            shichongdong wrote:
            > int i = 3;
            > char c = i + '0';

            ???

            int i = 300;
            char c = i + '0'; /* nope. not an ascii character */

            --
            Lew Pitcher

            Master Codewright & JOAT-in-training | GPG public key available on request
            Registered Linux User #112576 (http://counter.li.org/)
            Slackware - Because I know what I'm doing.


            • CBFalconer

              #7
              Re: How to convert an integer to an ASCII character?

              akarui.tomodachi@gmail.com wrote:
              > What is the easiest way to convert an integer value to ASCII
              > character format?
              > I tried with sprintf(). It works.
              > Is there any other way to do that?
              >
              > Objective:
              > I would like to convert the integer value 3 and write it into a string buffer.

              #include <stdio.h>

              /* ---------------------- */

              static void putdecimal(unsigned int v, char **s) {

                  if (v / 10) putdecimal(v / 10, s);  /* recurse for the leading digits */
                  *(*s)++ = (v % 10) + '0';           /* emit the last digit */
                  **s = '\0';                         /* keep the string terminated */
              } /* putdecimal */

              /* ---------------------- */

              int main(void) {

                  char a[80];

                  char *t, *s = a;

                  t = s; putdecimal( 0, &t); puts(s);
                  t = s; putdecimal( 1, &t); puts(s);
                  t = s; putdecimal(-1, &t); puts(s);
                  t = s; putdecimal( 2, &t); puts(s);
                  t = s; putdecimal(23, &t); puts(s);
                  t = s; putdecimal(27, &t); puts(s);
                  return 0;
              } /* main */
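
              (A usage note: putdecimal() takes an unsigned int, so the
              putdecimal(-1, &t) call converts -1 to UINT_MAX; with a
              32-bit int that line prints 4294967295.)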



              • Thad Smith

                #8
                Re: How to convert an integer to an ASCII character?

                akarui.tomodachi@gmail.com wrote:
                > What is the easiest way to convert an integer value to ASCII
                > character format?

                Lew Pitcher wrote:
                > shichongdong wrote:
                >> int i = 3;
                >> char c = i + '0';
                >
                > ???
                >
                > int i = 300;
                > char c = i + '0'; /* nope. not an ascii character */

                Interesting. Neither is the earlier code converting 3 guaranteed to
                produce ASCII.

                --
                Thad


                • Kenny McCormack

                  #9
                  Re: How to convert an integer to an ASCII character?

                  In article <j5SJf.7348$_D5.487766@news20.bellglobal.com>,
                  Lew Pitcher <lpitcher@sympatico.ca> wrote:
                  > shichongdong wrote:
                  >> int i = 3;
                  >> char c = i + '0';
                  >
                  > ???
                  >
                  > int i = 300;
                  > char c = i + '0'; /* nope. not an ascii character */

                  The OP was asking how to do it with 3, not 300. You need to keep up.


                  • Martin Jørgensen

                    #10
                    Re: How to convert an integer to an ASCII character?

                    Lew Pitcher wrote:
                    > shichongdong wrote:
                    >> int i = 3;
                    >> char c = i + '0';
                    >
                    > ???
                    >
                    > int i = 300;
                    > char c = i + '0'; /* nope. not an ascii character */

                    It works for single digits, right?


                    Best regards / Med venlig hilsen
                    Martin Jørgensen

                    --
                    Home of Martin Jørgensen - http://www.martinjoergensen.dk


                    • stathis gotsis

                      #11
                      Re: How to convert an integer to an ASCII character?

                      "Martin Jørgensen" <unoder.spam@sp am.jay.net> wrote in message
                      news:pkrmc3-l14.ln1@news.td c.dk...[color=blue]
                      > Lew Pitcher wrote:[color=green]
                      > > -----BEGIN PGP SIGNED MESSAGE-----
                      > > Hash: SHA1
                      > >
                      > > shichongdong wrote:
                      > >[color=darkred]
                      > >>int i = 3;
                      > >>char c = i + '0';[/color]
                      > >
                      > >
                      > > ???
                      > >
                      > > int i = 300;
                      > > char c = i + '0' ; /* nope. not an ascii character */[/color]
                      >
                      > It works for single digits, right?[/color]

                      Assuming ASCII it does.



                      • Lew Pitcher

                        #12
                        Re: How to convert an integer to an ASCII character?


                        stathis gotsis wrote:
                        > "Martin Jørgensen" <unoder.spam@spam.jay.net> wrote in message
                        > news:pkrmc3-l14.ln1@news.tdc.dk...
                        >> Lew Pitcher wrote:
                        >>> int i = 300;
                        >>> char c = i + '0'; /* nope. not an ascii character */
                        >> It works for single digits, right?
                        >
                        > Assuming ASCII it does.

                        Assuming any conforming C implementation, it does. The C standard guarantees it.

                        --
                        Lew Pitcher


                        • Ben Pfaff

                          #13
                          Re: How to convert an integer to an ASCII character?

                          Lew Pitcher <lpitcher@sympatico.ca> writes:

                          > stathis gotsis wrote:
                          >> "Martin Jørgensen" <unoder.spam@spam.jay.net> wrote in message
                          >> news:pkrmc3-l14.ln1@news.tdc.dk...
                          >>> Lew Pitcher wrote:
                          >>>> int i = 300;
                          >>>> char c = i + '0'; /* nope. not an ascii character */
                          >>> It works for single digits, right?
                          >>
                          >> Assuming ASCII it does.
                          >
                          > Assuming any conforming C implementation, it does. The C
                          > standard guarantees it.

                          The C standard guarantees that decimal digits are sequential and
                          in the proper order. The C standard doesn't guarantee that the
                          execution character set is ASCII. The OP asked to convert an
                          integer value to *ASCII* character format specifically.

                          Here's a portable way to get a single ASCII digit: 48 + num.
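
                          (To spell that out: '0' + num yields a digit in the execution
                          character set, whatever that is, while 48 + num yields the
                          ASCII code of the digit even on a non-ASCII machine. A minimal
                          sketch:)

                          #include <stdio.h>

                          int main(void)
                          {
                              int num = 3;
                              char exec_digit  = (char)('0' + num);  /* digit in the execution charset */
                              char ascii_digit = (char)(48 + num);   /* 48 is ASCII '0', so this is
                                                                        ASCII code 51 ('3') everywhere */
                              printf("%c %d\n", exec_digit, (int)ascii_digit);  /* "3 51" under ASCII */
                              return 0;
                          }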


                          • Flash Gordon

                            #14
                            Re: How to convert an integer to an ASCII character?

                            stathis gotsis wrote:
                            > "Martin Jørgensen" <unoder.spam@spam.jay.net> wrote in message
                            > news:pkrmc3-l14.ln1@news.tdc.dk...
                            >> Lew Pitcher wrote:
                            >>> int i = 300;
                            >>> char c = i + '0'; /* nope. not an ascii character */
                            >> It works for single digits, right?
                            >
                            > Assuming ASCII it does.

                            Assuming an implementation that conforms to the C standard it does,
                            whether it is ASCII or not. It's one of the few things the C standard
                            guarantees about the execution character set.
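
                            (The guarantee in question: the standard requires the digits
                            '0' through '9' to be contiguous and ascending in the execution
                            character set, so single-digit conversion works both ways on any
                            conforming implementation. A minimal sketch:)

                            #include <assert.h>

                            int main(void)
                            {
                                int d;
                                for (d = 0; d <= 9; d++) {
                                    char c = (char)('0' + d);  /* digit value to digit character */
                                    assert(c - '0' == d);      /* and back again */
                                }
                                return 0;
                            }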
                            --
                            Flash Gordon
                            Living in interesting times.
                            Web site - http://home.flash-gordon.me.uk/
                            comp.lang.c posting guidelines and intro -


                            • stathis gotsis

                              #15
                              Re: How to convert an integer to an ASCII character?

                              "Flash Gordon" <spam@flash-gordon.me.uk> wrote in message
                              news:2ndnc3xp47 .ln2@news.flash-gordon.me.uk...[color=blue]
                              > stathis gotsis wrote:[color=green]
                              > > "Martin Jørgensen" <unoder.spam@sp am.jay.net> wrote in message
                              > > news:pkrmc3-l14.ln1@news.td c.dk...[color=darkred]
                              > >> Lew Pitcher wrote:
                              > >>> -----BEGIN PGP SIGNED MESSAGE-----
                              > >>> Hash: SHA1
                              > >>>
                              > >>> shichongdong wrote:
                              > >>>
                              > >>>> int i = 3;
                              > >>>> char c = i + '0';
                              > >>>
                              > >>> ???
                              > >>>
                              > >>> int i = 300;
                              > >>> char c = i + '0' ; /* nope. not an ascii character */
                              > >> It works for single digits, right?[/color]
                              > >
                              > > Assuming ASCII it does.[/color]
                              >
                              > Assuming an implementation that conforms to the C standard it does,
                              > whether it is ASCII or not. It's one of the few things the C standard
                              > guarantees about the execution character set.[/color]

                              I was not aware of that, thanks for the correction.


