Saturday, July 20, 2024

Coding is just the Beginning: A Personal Perspective



Janpha Thadphoothon


This blog post explores coding in the age of AI. Discussing coding or programming might seem far removed from my work as an English teacher, but please bear with me. I aim to offer a non-specialist perspective on the issues surrounding coding in this new era.




By the way, have you been learning programming languages like Java or C?

I began coding in the 1980s, when I took a course and learned my first language, BASIC. Later I learned HTML. A few years ago, I began to look at Java, and later JavaScript and Python. But I am not really good at any of these languages.


I know there are several programming languages, and this is something I find quite relatable. Just like spoken languages (English or Japanese), programming languages have rules governing the organization of their elements. I once heard a seasoned programmer say in an interview that we should learn not just one language, but several. It's limiting to stick to only one language or method. This idea struck me as 'interesting' because I feel the same way about learning other languages. The more, the better.

What Is Code?


What is code? My understanding is simple: it's a set of rules created by humans to organize and manipulate signs or symbols. These rules allow us to communicate instructions to a computer.

At its most fundamental level, machine language is based on the binary system, which consists of only two options: 0 and 1. These binary digits, or bits, are the basic building blocks of all computer operations. Each 0 or 1 represents an electrical state: off or on, false or true.

In essence, the binary system allows computers to perform complex calculations and processes by combining these simple binary choices in various sequences. When you write code in a high-level programming language like Java or Python, it's eventually translated into binary code that the computer can execute.
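To make that last point a little more concrete, here is a small, self-contained Java snippet of my own (a toy illustration, not anything a real compiler does) that simply shows what a few familiar decimal numbers look like when written in binary:

```java
public class BinaryDemo {
    public static void main(String[] args) {
        // Show a few ordinary decimal numbers alongside their binary form.
        int[] numbers = {2, 5, 7, 255};
        for (int n : numbers) {
            System.out.println(n + " in binary is " + Integer.toBinaryString(n));
        }

        // And the other direction: read the string "111" as a base-2 number.
        System.out.println("Binary 111 as a decimal number is " + Integer.parseInt("111", 2));
    }
}
```

Real compilation and execution involve far more than this, of course, but the snippet hints at the 0s and 1s sitting beneath every high-level statement.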

Understanding the binary system provides insight into how computers operate at their core, and it highlights the elegance of transforming simple binary decisions into the sophisticated technology we use every day.

Assembly Language

I vaguely recall that the most fundamental programming language, the one closest to machine language, is assembly language. Assembly language provides a low-level, human-readable representation of machine code. Unlike high-level programming languages, which rely on abstractions and more complex syntax, assembly language consists of simple instructions and mnemonics that correspond directly to the machine's binary code.

Each instruction in assembly language typically performs a specific task, such as moving data between registers, performing arithmetic operations, or controlling the flow of execution. These instructions are written using a set of symbols and opcodes, making the code easier for humans to read and write, though still challenging compared to higher-level languages.

Assembly language is hardware-specific, meaning that code written for one type of processor might not work on another. This close relationship with the hardware allows programmers to write highly optimized and efficient code, making assembly language essential for systems programming, embedded systems, and situations where performance and resource management are critical.

Example of Assembly Language Code

Here’s a simple example of an assembly language program (NASM syntax, for 32-bit Linux on an x86 processor) that adds two numbers and prints the result:

```assembly
section .data
    num1   db 5          ; Define byte with value 5
    num2   db 3          ; Define byte with value 3
    result db 0          ; Define byte to store the result

section .text
    global _start

_start:
    mov al, [num1]       ; Move the value of num1 into register AL
    add al, [num2]       ; Add the value of num2 to AL
    add al, '0'          ; Convert the sum (8) into its ASCII character so it prints readably
    mov [result], al     ; Move the result from AL into the result variable

    ; System call to print the result (32-bit Linux)
    mov eax, 4           ; sys_write system call number
    mov ebx, 1           ; File descriptor 1 (stdout)
    mov ecx, result      ; Pointer to the result
    mov edx, 1           ; Number of bytes to write
    int 0x80             ; Call kernel

    ; System call to exit
    mov eax, 1           ; sys_exit system call number
    xor ebx, ebx         ; Exit code 0
    int 0x80             ; Call kernel
```

Explanation

1. Data Section:
   - num1 and num2 are defined as bytes with values 5 and 3, respectively.
   - result is a byte defined to store the result of the addition.

2. Text Section:
   - _start is the entry point of the program.
   - The value of num1 is moved into the AL register.
   - The value of num2 is added to AL.
   - The sum in AL is converted into its ASCII digit and stored in the result variable.
   - A system call is made to write the result to the standard output.
   - Another system call is made to exit the program.

Historical Development

Assembly language has been around since the early days of computing. The first assembly languages were developed in the 1940s and 1950s to make programming more accessible compared to directly writing machine code. Early computers like the IBM 704 and the UNIVAC I used assembly languages tailored to their specific architectures.

As computer architectures evolved, so did assembly languages. Each type of CPU has its own assembly language, tailored to its instruction set. For example, the x86 assembly language is used for Intel and AMD processors, while ARM assembly language is used for ARM processors.

Despite the development of higher-level programming languages, assembly language remains important for certain applications. It's used in system programming, device drivers, embedded systems, and situations where performance optimization and direct hardware control are crucial.

Learning assembly language can provide deep insights into how computers operate at the hardware level, enhancing one's understanding of programming concepts and computer architecture.

COBOL

Another programming language I’ve heard about is COBOL. It’s been a while since I last heard it mentioned. Are there any COBOL programmers left?

COBOL, which stands for Common Business-Oriented Language, was developed in the late 1950s and early 1960s. It was one of the first high-level programming languages and was designed specifically for business applications. The language was created as a result of a conference sponsored by the U.S. Department of Defense, which aimed to develop a standard programming language that could be used across different computer systems for business data processing.

COBOL is known for its readability and straightforward syntax, which resembles plain English. This makes it particularly well-suited for writing and maintaining business applications that handle large volumes of data and transactions. One of the key features of COBOL is its ability to process data stored in files and databases, making it invaluable for industries such as banking, insurance, and government, where data processing is critical.

Example of COBOL Code

Here’s a simple COBOL program that reads each record from a text file and displays it:

```cobol
IDENTIFICATION DIVISION.
PROGRAM-ID. PrintRecords.

ENVIRONMENT DIVISION.
INPUT-OUTPUT SECTION.
FILE-CONTROL.
    SELECT InputFile ASSIGN TO 'input-data.txt'
    ORGANIZATION IS LINE SEQUENTIAL.

DATA DIVISION.
FILE SECTION.
FD InputFile.
01 InputRecord PIC X(100).

WORKING-STORAGE SECTION.
01 EndOfFile PIC X VALUE 'N'.

PROCEDURE DIVISION.
Begin.
    OPEN INPUT InputFile
    PERFORM UNTIL EndOfFile = 'Y'
        READ InputFile INTO InputRecord
            AT END MOVE 'Y' TO EndOfFile
        END-READ
        IF EndOfFile NOT = 'Y'
            DISPLAY InputRecord
        END-IF
    END-PERFORM
    CLOSE InputFile
    STOP RUN.
```

Historical Development and Current Usage

Despite its age, COBOL is still in use today. Many legacy systems, particularly in the financial sector, run on COBOL. According to some estimates, there are hundreds of billions of lines of COBOL code still in operation. These systems are often mission-critical, and replacing them with newer technologies can be risky and expensive.

As a result, there is still a demand for COBOL programmers, especially for maintaining and updating these legacy systems. Many organizations rely on experienced COBOL developers to ensure their systems continue to function smoothly. Some universities and training programs still teach COBOL, recognizing its ongoing relevance in specific sectors.

In recent years, there has been a renewed interest in COBOL due to the need to modernize and maintain existing systems, particularly with the advent of digital transformation initiatives. While it may not be as popular as newer languages like Python or Java, COBOL remains a significant part of the programming landscape.

The Challenge of Learning to Code


At this point, I’d like to reflect on my own years of learning about coding and computer science. It has often seemed like a daunting challenge, one that only the brainiest students (the "computer nerds") would pursue. I must admit, I didn't feel smart enough to enter this field of study. Consequently, my knowledge of computing has remained peripheral and amateurish.

Reflecting on the Barriers

In the past, coding and computer science were seen as highly specialized areas, accessible only to those with exceptional analytical skills and a deep interest in technology. This perception created a barrier for many, including myself. The stereotype of the "computer nerd" often discouraged those who felt they didn't fit that mold, reinforcing the idea that coding was not for everyone.

Changing Perspectives

However, the landscape of coding education has significantly changed over the years. With the advent of user-friendly programming languages, online tutorials, and coding bootcamps, learning to code has become more accessible to a broader audience. Resources like Codecademy, Khan Academy, and freeCodeCamp have demystified coding, making it approachable for people of all backgrounds and skill levels.

Embracing Lifelong Learning

Although I didn’t dive into computer science early on, I’ve come to appreciate the value of lifelong learning. The realization that coding isn’t reserved for geniuses but is a skill that anyone can learn with dedication and practice has been empowering. While my knowledge of computing might still be amateurish, I see it as a starting point rather than a limitation.

Encouraging Inclusivity in Tech

It's important to encourage inclusivity in the tech field. By breaking down stereotypes and making coding education accessible, we can foster a more diverse and innovative tech community. Whether you're a student, a professional from a different field, or simply curious about coding, it's never too late to start learning. The journey might be challenging, but it's also rewarding and full of opportunities for growth and discovery.

Coding is Just the Beginning in Software Development

I was told by an experienced software developer that coding is just the beginning of the software development process. After you’ve written the code, say in Java, you need to run it and turn it into a functional application, whether a web app or a mobile app. This involves not only executing the code but also considering compute costs and cloud storage costs.

For example, imagine you’ve written a simple Java program to add two numbers, X and Y, and output the result, Z. When users enter the numbers 2 and 5, the code will add them to produce the result 7. However, for this code to be useful and accessible to users, it needs to be deployed and maintained.

The Full Development Lifecycle

1. Coding: Writing the initial code is the first step. This involves using a programming language like Java to create the logic and functionality of your application.

2. Running and Testing: Once the code is written, it needs to be run and tested to ensure it works correctly. This involves debugging any issues and making sure the application performs as expected (a minimal example of such a check is sketched just after this list).

3. Deployment: After testing, the application needs to be deployed. This means making it available on a server or a cloud platform so that users can access it. For web applications, this might involve deploying to a web server, while for mobile apps, it might involve publishing to app stores.

4. Infrastructure Costs: Running applications involves infrastructure costs. You need to consider compute costs (the cost of running the server or cloud instance) and storage costs (the cost of storing the application data). Cloud providers like AWS, Google Cloud, and Azure offer various pricing models based on usage.

5. Maintenance and Updates: Software needs ongoing maintenance to fix bugs, add new features, and improve performance. This requires regular updates and monitoring to ensure the application continues to meet user needs and functions smoothly.
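As a small illustration of step 2, here is a minimal sketch of what an automated check might look like. It assumes the addition logic has been pulled out into a little add method (a hypothetical helper for this sketch) and uses plain Java assertions rather than any particular testing framework:

```java
public class AdditionTest {
    // Hypothetical helper containing the logic we want to check.
    static int add(int x, int y) {
        return x + y;
    }

    public static void main(String[] args) {
        // Run with `java -ea AdditionTest` so that assertions are enabled.
        assert add(2, 5) == 7 : "2 + 5 should be 7";
        assert add(-1, 1) == 0 : "-1 + 1 should be 0";
        System.out.println("All checks passed.");
    }
}
```

In practice, teams usually rely on a testing framework such as JUnit, but the idea is the same: run the code with known inputs and confirm the outputs automatically.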



Consider a Java program that performs addition:

```java
import java.util.Scanner;

public class Addition {
    public static void main(String[] args) {
        Scanner scanner = new Scanner(System.in);
        System.out.println("Enter the first number:");
        int x = scanner.nextInt();
        System.out.println("Enter the second number:");
        int y = scanner.nextInt();
        int z = x + y;
        System.out.println("The result is: " + z);
    }
}
```

This program asks the user to enter two numbers, adds them, and displays the result. For this program to be useful to multiple users, you would need to:

- Deploy it on a web server so users can access it via a web interface (a minimal sketch of this step appears after this list).
- Ensure the server has enough resources (compute power and storage) to handle user requests.
- Monitor the application for any issues and update it as necessary.
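To give a feel for the first point, here is a minimal sketch (my own simplified illustration, not a production setup) of how the same addition logic might be exposed over HTTP using the JDK’s built-in com.sun.net.httpserver package, so that users could reach it from a browser:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class AdditionServer {
    public static void main(String[] args) throws Exception {
        // Start a tiny HTTP server on port 8000 (the port is an arbitrary choice for this sketch).
        HttpServer server = HttpServer.create(new InetSocketAddress(8000), 0);

        // Handle requests such as /add?x=2&y=5 by adding the two query parameters.
        server.createContext("/add", exchange -> {
            int x = 0;
            int y = 0;
            String query = exchange.getRequestURI().getQuery(); // e.g. "x=2&y=5"
            if (query != null) {
                for (String pair : query.split("&")) {
                    String[] kv = pair.split("=");
                    if (kv.length == 2 && kv[0].equals("x")) x = Integer.parseInt(kv[1]);
                    if (kv.length == 2 && kv[0].equals("y")) y = Integer.parseInt(kv[1]);
                }
            }
            byte[] response = ("The result is: " + (x + y)).getBytes();
            exchange.sendResponseHeaders(200, response.length);
            try (OutputStream body = exchange.getResponseBody()) {
                body.write(response);
            }
        });

        server.start();
        System.out.println("Listening on http://localhost:8000/add?x=2&y=5");
    }
}
```

Even this toy version raises the questions mentioned above: which server or cloud instance will run it, what will the compute and storage cost, and who will monitor and update it over time?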

Coding is just the beginning of the software development journey. To create functional and useful applications, you need to consider deployment, infrastructure costs, and ongoing maintenance. Understanding the full lifecycle of software development helps you appreciate the complexities involved and the importance of planning and resource management.

Should We Learn Coding?

Several business executives and tech-savvy individuals have suggested that coding may become obsolete with the advent of AI capable of writing code. They argue that children and students might not need to learn coding as extensively as previous generations did. Nvidia CEO Jensen Huang has stated that children should no longer be encouraged to learn to code because AI might already be jeopardizing careers in the field.

However, there are many who believe coding remains essential. Despite the rise of AI, understanding how to code is still considered a valuable skill. Various experts provide differing opinions, but from my non-specialist perspective, coding is still relevant and necessary.

The Value of Learning to Code

1. Understanding Logic and Algorithms: Learning to code teaches you how to think logically and understand algorithms. This is crucial because, without this foundational knowledge, having AI write code for you would be meaningless. You need to be able to debug, read, and understand the code and its underlying logic to effectively utilize it.

2. Problem-Solving Skills: Coding enhances problem-solving skills. It requires you to break down complex problems into smaller, manageable parts and solve them step by step. This skill is transferable to many areas beyond software development.

3. Adaptability and Flexibility: Technology evolves rapidly, and understanding coding principles makes it easier to adapt to new tools and technologies. Even if AI takes over some coding tasks, the ability to understand and interact with code will remain valuable.

4. Creativity and Innovation: Coding enables creativity and innovation. It allows individuals to build applications, websites, games, and more, bringing their ideas to life. This creative aspect of coding is something AI cannot fully replicate.

5. Job Market Demand: Despite AI advancements, there is still a strong demand for skilled programmers. Many industries require custom software solutions, and understanding coding can provide a competitive edge in the job market.

Expert Opinions

While some industry leaders like Jensen Huang suggest a diminished need for learning to code, others emphasize its continued importance. For instance, educators and technologists argue that coding literacy is as crucial as reading and writing in the digital age. They advocate for coding education to foster a deeper understanding of technology and its applications.

While AI advancements have transformed the coding landscape, learning to code remains a valuable and relevant skill. It equips individuals with critical thinking, problem-solving, and adaptability skills that are essential in various fields. As technology continues to evolve, understanding coding will help individuals navigate and leverage these changes effectively. Therefore, encouraging students to learn coding is still a worthwhile endeavor.


Please cite as:

Thadphoothon, J. (2024). "Coding is just the Beginning: A Personal Perspective" on JT Blog. Available online at: https://janpha.blogspot.com/2024/07/coding-is-just-beginning-personal.html


About Janpha Thadphoothon

Janpha Thadphoothon is an assistant professor of ELT at the International College, Dhurakij Pundit University in Bangkok, Thailand. He also holds a certificate in Generative AI with Large Language Models issued by DeepLearning.AI.

