Dear interviewers and recruiters,
If you are looking for someone who (1) can analyze real-world problems quantitatively, (2) can design neural network models based on problem features rather than calling PyTorch APIs blindly, and (3) has a solid mathematical background and computer science training, then I humbly suggest myself. I have always challenged myself to think outside the box and to truly understand a concept rather than just knowing it. My thinking follows the "GPA" way: Geometrically, Physically, and Algebraically. With all due respect, I believe I can add credible value to your company and help push our industry to the next level.
Second Year Graduate Student at the Electrical Engineering and Computer Science Department in the National Taiwan University
This was my second year as a teaching assistant for the freshmen of the EECS department at NTU. Unlike the first time, I was in charge of leading a group of four TAs: I organized and prioritized tasks for them and ran our teaching program systematically. As before, I also participated in problem design.
I am on an optimization team. Currently, our goal is to automate the design process of antennas and integrated circuits. In particular, I deployed evolutionary algorithms such as the covariance matrix adaptation evolution strategy (CMA-ES) and simple genetic algorithms (GA) to drive the optimization. Besides optimization, I occasionally visualize data and write the front end of our company website.
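To illustrate the kind of evolutionary loop involved, here is a minimal simple-GA sketch on a toy OneMax objective. This is illustrative only: the real objective would be an antenna or circuit simulator score, and all names and parameters here are my own placeholders, not the company's code.

```python
import random

def onemax(bits):
    # Toy objective standing in for an antenna/IC simulator score.
    return sum(bits)

def simple_ga(n_bits=20, pop_size=30, generations=50, p_mut=0.05, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def select():
        # Binary tournament selection over the current population.
        a, b = rng.sample(pop, 2)
        return a if onemax(a) >= onemax(b) else b

    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)                        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]   # bit-flip mutation
            children.append(child)
        pop = children
    return max(pop, key=onemax)

best = simple_ga()
print(onemax(best))  # typically close to the optimum of 20 on this toy problem
```

CMA-ES replaces the discrete crossover/mutation above with sampling from an adapted multivariate Gaussian, which suits the continuous design parameters of antennas better.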
This was a project funded by the Ministry of Science and Technology in Taiwan. I was a member of a five-person team (including the professor). We tackled training neural networks when only a limited amount of data is available. We integrated supervised learning, self-supervised learning, batch pseudo-labeling, Mixup, and Smart Augmentation into our system. We focused on industrial datasets, e.g., wafer datasets and electrocardiography, and achieved satisfactory prediction accuracy with our framework.
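Of the techniques above, Mixup is the easiest to show in a few lines: it trains on convex combinations of pairs of examples and their labels, with the mixing coefficient drawn from a Beta(α, α) distribution. The helper below is a hypothetical sketch of the technique, not our project's actual code.

```python
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """Mixup: convex-combine a batch with a shuffled copy of itself.

    x: (batch, features) inputs; y: (batch, classes) one-hot labels.
    Illustrative sketch, not the project's implementation.
    """
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)        # mixing coefficient in [0, 1]
    perm = rng.permutation(len(x))      # random pairing of examples
    x_mix = lam * x + (1 - lam) * x[perm]
    y_mix = lam * y + (1 - lam) * y[perm]
    return x_mix, y_mix

x = np.arange(8.0).reshape(4, 2)
y = np.eye(4)                           # one-hot labels for 4 classes
x_mix, y_mix = mixup_batch(x, y)
```

Because the labels are mixed with the same coefficient as the inputs, each mixed label row still sums to 1, so a standard cross-entropy loss applies unchanged.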
This was my first year as a teaching assistant for the freshmen of the EECS department at NTU. I was responsible for designing challenging programming problems for both homework and exams. In addition, I held weekly Q&A sessions to help students gain a more solid understanding of C/C++ programming. As the saying goes, teaching is the best way to learn: I learned a lot from students while trying to explain abstract concepts to them.
I was responsible for training word vectors for a medical NLP engine. Since it was a small company, I gained end-to-end experience in NLP: I scraped data from Wikipedia and several medical websites (a Mandarin corpus) using Python, preprocessed the data (noise filtering, sentence segmentation, etc.), and trained a neural network to obtain good representations of the words. These word vectors were then used for downstream tasks such as a medical chatbot and an intelligent medical QA system.
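The preprocessing step can be sketched as follows: split a Mandarin corpus on full-width sentence-final punctuation, then strip markup remnants and other noise. This is a simplified, illustrative filter, not the production pipeline.

```python
import re

def segment_sentences(text):
    # Split Mandarin text on full-width sentence-ending punctuation.
    return [s for s in re.split(r"[。！？]", text) if s]

def clean(sentence):
    # Drop HTML remnants, then anything outside CJK/ASCII (illustrative filter).
    return re.sub(r"<[^>]+>|[^\u4e00-\u9fffA-Za-z0-9 ]", "", sentence)

raw = "糖尿病是常見疾病。<b>高血壓</b>需要控制！請定期回診？"
corpus = [clean(s) for s in segment_sentences(raw)]
print(corpus)  # ['糖尿病是常見疾病', '高血壓需要控制', '請定期回診']
```

The cleaned, segmented sentences would then be tokenized and fed to a word-embedding trainer such as a skip-gram model.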
I worked and traveled in Kansas City in the US for around three months during my third year of college. I worked as a ride operator at the Worlds of Fun amusement park, where I was mainly in charge of operating roller coasters, greeting visitors, and making sure everyone had the best day of their life. This experience broadened my international horizons and polished my spoken English.
I am currently in my second year as a graduate student in the Taiwan Evolutionary Intelligence Laboratory. My research focuses on optimizing the DSMGA-II algorithm, a state-of-the-art black-box optimization algorithm for combinatorial optimization. I have also taken courses in computer vision, database systems, and machine learning, all of which qualify me as a well-trained data science engineer.
Four years of mathematical training have made me good at mathematical reasoning, deduction, reading proofs, and abstract thinking. My personal interests are probability theory and statistics. Currently, I am teaching myself mathematical finance and stochastic processes.
We proposed a graphical model of the functional dependencies of a relational database, called the Functional Dependency Graph (FDG). Using FDG, we can find the keys and superkeys of a database from the initial dependencies between attributes alone. In addition, FDG can tell us which normal form a relational database is in. Moreover, FDG can automatically normalize a database into first, second, third, or Boyce–Codd normal form.
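The key/superkey detection rests on the standard attribute-closure test: an attribute set is a superkey iff its closure under the functional dependencies covers every attribute, and a key if it is additionally minimal. A short sketch of that test (illustrative code, not the FDG implementation itself):

```python
def closure(attrs, fds):
    """Closure of an attribute set under functional dependencies.

    fds: list of (lhs, rhs) pairs of attribute sets, meaning lhs -> rhs.
    Illustrative sketch of the closure test FDG builds on.
    """
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            # If lhs is already determined, everything in rhs is too.
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def is_superkey(attrs, fds, all_attrs):
    return closure(attrs, fds) == set(all_attrs)

# Example: R(A, B, C, D) with A -> B and B -> C; {A, D} is a key.
fds = [({"A"}, {"B"}), ({"B"}, {"C"})]
print(is_superkey({"A", "D"}, fds, "ABCD"))  # True
print(is_superkey({"A"}, fds, "ABCD"))       # False: D is not derivable
```

FDG encodes these dependencies as graph edges, so the same closure computation becomes a reachability traversal.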
My research focuses on optimizing DSMGA-II (arXiv:1807.11669), proposed by our lab in 2015. Currently, I am analyzing and optimizing the Restricted Mixing (RM) operator of DSMGA-II. RM provides the exploration power of DSMGA-II; together with the Back Mixing (BM) operator, it can solve nearly decomposable combinatorial optimization problems.