XML Content Filter
By Young-Chen Chang, November 15, 2018
This is a project I built during my military service, intended to help colleagues at the bureau automate part of their XML file processing. At the time, the archives office had received XML-format electronic records transferred from another bureau, which had to be imported into the existing official-document system. For various reasons, a large number of records that should not have been there were mixed in and had to be removed manually. Because the document system is large and old, searching and deleting through it was very slow, which made the work extremely difficult. I therefore suggested editing the XML files directly, and built this web interface to speed up the task; the finished product was completed one month before my discharge.

The site is now running on the bureau's intranet. I have also been using the free time since my discharge to maintain it and improve its performance. The biggest problem the site faces is the sheer volume of data: each XML file can contain close to 3,000 records. On the front end, I experimented with different data structures combined with lazy loading to mitigate the slow updates caused by oversized state; on the back end, I used file compression and XML preprocessing to speed up transfers. The project still has plenty of room for improvement!
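The front-end idea described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the `FileRecord` shape and the chunk size are assumptions, since the real XML schema is not shown in the post. The point is that instead of holding all ~3,000 records in one flat state array (where any update forces the whole list to be reprocessed and re-rendered), records are grouped into fixed-size pages that the view layer requests lazily.

```typescript
// Hypothetical record shape; the real XML schema is not shown in the post.
interface FileRecord {
  id: string;
  title: string;
}

// Chunked store: records are grouped into fixed-size pages so the UI
// only ever touches the page it is currently displaying, rather than
// one giant state array of ~3,000 entries.
class ChunkedStore {
  private chunks: FileRecord[][] = [];

  constructor(records: FileRecord[], chunkSize = 100) {
    for (let i = 0; i < records.length; i += chunkSize) {
      this.chunks.push(records.slice(i, i + chunkSize));
    }
  }

  chunkCount(): number {
    return this.chunks.length;
  }

  // Only the requested page is handed to the view layer (e.g. as the
  // user scrolls past a sentinel element).
  getChunk(index: number): FileRecord[] {
    return this.chunks[index] ?? [];
  }
}

// Simulate a file with 3,000 records, matching the scale in the post.
const records: FileRecord[] = Array.from({ length: 3000 }, (_, i) => ({
  id: String(i),
  title: `file-${i}`,
}));
const store = new ChunkedStore(records);
console.log(store.chunkCount());        // 30
console.log(store.getChunk(0).length);  // 100
```

In a real UI this would typically be paired with an intersection observer or a virtualized list component, so off-screen pages are never mounted at all.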
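The back-end side (compression plus preprocessing) can be sketched in the same spirit. Again this is an assumption-laden illustration, not the project's code: the record fields are invented, and the "preprocessing" step is reduced to stripping the payload down to only the fields the UI needs before gzipping it with Node's built-in `zlib`.

```typescript
import { gzipSync, gunzipSync } from "zlib";

// Preprocessing stand-in: keep only the fields the front end needs,
// then serialize. (The real project parses XML; the fields here are
// hypothetical.)
const payload = JSON.stringify(
  Array.from({ length: 3000 }, (_, i) => ({ id: i, title: `file-${i}` }))
);

// Compress before sending over the intranet; repetitive record data
// compresses very well.
const compressed = gzipSync(payload);

// The client (or a test) can verify the round trip is lossless.
const restored = gunzipSync(compressed).toString("utf8");

console.log(restored === payload);               // true
console.log(compressed.length < payload.length); // true
```

In practice an HTTP server would set `Content-Encoding: gzip` and let the browser decompress transparently, but the size win is the same either way.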