
Topic provider

Graduate Institute of Information Management, National Taipei University (NTPU)

The Graduate Institute of Information Management, National Taipei University (NTPU) is dedicated to cultivating professionals in the fields of information management and digital innovation. Its curriculum centers on information technology, equipping students with advanced skills in artificial intelligence, big data analytics, cloud computing, mixed reality, fintech, and blockchain, while also emphasizing key areas such as information security compliance and digital sustainability. Through interdisciplinary practical applications, the program extends into digital marketing, digital transformation, consumer behavior insights, and sports and leisure management, ensuring students can translate technical expertise into tangible industry value.

Academically, the Institute emphasizes a dual focus on research and practical application. Close industry-academia collaborations and cross-disciplinary integration provide students with substantial hands-on experience. The program fosters the ability to explore emerging information technologies, develop system applications, and plan digital marketing strategies, while also cultivating rigorous academic writing and independent research skills. Students joining the Institute will grow into adaptable professionals capable of making a significant impact in information management, smart services, and emerging technology applications.

Introduction

As the United Nations Sustainable Development Goals (SDGs) and global net-zero policies accelerate, ESG disclosure has become a defining measure of corporate credibility. Yet many sustainability reports still contain vague or unverified claims, highlighting an urgent need for intelligent verification tools—especially for Traditional Chinese ESG texts, where language resources remain limited.
The “ESG Sustainability Commitment Verification Competition 2026” invites you to tackle this real-world challenge. Using the VeriPromiseESG4K Annotated Corpus, built from authentic industry scenarios, participants will develop NLP models that identify sustainability commitments and assess their supporting evidence.

This is an opportunity to apply AI to a high-impact global issue, sharpen your skills in ESG analytics and model development, and contribute to greater transparency and trust in corporate sustainability disclosures. The competition provides 4,000 annotated Traditional Chinese entries drawn from real corporate sustainability reports, forming the VeriPromiseESG4K dataset. Participants will apply AI technologies such as natural language processing and large language models to develop systems capable of automatically identifying, analyzing, and verifying corporate sustainability commitments.
Participating teams will utilize the “VeriPromiseESG4K Annotated Corpus” to develop Natural Language Processing (NLP) models addressing the following four core tasks and generate prediction results for the test dataset:

  • Subtask 1: Commitment Classification
    Determine whether the given text expresses a concrete corporate commitment toward future actions.
  • Subtask 2: Timeline Classification
    Based on semantic inference, determine the expected completion timeframe of the identified commitment (calculated from the publication year 2024 of the ESG report).
  • Subtask 3: Evidence Identification
    Determine whether the identified commitment sentence is supported by action plans or existing records (e.g., data, methodologies, measures, or any concrete implementation details).
  • Subtask 4: Clarity Classification
    If evidence is present, evaluate whether the evidence statement is semantically explicit, easy to understand, and free from vague wording (e.g., “continuously advancing,” “striving to improve”).
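As a concrete illustration of how the four subtask labels attach to a single sentence, one corpus entry might look like the Python sketch below. The field names and label values here are illustrative assumptions, not the official VeriPromiseESG4K schema, which will be defined in the released dataset and sample code.

```python
# Hypothetical sketch of one annotated entry. All field names and label
# vocabularies are assumptions for illustration only.
entry = {
    "text": "本公司承諾於2030年前達成營運據點100%使用再生能源。",
    "commitment": "Yes",           # Subtask 1: concrete future commitment?
    "timeline": "within_5_years",  # Subtask 2: timeframe, counted from 2024
    "evidence": "Yes",             # Subtask 3: backed by plans/records?
    "clarity": "Clear",            # Subtask 4: evidence free of vague wording?
}

def labels(e):
    """Collect the four subtask labels in a fixed order, e.g. for a prediction row."""
    return [e["commitment"], e["timeline"], e["evidence"], e["clarity"]]
```

A model would predict these four fields for each test sentence, with Subtasks 2–4 conditioned on the earlier labels.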

To help participating teams launch their model development projects smoothly, the organizer will provide technical support and resources, including:

  • Regional Hands-on Workshops:
    Free in-person workshops will be conducted across three regions, accompanied by parallel online sessions. These sessions will provide detailed explanations of the competition information and demonstrate the use of sample code, guiding participants through fundamental hands-on implementation.
  • Sample Code and Discussion Forum:
    After teams complete official registration and gain access to the competition platform, the organizer will provide the Sample Code and grant access to the official public discussion forum, where participants may engage in discussions and raise questions.
  • Official Inquiry Form:
    If you encounter any questions or difficulties during the competition period and are unable to raise them through the official discussion forum, you may submit your inquiries or concerns via Google Form. Upon submission, the Organizer will provide an official response or appropriate assistance.

Eligibility

  1. Student Division:
    • Individuals who, from the date of registration until the submission deadline for the competition test set predictions, hold official student status in the Republic of China (Taiwan), are at least 18 years of age, and are currently enrolled in an educational institution at any level are eligible to register for this competition.
    • Minors under the age of 18 may participate with the consent of their legal guardian.
    • Relevant supporting documentation (such as a photocopy of both sides of a student ID card or an original certificate of enrollment) must be submitted at the time prizes are awarded.
  2. General Public Division:
    • If any team member does not meet the eligibility requirements of the Student Division (including students currently enrolled in institutions outside the Republic of China), the entire team will be classified under the General Public Division.

Registration

  1. Each team member must register through the AI CUP Registration System (first-time users may refer to the AI CUP Registration System guide for instructions). Participants must follow the on-screen registration procedures, complete all required information, and finalize team formation to complete competition enrollment. Team names will be automatically assigned by the system; participants are not permitted to choose their own team names.
  2. Within 1–2 business days after completing competition enrollment via the AI CUP Registration System, an account activation notice and password for the AIdea Platform will be sent to the participant’s registered email address. After completing the activation steps indicated in the email, all team members may access the AIdea Platform to participate in the competition.
  3. If participants are registering as part of a course requirement, they should obtain the course code from the course instructor or teaching assistant and enter it during registration. This will enable the instructor to access competition results for course evaluation purposes.
  4. Before registration, each participant should complete the "VeriPromiseESG 2026: Pre-test Survey"; after the competition, each participant should complete the "VeriPromiseESG 2026: Post-test Survey".

For inquiries, please contact: yyteng@mail.ntpu.edu.tw

Prize

(1) Student Division Ranking Award

The top 30 teams in this competition are expected to be winners. After review and approval by the organizing committee, awards will be presented according to the following rankings:

Award | Quota | Prize (Cash Prizes Only for Student Participants)
First Prize | 1 | NT$80,000 + A printed certificate from the Ministry of Education
Second Prize | 1 | NT$50,000 + A printed certificate from the Ministry of Education
Third Prize | 1 | NT$30,000 + A printed certificate from the Ministry of Education
Excellence Award | 2 | NT$10,000 + A printed certificate from the Ministry of Education
Honorable Mention Award | 10 | NT$7,000 + A printed certificate from the Ministry of Education

* Note 1: Teams that place in the top 30 or in the top 25% of the Private Leaderboard (regardless of division, including members of the public) will receive an additional electronic certificate from the Ministry of Education's Artificial Intelligence Competition Program Office, after submitting the required report and having it reviewed and approved by the organizing committee's judging panel.

* Note 2: The final number of awards may be adjusted based on the number of entries and their scores. If any entry fails to meet the required standard (scores below the baseline set by the judging panel), the number of awards may be reduced or increased depending on the number of entries.

(2) Other Award Claim Guidelines

All winning teams agree to assist the organizers with the following arrangements; otherwise, they will be disqualified from receiving the award.

  1. Ranking Method
    • All participating teams (including students and members of the public) will be ranked uniformly based on their final scores. Only student participants are eligible to receive cash prizes. Prize money for winning teams in the public category will be awarded to the next eligible student team in the rankings.
    • In the event of a replacement team, the replacement team will share the same ranking as the original team.
    • Final scores will be calculated according to the competition's scoring method, and will be determined based on the results of the Private Leaderboard and report review.
  2. Eligibility and Decision-Making Principles
    • Prize money will be disbursed via New Taiwan Dollar (NTD) remittance. Winning teams that meet the registration requirements and are eligible to receive the prize money must appoint a team member with a local NTD account to receive the prize money on their behalf. This team member will be the Republic of China tax filer and will be responsible for paying the relevant income tax. All relevant documents, including tax, personal data usage, and receipt documents, must be signed as required.
    • If any member of a team that wins the "Student Ranking Award" has already placed in the top three (first to third place, or gold/silver/bronze medals) three or more times in the "Ministry of Education National College Artificial Intelligence Competition (AI CUP)" organized by the "Ministry of Education Artificial Intelligence Competition and Annotation Data Collection Project Office," and that team wins again in this competition, the team will receive only a certificate from the Ministry of Education, and the prize money will be awarded to the next-ranked team; the two teams will share the same ranking. This rule is not retroactive; the count of wins starts from the Fall 2022 competition.

Activity time

The competition will begin on Thursday, March 5, 2026 (Taiwan Time, UTC+8) and will officially conclude with the announcement of results on Thursday, July 23, 2026. The detailed schedule is as follows:

Date | Item
2026/03/05 (Thu) - 2026/04/28 (Tue) | Registration Opens & Training Set Release
March 2026 (exact dates to be announced on the official competition website) | Regional Hands-on Workshops: 1. Northern Region: University of Taipei (UTaipei); 2. Central Region: Providence University (PU); 3. Southern Region: National Kaohsiung University of Science and Technology (NKUST)
2026/06/03 (Wed) - 2026/06/10 (Wed) | Validation Set Release
2026/06/10 (Wed) - 2026/06/17 (Wed) | Test Set Release & Prediction Submission
2026/06/23 (Tue) | Preliminary Results Announcement (Private Leaderboard)
2026/06/24 (Wed) - 2026/06/30 (Tue) | Submission of Additional Deliverables (Report and Code)
2026/07/23 (Thu) | Final Ranking Announcement
2027/03 | Award Ceremony (to be determined)

Evaluation Criteria

Participants in VeriPromiseESG 2026 (hereinafter referred to as “the competition”) are required to develop an AI model capable of completing four core tasks, based on three datasets provided by the organizer: the Training Data, Validation Data, and Test Data, along with annotated sample data. The stages of the Competition are as follows:

  • Stage 1: The organizer releases annotated sample data and opens registration.
  • Stage 2: Participating teams train their models using the Training Data and generate predictions on the Validation Data, uploading results to the online Public Leaderboard.
  • Stage 3: After the organizer releases the full Private Dataset, teams must upload their predictions for the Test Data to the platform before the deadline. Each team may submit up to three times per day.
  • Stage 4: At the close of the Competition, final rankings will be determined based on results from the Private Dataset and announced on the Private Leaderboard.

Final rankings will be calculated using a weighted composite score across the four tasks to evaluate overall system performance (see “Task Evaluation Criteria” and “Evaluation Formula” for details).

All teams must submit the required technical report and original source code within the specified timeline to verify the absence of manual adjustments, misconduct, or plagiarism. Teams that fail to submit the required materials on time will not be included in the final ranking.

Task Evaluation Criteria

The judging panel, appointed by the organizer and composed of industry professionals and academic experts, will calculate the final score using a weighted average of the four subtasks:

Evaluation Criteria | Evaluation Metric | Description | Weight
Commitment Classification | F1-Score | Balance between precision and recall in identifying ESG commitment statements | 20%
Timeline Classification | Macro-F1 Score | Four-class classification performance for predicting appropriate verification timing | 15%
Evidence Identification | F1-Score | Ability to determine whether commitments are sufficiently supported by evidence | 30%
Clarity Classification | Macro-F1 Score | Three-class classification performance for evaluating evidence quality | 35%
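Macro-F1, used for the multi-class Timeline and Clarity subtasks, is the unweighted mean of per-class F1 scores, so minority classes count as much as majority ones. A minimal pure-Python sketch is shown below; the official scorer may differ in details such as which class set it averages over.

```python
def macro_f1(y_true, y_pred):
    """Macro-F1: unweighted mean of per-class F1 scores.
    Classes are taken from the gold labels; absent predictions score 0 for a class."""
    classes = sorted(set(y_true))
    f1_scores = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1_scores.append(2 * precision * recall / (precision + recall)
                         if precision + recall else 0.0)
    return sum(f1_scores) / len(f1_scores)
```

This is equivalent to scikit-learn's `f1_score(y_true, y_pred, average="macro")` when the same class set is used.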

Evaluation Formula

$$\begin{aligned} \textit{Total Score} = {} & (\textit{Commitment Classification F1-Score} \times 0.20) \\ & + (\textit{Timeline Classification Macro-F1 Score} \times 0.15) \\ & + (\textit{Evidence Identification F1-Score} \times 0.30) \\ & + (\textit{Clarity Classification Macro-F1 Score} \times 0.35) \end{aligned}$$
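The weighted composite can be sketched in a few lines of Python. The dictionary keys are illustrative, and each subtask score is assumed to lie in [0, 1]:

```python
# Subtask weights as stated in the evaluation criteria.
WEIGHTS = {
    "commitment": 0.20,  # Commitment Classification (F1-Score)
    "timeline": 0.15,    # Timeline Classification (Macro-F1 Score)
    "evidence": 0.30,    # Evidence Identification (F1-Score)
    "clarity": 0.35,     # Clarity Classification (Macro-F1 Score)
}

def total_score(scores):
    """Weighted composite of the four subtask scores (each assumed in [0, 1])."""
    return sum(scores[task] * w for task, w in WEIGHTS.items())
```

Since the weights sum to 1, a perfect score on all four subtasks yields a total of 1.0, and the Clarity and Evidence subtasks together account for 65% of the total.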

Rules

All teams must carefully read the following provisions. In the event of disputes regarding rights or violations, the organizer reserves the right to revoke participation or award eligibility. Teams shall bear full responsibility for any consequences. If prizes have already been awarded, the organizer reserves the right to reclaim them.

  1. Submissions of predictions for the Public Dataset and Private Dataset are limited to a combined total of three per day. The system will display the highest-scoring submission on the Leaderboard.
  2. The organizer reserves the right to adjust the dataset during the competition if necessary.
  3. Without prior notice, the organizer may disqualify any team under the following circumstances:
    • Verified evidence of plagiarism, cheating, or fraud;
    • Infringement of intellectual property rights;
    • Attacks on the leaderboard system;
    • Actions affecting other teams and compromising fairness;
    • Violations of competition regulations, the terms of use of the "AI CUP Registration System" or the terms of use of the "AIdea Platform".
  4. All reports, code, and prediction results must be submitted before the deadline. Late submissions, revisions, or supplementary submissions will not be accepted. Only teams that complete the submission within the deadline will be ranked. Failed or incomplete submissions will be considered a withdrawal.
  5. Manual annotation or modification of the released test dataset is strictly prohibited, as are manual corrections to prediction results. All predictions must be generated automatically by programs to ensure fairness. The impact of self-constructed data or open-source resources on the model may, however, be analyzed and discussed in the written report submitted at the end of the competition.
  6. Teams may not privately share code or features. Post-competition reports must also be written independently; if different teams submit reports with similar formats, this will affect the judging results of all teams involved, and in serious cases the teams will be disqualified from winning awards. This does not apply to content discussed publicly in the official discussion forum and used as a reference by other teams.
  7. Teams must respect all judging decisions and final results.
  8. All intellectual property rights related to submitted materials remain with the teams; however, teams grant the organizer a royalty-free license for promotional, archival, and exhibition purposes.
  9. By completing registration, teams agree to comply with all competition rules and regulations.
  10. The organizer reserves the right to interpret, amend, suspend, or terminate the competition rules if necessary.