|
109 | 109 | </item> |
110 | 110 |
|
111 | 111 | <item> |
112 | | - <title>Open-topic PhD fellowships available as part of DARA</title> |
113 | | - <link>https://copenlu.github.io/news/open-topic-phd-fellowships-available-as-part-of-dara/</link> |
| 112 | + <title>PhD fellowships for start in Spring or Autumn 2026</title> |
| 113 | + <link>https://copenlu.github.io/news/phd-fellowships-for-start-in-spring-or-autumn-2026/</link> |
114 | 114 | <pubDate>Thu, 26 Jun 2025 00:00:00 +0000</pubDate> |
115 | 115 |
|
116 | | - <guid>https://copenlu.github.io/news/open-topic-phd-fellowships-available-as-part-of-dara/</guid> |
117 | | - <description><p>The Danish Advanced Research Academy (DARA) is <a href="https://daracademy.dk/fellowship/fellowships-summer-2025">calling for PhD fellowship applications on topics including AI</a>. |
118 | | -As the fellowship application process requires a letter of support from the prospective main supervisor, we are collecting and screening expressions of interest of candidates who would like to apply with Isabelle Augenstein as a main supervisor <a href="https://forms.office.com/e/HZSmgR9nXB">here</a>. Selected interested candidates will be invited to submit full applications to the DARA fellowship programme with my support. |
119 | | -Note that the PhD programme requires applicants to hold a Master&rsquo;s degree or be in the process of completing a Master&rsquo;s programme.</p> |
| 116 | + <guid>https://copenlu.github.io/news/phd-fellowships-for-start-in-spring-or-autumn-2026/</guid> |
| 117 | + <description> |
120 | 118 |
|
121 | | -<p>The timeline for recruitment is as follows:</p> |
| 119 | +<p>Would you like to join our lab as a PhD student in 2026? We have several openings. Read more about reasons to join CopeNLU <a href="https://copenlu.github.io/post/why-ucph/">here</a>.</p> |
122 | 120 |
|
123 | | -<ul> |
124 | | -<li>26 June - 20 July 2025: submission of expressions of interest via <a href="https://forms.office.com/e/HZSmgR9nXB">this form</a></li> |
125 | | -<li>21 July - 9 August 2025: shortlisting and screening interviews</li> |
126 | | -<li>10 August 2025: a small number of selected candidates are informed</li> |
127 | | -<li>10 - 28 August 2025: each candidate prepares DARA fellowship materials with my support</li> |
128 | | -<li>29 August 14:00 CEST: deadline for submission of DARA fellowship materials</li> |
129 | | -<li>December 2025: DARA fellowship notification</li> |
130 | | -<li>1 February - 15 June 2026: start of PhD</li> |
131 | | -</ul> |
| 121 | +<h2 id="start-in-spring-2026">Start in Spring 2026</h2> |
| 122 | + |
| 123 | +<p>A fully funded 3-year PhD fellowship on <strong>explainable natural language understanding</strong> for a start in <strong>Spring 2026</strong> is available as part of the <a href="https://erc.europa.eu/news/erc-2021-starting-grants-results">ExplainYourself project</a> on Explainable and Robust Automatic Fact Checking. The position requires a Master&rsquo;s degree. The successful candidate will be supervised by <a href="http://isabelleaugenstein.github.io/">Isabelle Augenstein</a> and co-supervised by <a href="https://apepa.github.io/">Pepa Atanasova</a>.
| 124 | +Read more about the position and apply <a href="https://candidate.hr-manager.net/ApplicationInit.aspx/?cid=1307&departmentId=18970&ProjectId=164789&MediaId=5&SkipAdvertisement=false">here</a> by <strong>31 October 2025</strong>.</p> |
| 125 | + |
| 126 | +<p>The project is funded by an ERC Starting Grant, a highly competitive funding programme of the <a href="https://erc.europa.eu/homepage">European Research Council</a>, which supports the most talented early-career scientists in Europe with five years of funding for blue-skies research to build up or expand their research groups.</p> |
| 127 | + |
| 128 | +<p>ExplainYourself proposes to study explainable automatic fact checking, the task of automatically predicting the veracity of textual claims using machine learning (ML) methods, while also producing explanations about how the model arrived at the prediction. Automatic fact checking methods often use opaque deep neural network models, whose inner workings cannot easily be explained. Especially for complex tasks such as automatic fact checking, this hinders greater adoption, as it is unclear to users when the models&rsquo; predictions can be trusted. Existing explainable ML methods partly overcome this by reducing the task of explanation generation to highlighting the right rationale. While a good first step, this does not fully explain how an ML model arrived at a prediction. For knowledge-intensive natural language understanding (NLU) tasks such as fact checking, an ML model needs to learn complex relationships between the claim, multiple evidence documents, and common sense knowledge in addition to retrieving the right evidence. There is currently no explainability method that aims to illuminate this highly complex process. In addition, existing approaches are unable to produce diverse explanations, geared towards users with different information needs. ExplainYourself radically departs from existing work in proposing methods for explainable fact checking that more accurately reflect how fact checking models make decisions, and are useful to diverse groups of end users. It is expected that these innovations will apply to explanation generation for other knowledge-intensive NLU tasks, such as question answering or entity linking.</p> |
| 129 | + |
| 130 | +<p>In addition to the principal investigator, PhD students and postdocs, the project team includes collaborators from CopeNLU as well as external collaborators. Two PhD students and a postdoc have already been recruited as a result of earlier calls, and the project officially <a href="https://copenlu.github.io/talk/2023_09_explainyourself/">kicked off in September 2023</a>.</p> |
| 131 | + |
| 132 | +<h2 id="start-in-autumn-2026">Start in Autumn 2026</h2> |
| 133 | + |
| 134 | +<p>For a start in <strong>Autumn 2026</strong>, we are considering candidates on any topic aligned with the <a href="https://www.copenlu.com/#projects">focus areas of our lab</a>. Candidates should express their interest by applying to the <a href="https://ellis.eu/news/ellis-phd-program-call-for-applications-2025">ELLIS PhD programme</a> by <strong>31 October 2025</strong>, naming Isabelle Augenstein as a supervisor. ELLIS is a pan-European recruitment vehicle for PhD students and does not provide funded PhD fellowships, though it offers networking opportunities and the chance to obtain travel funding.</p> |
132 | 135 |
|
133 | | -<p>Read more about reasons to join CopeNLU <a href="https://copenlu.github.io/post/why-ucph/">here</a>. The official call for the PhD fellowships is up <a href="https://daracademy.dk/fellowship/fellowships-summer-2025">here</a>.</p> |
| 136 | +<p>Successful candidates will be supported in applying for funded fellowship opportunities, including those offered by the <a href="https://ddsa.dk/about-ddsa-fellowship-programme/">Danish Data Science Academy (DDSA)</a> and the <a href="https://www.daracademy.dk/fellowships">Danish Advanced Research Academy (DARA)</a>. Additional fully funded PhD positions may become available through the <a href="https://www.aicentre.dk/jobs">Pioneer Centre for AI</a>. Candidates without a Master&rsquo;s degree may be eligible for these positions.</p> |
134 | 137 | </description> |
135 | 138 | </item> |
136 | 139 |
|
|