<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Edan Weis]]></title><description><![CDATA[Passion, integrity, learning]]></description><link>https://edanweis.co/</link><image><url>https://edanweis.co/favicon.png</url><title>Edan Weis</title><link>https://edanweis.co/</link></image><generator>Ghost 5.80</generator><lastBuildDate>Fri, 01 May 2026 04:13:58 GMT</lastBuildDate><atom:link href="https://edanweis.co/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[SuperSeeded]]></title><description><![CDATA[<p>SuperSeeded optimises the commercial landscape market to reduce supply risk for growers, save time and earn money for the nature professions.</p><p>Get in touch </p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/-FKa-4NXfic?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen title="SuperSeeded.ai"></iframe></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--5-.png" class="kg-image" alt loading="lazy" width="960" height="540" srcset="https://edanweis.co/content/images/size/w600/2025/05/SuperSeeded-Pitch-Deck--5-.png 600w, https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--5-.png 960w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--2-.png" class="kg-image" alt loading="lazy" width="960" height="540" srcset="https://edanweis.co/content/images/size/w600/2025/05/SuperSeeded-Pitch-Deck--2-.png 600w, 
https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--2-.png 960w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--3-.png" class="kg-image" alt loading="lazy" width="960" height="540" srcset="https://edanweis.co/content/images/size/w600/2025/05/SuperSeeded-Pitch-Deck--3-.png 600w, https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--3-.png 960w" sizes="(min-width: 720px) 720px"></figure>]]></description><link>https://edanweis.co/superseeded/</link><guid isPermaLink="false">68298c60ffe77217a4532b0b</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sun, 18 May 2025 07:36:49 GMT</pubDate><media:content url="https://edanweis.co/content/images/2025/05/013.png" medium="image"/><content:encoded><![CDATA[<img src="https://edanweis.co/content/images/2025/05/013.png" alt="SuperSeeded"><p>SuperSeeded optimises the commercial landscape market to reduce supply risk for growers, save time and earn money for the nature professions.</p><p>Get in touch </p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/-FKa-4NXfic?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen title="SuperSeeded.ai"></iframe></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--5-.png" class="kg-image" alt="SuperSeeded" loading="lazy" width="960" height="540" srcset="https://edanweis.co/content/images/size/w600/2025/05/SuperSeeded-Pitch-Deck--5-.png 600w, https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--5-.png 960w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img 
src="https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--2-.png" class="kg-image" alt="SuperSeeded" loading="lazy" width="960" height="540" srcset="https://edanweis.co/content/images/size/w600/2025/05/SuperSeeded-Pitch-Deck--2-.png 600w, https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--2-.png 960w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--3-.png" class="kg-image" alt="SuperSeeded" loading="lazy" width="960" height="540" srcset="https://edanweis.co/content/images/size/w600/2025/05/SuperSeeded-Pitch-Deck--3-.png 600w, https://edanweis.co/content/images/2025/05/SuperSeeded-Pitch-Deck--3-.png 960w" sizes="(min-width: 720px) 720px"></figure>]]></content:encoded></item><item><title><![CDATA[Counting with your eyes]]></title><description><![CDATA[<p>I invented a new way of counting visually patterned items with your eyes</p><p>I invented a new method of counting things with your eyes, &quot;bisection counting&quot; of periodic structures. 
The idea is to use your eye to spot the midpoint of a linear/radial group of things (eg:</p>]]></description><link>https://edanweis.co/counting-with-your-eyes/</link><guid isPermaLink="false">68297f5affe77217a4532ac3</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sun, 18 May 2025 07:22:39 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1563993356056-b23a9cd265ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wxMTc3M3wwfDF8c2VhcmNofDR8fHJvb2YlMjB0aWxlc3xlbnwwfHx8fDE3NDc1NTMyMjZ8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1563993356056-b23a9cd265ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wxMTc3M3wwfDF8c2VhcmNofDR8fHJvb2YlMjB0aWxlc3xlbnwwfHx8fDE3NDc1NTMyMjZ8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=2000" alt="Counting with your eyes"><p>I invented a new way of counting visually patterned items with your eyes</p><p>I invented a new method of counting things with your eyes, &quot;bisection counting&quot; of periodic structures. The idea is to use your eye to spot the midpoint of a linear/radial group of things (eg: fence palings, wheel spokes, or tiles), then keep halving the group again and again until you&apos;re down to just one (or nearest to one) item. 
Raise 2 to the power of the number of halvings you counted and you get a pretty fast and accurate estimate.</p><p></p><p><strong>Calculate the Total</strong>:</p><p>Use the formula:</p><p>\[\text{Estimated Total} = 2^{\text{Number of Halvings}}\]</p><p>So if you halved 4 times:</p><p>\[\text{Estimated&#xA0;Total}=2^4=16\]</p><h3 id="error-bound-estimation">Error bound estimation</h3><ul><li>You visually estimate \(n\) halvings</li><li>Your total estimate is \(2^n\)</li><li>The actual count is \(N\)</li></ul><p>The maximum relative error occurs between powers of two:</p><p>If \(N\) is between \(2^n\) and \(2^{n+1}\), and you guessed \(n\), then:</p><p>\[ \text{Relative error} = \frac{|2^n - N|}{N}\]</p><p>This means the <strong>maximum error</strong> is just under 50% (specifically, ~41% at the geometric midpoint between powers of two).</p><p>In practice, the actual error would be much smaller, because:</p><ul><li>Users rarely misjudge an entire halving level.</li><li>The uncertainty is <strong>usually confined to the last halving</strong>, which only introduces small errors, not a full &#xB1;1 in \(n\).</li></ul><p>So typical errors would be lower, around 5&#x2013;15%.</p>]]></content:encoded></item><item><title><![CDATA[Reverse engineering a soil calculator using symbolic regression]]></title><description><![CDATA[<figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://www.elkeh.com.au/soils/?ref=edanweis.co"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Soils - Elke Haege</div><div class="kg-bookmark-description">Elke, together with soil scientist Simon Leake, SESL Australia (www.sesl.com.au) are about to launch an exciting publication through CSIRO titled: &#x201C;Soils for Landscape development: Selection, Specification and Validation&#x201D; .</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://www.elkeh.com.au/wp-content/uploads/2016/01/cropped-elke-logov2-270x270.png" alt><span 
class="kg-bookmark-author">Elke Haege Landscape Architecture + Consulting Arborist</span></div></div><div class="kg-bookmark-thumbnail"><img src="http://www.elkeh.com.au/wp-content/uploads/2013/04/soils-for-landscape-development-cover-image-elke-haege-simon-leake.jpg" alt></div></a><figcaption><p><span style="white-space: pre-wrap;">The Elke Haege soil simulator</span></p></figcaption></figure><blockquote>This post</blockquote>]]></description><link>https://edanweis.co/reverse-engineering-a-calculator-with-symbolic-regression/</link><guid isPermaLink="false">6625068e9ec77f6d2fc8c8b8</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sun, 18 May 2025 06:28:45 GMT</pubDate><media:content url="https://edanweis.co/content/images/2025/05/3m2mwrpjq5rmc0cpw8bbk4qsqg.jpg" medium="image"/><content:encoded><![CDATA[<figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://www.elkeh.com.au/soils/?ref=edanweis.co"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Soils - Elke Haege</div><div class="kg-bookmark-description">Elke, together with soil scientist Simon Leake, SESL Australia (www.sesl.com.au) are about to launch an exciting publication through CSIRO titled: &#x201C;Soils for Landscape development: Selection, Specification and Validation&#x201D; .</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://www.elkeh.com.au/wp-content/uploads/2016/01/cropped-elke-logov2-270x270.png" alt="Reverse engineering a soil calculator using symbolic regression"><span class="kg-bookmark-author">Elke Haege Landscape Architecture + Consulting Arborist</span></div></div><div class="kg-bookmark-thumbnail"><img src="http://www.elkeh.com.au/wp-content/uploads/2013/04/soils-for-landscape-development-cover-image-elke-haege-simon-leake.jpg" alt="Reverse engineering a soil calculator using symbolic regression"></div></a><figcaption><img 
src="https://edanweis.co/content/images/2025/05/3m2mwrpjq5rmc0cpw8bbk4qsqg.jpg" alt="Reverse engineering a soil calculator using symbolic regression"><p><span style="white-space: pre-wrap;">The Elke Haege soil simulator</span></p></figcaption></figure><blockquote>This post is based on the work of Elke Haege Thorvaldson and Simon Leake (https://www.google.com/search?q=elkeh.com.au and SESL Australia). Elke Haege Thorvaldson is the owner of the calculator, and is an award-winning landscape architect and consulting arborist. The soil calculator, also known as the Soil Volume Simulator (SVS), was a collaborative effort between her and Simon Leake, a soil scientist from SESL Australia</blockquote><h3 id="improving-the-calculator-with-continuous-variables"><strong>Improving the calculator with continuous variables</strong></h3><p>This fantastic simulator generates soil volume requirements for trees based on their &quot;designed size&quot;. The simulator includes surrounding soil, shared root zones, climatic conditions, maintenance, etc.</p><p>Strangely, discrete categories are used for things that have real measurements:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/04/image-2.png" class="kg-image" alt="Reverse engineering a soil calculator using symbolic regression" loading="lazy" width="904" height="336" srcset="https://edanweis.co/content/images/size/w600/2024/04/image-2.png 600w, https://edanweis.co/content/images/2024/04/image-2.png 904w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Real units are input as discrete categories, but why?</span></figcaption></figure><p>A number of interesting things can be done to improve it: </p><ol><li>Increase precision by using continuous inputs instead of discrete ones</li><li>Find alternative ways to achieve required soil volumes by allowing some variables to be &quot;driven&quot; by others that are fixed and 
unchangeable.</li><li>Discover trade-offs among factors that can be combined to reduce time/cost to achieve required soil volume</li><li>Discover the key factors contributing to the volumes; what mattered the most?</li><li>Use symbolic regression to derive a simpler equation that could be easier to remember and apply from memory. </li><li>Develop a new interface that is more fluid and fun to use.</li></ol><p></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/04/image.png" class="kg-image" alt="Reverse engineering a soil calculator using symbolic regression" loading="lazy" width="1056" height="1596" srcset="https://edanweis.co/content/images/size/w600/2024/04/image.png 600w, https://edanweis.co/content/images/size/w1000/2024/04/image.png 1000w, https://edanweis.co/content/images/2024/04/image.png 1056w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">A snippet of the Elke soil simulator</span></figcaption></figure><h3 id="how-i-reverse-engineered-the-simulator">How I reverse engineered the simulator</h3><p>The first task was to get tabular data from the calculator by inputting all possible combinations. 
</p><p>What didn&apos;t work:</p><ul><li>Browser automation to click all combinations of inputs using Puppeteer to compile a CSV</li><li>Duplicating the WordPress plugin and environment locally to examine its construction.</li><li>Asking GPT-4 to produce an equation using all the factors</li></ul><p>Apart from basic operators + - &#xD7; that calculate answers to values input through drop-downs, there are sets of numbers conditional upon the initial <em>Tree Design Size and Height</em> input.</p><p>Since there are between 3 and 5 choices across 8 inputs multiplied by the four <em>Tree Design Size and Height</em> options available, the Cartesian product yields 23,040 combinations of inputs to get all possible soil volumes.</p><p>I felt that underneath this calculator, there were more interesting things to discover:</p><ul><li>Are there trade-offs between some factors that could shortlist an optimal set of candidate combinations?</li><li>What were the main factors contributing to the volumes? What mattered the least?</li><li>How could some factors be locked while others are left to drive the result?</li><li>Could I train a machine learning model to predict soil volumes allowing any range of continuous variables unconstrained by the current options?</li></ul><p>Examining the source code revealed 34 &quot;elements&quot; in an obscure data structure used by the WordPress plugin. 
</p><p>Using LLMs to reverse engineer formulas is a novel idea that didn&apos;t always work, but sometimes it did:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/04/image-1.png" class="kg-image" alt="Reverse engineering a soil calculator using symbolic regression" loading="lazy" width="1138" height="882" srcset="https://edanweis.co/content/images/size/w600/2024/04/image-1.png 600w, https://edanweis.co/content/images/size/w1000/2024/04/image-1.png 1000w, https://edanweis.co/content/images/2024/04/image-1.png 1138w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Using LLMs to reverse engineer formulas</span></figcaption></figure><p>With some prompting and manual effort, I was able to intuit my way to several correct equations. </p><p>However, almost all inputs were mapped to discrete categorical variables in a way that further obscured the underlying equations.</p><p>The values for <em>Tree Design Size and Height</em> are scaled away from empirical values (eg: small to 4m = 6.0; small/medium 4-9m high = 10.0).</p><p>In my symbolic regression analysis of the tree soil calculator, I discovered that while the original interface used discrete categories, the underlying math revealed continuous relationships between variables. Using PySR (Symbolic Regression), I identified that tree height is the dominant predictor of soil volume requirements, with a near-linear relationship (\(y_1 = 1.6387 \times \text{tree\_height}\)) for total soil volume. For tree pit volume specifically, the algorithm uncovered a more complex formula incorporating soil suitability, maintenance factors, and planting replacement time, but still primarily driven by tree height. 
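As a concrete illustration, the recovered near-linear rule can be written as a one-line function. A minimal sketch, assuming the coefficient PySR reported above; the function name is my own, and this is illustrative only, not a validated replacement for the calculator:

```python
def estimated_total_soil_volume(tree_height_m: float) -> float:
    """Near-linear relationship recovered by symbolic regression:
    y1 = 1.6387 * tree_height (units as in the original calculator)."""
    return 1.6387 * tree_height_m

# A 9 m design-height tree, per the recovered rule
print(round(estimated_total_soil_volume(9.0), 2))
```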
The percentage of soil volume outside the tree pit inversely affects the tree pit volume requirement, acting as a key modifying factor.</p><p>These equations demonstrate why the calculator could be improved by accepting continuous measurements rather than discrete categories. By extracting these mathematical relationships, I&apos;ve created a more flexible model that allows for precise calculations with any input values, eliminating the constraints of the original categorical approach while preserving the essential relationships between variables.</p><h3 id="symbolic-regression-with-pysr">Symbolic Regression with PySR</h3><p></p><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2024/04/ezgif.com-optimize--1-.gif" class="kg-image" alt="Reverse engineering a soil calculator using symbolic regression" loading="lazy" width="1920" height="1440" srcset="https://edanweis.co/content/images/size/w600/2024/04/ezgif.com-optimize--1-.gif 600w, https://edanweis.co/content/images/size/w1000/2024/04/ezgif.com-optimize--1-.gif 1000w, https://edanweis.co/content/images/size/w1600/2024/04/ezgif.com-optimize--1-.gif 1600w, https://edanweis.co/content/images/2024/04/ezgif.com-optimize--1-.gif 1920w" sizes="(min-width: 720px) 720px"></figure>]]></content:encoded></item><item><title><![CDATA[Responsible AI]]></title><description><![CDATA[How can we improve the usability of Responsible AI patterns with causal graph modeling? 
Based on work by CSIRO Data61.]]></description><link>https://edanweis.co/responsible-ai-which-levers-do-we-pull/</link><guid isPermaLink="false">65e2cba76be2e3052268de77</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Mon, 04 Mar 2024 11:25:50 GMT</pubDate><media:content url="https://edanweis.co/content/images/2024/03/image.jpeg" medium="image"/><content:encoded><![CDATA[<h2 id="which-levers-do-we-pull">Which levers do we pull?</h2><h3 id></h3><img src="https://edanweis.co/content/images/2024/03/image.jpeg" alt="Responsible AI"><p>In 2023, CSIRO published <em>Responsible AI: Best Practices for Creating Trustworthy AI Systems. </em></p><p>The book is a compilation of guidelines to assist in implementing governance, process and product considerations into AI systems:</p><p></p>
<!--kg-card-begin: html-->
<img src="https://edanweis.co/content/images/2024/03/image-26.png" width="920px" style="margin-top:0; mix-blend-mode: multiply; margin-bottom: -40px;" alt="Responsible AI">
<!--kg-card-end: html-->
<blockquote>&quot;Responsible AI is the practice of developing and using AI systems in a way that provides benefits to individuals, groups, and wider society, while minimizing the risk of negative consequences.&quot; (Xu, Whittle, Zhu, &amp; Lu, 2023, p. 18).</blockquote><p>How can we make use of this fantastic resource to support development teams, consultants and governing bodies? I&apos;m going to attempt to improve on efforts to operationalise research by the team at CSIRO Data61.[[2]]</p><h3 id="improving-the-usability-of-rai-patterns">Improving the usability of RAI patterns</h3><p>We start with the notion that product, process and governance practices are causally interrelated and complex.</p><p>This departs from most conceptualisations of RAI patterns as containment hierarchies or trees:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/03/image-2.png" class="kg-image" alt="Responsible AI" loading="lazy" width="1524" height="696" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-2.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-2.png 1000w, https://edanweis.co/content/images/2024/03/image-2.png 1524w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Authors conceptualise RAI primarily in terms of containment and hierarchy</span></figcaption></figure><p>The hierarchies above cannot adequately express complexity&#x2014;instead we can model RAI practices as a directed graph of causal connections:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/03/image-3.png" class="kg-image" alt="Responsible AI" loading="lazy" width="490" height="314"><figcaption><span style="white-space: pre-wrap;">A more helpful structure to understand complexity: Directed Graphs</span></figcaption></figure><p>Why use a graph instead of organising content by theme?</p><ol><li>Actions toward RAI can be 
interpreted based on their cause and effect</li><li>RAI patterns can be re-organised to support actions with the maximum impact.</li></ol><p>By finding reinforcing feedback loops in our graph, we can prioritise actions that accelerate accountability, transparency, reliability and safety&#x2014;or alternatively, balancing loops can limit or goal-seek to mitigate risk or harm, etc.</p><h3 id="inferring-the-causal-structure-of-rai">Inferring the causal structure of RAI</h3><p>Thankfully the written compilation of patterns is already well structured[[3]], so with some data wrangling, it can be converted into a dataframe:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/03/image-4.png" class="kg-image" alt="Responsible AI" loading="lazy" width="2000" height="452" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-4.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-4.png 1000w, https://edanweis.co/content/images/size/w1600/2024/03/image-4.png 1600w, https://edanweis.co/content/images/2024/03/image-4.png 2000w" sizes="(min-width: 720px) 720px"><figcaption><b><strong style="white-space: pre-wrap;">Source</strong></b><span style="white-space: pre-wrap;">: Responsible AI: Best Practices for Creating Trustworthy AI Systems</span></figcaption></figure><p>How can we convert this into a graph?</p><p>We can use a Large Language Model (LLM) and some procedural prompt engineering to follow constraints and conditions. 
Some common techniques were used to create an effective prompt:</p><ul><li>Chain-of-thought (CoT) [[4]]</li><li>EmotionPrompt [[5]]</li><li>Expert prompting [[6]]</li><li>Generated knowledge [[7]]</li><li>Least to most [[8]]</li></ul><p>Below is a <a href="https://gist.github.com/edanweis/571acfea31599827a02fb4e31bd7bf52?ref=edanweis.co" rel="noreferrer">snippet</a> of the main function only &#x2014; making use of the fantastic Python package <a href="https://lmql.ai/?ref=edanweis.co">LMQL</a> and <code>gpt-3.5-turbo-instruct</code> for the LLM model.</p>
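To give a feel for the post-processing this involves, here is a minimal Python sketch (not the LMQL snippet from the gist; the <code>cause -&gt; effect (+/-)</code> output format and the <code>parse_causal_edges</code> helper are my own assumptions) that turns a constrained LLM answer into a list of polarity-labelled causal edges:

```python
import re

# Hypothetical output format: one "cause -> effect (+)" or "(-)" line per link.
# In the real pipeline LMQL constrains the model's output; this parser just
# shows the kind of edge structure extracted from each RAI pattern.
EDGE_RE = re.compile(r"^\s*(.+?)\s*->\s*(.+?)\s*\(([+-])\)\s*$")

def parse_causal_edges(llm_answer: str) -> list[tuple[str, str, str]]:
    """Parse 'cause -> effect (polarity)' lines into (cause, effect, polarity)."""
    edges = []
    for line in llm_answer.splitlines():
        match = EDGE_RE.match(line)
        if match:
            edges.append(match.groups())
    return edges

answer = """\
regulatory sandbox -> regulator understanding (+)
regulator understanding -> compliance burden (-)
"""
print(parse_causal_edges(answer))
```

Edges in this form drop straight into a directed graph, with the polarity kept as an edge attribute for the loop analysis later on.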
<!--kg-card-begin: html-->
<img src="https://edanweis.co/content/images/2024/03/New-Project.webp" width="920px" style="margin-top:-20px; mix-blend-mode: multiply; margin-bottom: -50px;" alt="Responsible AI">
<!--kg-card-end: html-->
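Once causal links are extracted, each loop in the generated graphs can be classified as reinforcing or balancing by a simple parity rule: a loop with an odd number of negative links is balancing, otherwise reinforcing. A minimal sketch (the function name is mine):

```python
def classify_loop(polarities: list[str]) -> str:
    """A feedback loop is balancing ('B') if it contains an odd number of
    negative ('-') links, reinforcing ('R') otherwise -- an odd number of
    U-turns leaves you headed in the opposite direction."""
    negatives = sum(1 for p in polarities if p == "-")
    return "B" if negatives % 2 == 1 else "R"

print(classify_loop(["+", "+", "-"]))  # one negative link: balancing
print(classify_loop(["+", "-", "-"]))  # two negatives cancel: reinforcing
```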
<p>Unsurprisingly the results are typical of all LLMs to date; impressive and imperfect. But this provides a useful starting point for the work of skilled modelers. </p><p>The graph below is generated from the chapter <strong>G.4. Regulatory Sandbox</strong> (Xu, Whittle, Zhu, &amp; Lu, 2023, p. 65). In the graph, both balancing and reinforcing loops are present and although most loops seem plausible, the intermediate causal factors between nodes can be absent.</p><h6 id="regulatory-sandbox">Regulatory Sandbox</h6><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/03/image-10.png" class="kg-image" alt="Responsible AI" loading="lazy" width="1948" height="1790" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-10.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-10.png 1000w, https://edanweis.co/content/images/size/w1600/2024/03/image-10.png 1600w, https://edanweis.co/content/images/2024/03/image-10.png 1948w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">A causal loop diagram derived from the &quot;Regulatory Sandbox&quot; pattern, (Xu, Whittle, Zhu, &amp; Lu, 2023, p. 65).</span></figcaption></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2024/03/image-14.png" class="kg-image" alt="Responsible AI" loading="lazy" width="1375" height="400" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-14.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-14.png 1000w, https://edanweis.co/content/images/2024/03/image-14.png 1375w" sizes="(min-width: 720px) 720px"></figure><h3 id="looking-for-leverage">Looking for leverage</h3><p>Let&apos;s discover the feedback loops that may reveal actions or events that are mutually reinforcing, or balancing with &#x201C;goal-seeking&#x201D; behaviour. 
</p><p>Remember, these causal factors (dots) are present <strong><em>across different patterns</em></strong>, and so merging them uncovers even more feedback loops!</p><p>First, a lovely hairball appears when we aggregate all RAI factors:<br></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/03/image-20.png" class="kg-image" alt="Responsible AI" loading="lazy" width="1962" height="1956" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-20.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-20.png 1000w, https://edanweis.co/content/images/size/w1600/2024/03/image-20.png 1600w, https://edanweis.co/content/images/2024/03/image-20.png 1962w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Hairballs are common with large-scale graphs, and need to be simplified, clustered and filtered to be useful</span></figcaption></figure><p>To reduce the number of nodes, we can merge them based on duplicates or similar meaning. In the scatter plot below, I used word vector embeddings to show causal factors across RAI patterns (dots) clustered by similar meaning/context. I used <a href="https://github.com/YingfanWang/PaCMAP?ref=edanweis.co" rel="noreferrer">PaCMAP</a> for dimensionality reduction, with cosine similarity on <code>sentence-transformers/all-mpnet-base-v2</code> embeddings.</p><p>Hover to see the corresponding text:</p>
<!--kg-card-begin: html-->
<iframe src="https://edanweis.co/content/files/2024/03/merged_clusters_3.html" style="margin-left: -20px; height: 500px; width: 100%; overflow: hidden; border: none;" scrolling="no" allowfullscreen="false" frameborder="0"></iframe>

<!--kg-card-end: html-->
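The merge step itself is simple once embeddings are in hand. A minimal sketch, assuming toy 3-d vectors in place of the real <code>all-mpnet-base-v2</code> embeddings (the helper names and threshold are my own): factors whose embeddings exceed a cosine-similarity threshold collapse to one canonical node.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def merge_similar(embeddings: dict, threshold: float = 0.95) -> dict:
    """Map each factor name to a canonical name: the first earlier factor
    whose embedding is within `threshold` cosine similarity of it."""
    canonical = {}
    names = list(embeddings)
    for i, name in enumerate(names):
        canonical[name] = name
        for prior in names[:i]:
            if cosine(embeddings[name], embeddings[prior]) >= threshold:
                canonical[name] = canonical[prior]
                break
    return canonical

# Toy embeddings standing in for sentence-transformer vectors
emb = {
    "public trust": [1.0, 0.1, 0.0],
    "trust from the public": [0.99, 0.12, 0.01],
    "system reliability": [0.0, 1.0, 0.2],
}
print(merge_similar(emb))
```

Relabelling graph nodes through this mapping is what turns near-duplicate factors from different patterns into shared nodes, which is where the extra feedback loops come from.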
<p>The next step is to find the feedback loops. We can use a graph algorithm to find them for us using <a href="https://networkx.org/?ref=edanweis.co" rel="noreferrer">networkx</a>: the <code>simple_cycles</code> function:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/03/image-27.png" class="kg-image" alt="Responsible AI" loading="lazy" width="1290" height="460" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-27.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-27.png 1000w, https://edanweis.co/content/images/2024/03/image-27.png 1290w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">A simple function that finds feedback loops</span></figcaption></figure><p>If we merge nodes based on these clusters and duplicate names, the graph algorithm can then surface reinforcing or balancing feedback loops.</p><figure class="kg-card kg-video-card kg-width-regular kg-card-hascaption" data-kg-thumbnail="https://edanweis.co/content/media/2024/03/RAI-2_thumb.jpg" data-kg-custom-thumbnail>
            <div class="kg-video-container">
                <video src="https://edanweis.co/content/media/2024/03/RAI-2.webm" poster="https://img.spacergif.org/v1/1962x1876/0a/spacer.png" width="1962" height="1876" loop autoplay muted playsinline preload="metadata" style="background: transparent url(&apos;https://edanweis.co/content/media/2024/03/RAI-2_thumb.jpg&apos;) 50% 50% / cover no-repeat;"></video>
                <div class="kg-video-overlay">
                    <button class="kg-video-large-play-icon" aria-label="Play video">
                        <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                            <path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/>
                        </svg>
                    </button>
                </div>
                <div class="kg-video-player-container kg-video-hide">
                    <div class="kg-video-player">
                        <button class="kg-video-play-icon" aria-label="Play video">
                            <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                                <path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/>
                            </svg>
                        </button>
                        <button class="kg-video-pause-icon kg-video-hide" aria-label="Pause video">
                            <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                                <rect x="3" y="1" width="7" height="22" rx="1.5" ry="1.5"/>
                                <rect x="14" y="1" width="7" height="22" rx="1.5" ry="1.5"/>
                            </svg>
                        </button>
                        <span class="kg-video-current-time">0:00</span>
                        <div class="kg-video-time">
                            /<span class="kg-video-duration">0:18</span>
                        </div>
                        <input type="range" class="kg-video-seek-slider" max="100" value="0">
                        <button class="kg-video-playback-rate" aria-label="Adjust playback speed">1&#xD7;</button>
                        <button class="kg-video-unmute-icon" aria-label="Unmute">
                            <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                                <path d="M15.189 2.021a9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h1.794a.249.249 0 0 1 .221.133 9.73 9.73 0 0 0 7.924 4.85h.06a1 1 0 0 0 1-1V3.02a1 1 0 0 0-1.06-.998Z"/>
                            </svg>
                        </button>
                        <button class="kg-video-mute-icon kg-video-hide" aria-label="Mute">
                            <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                                <path d="M16.177 4.3a.248.248 0 0 0 .073-.176v-1.1a1 1 0 0 0-1.061-1 9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h.114a.251.251 0 0 0 .177-.073ZM23.707 1.706A1 1 0 0 0 22.293.292l-22 22a1 1 0 0 0 0 1.414l.009.009a1 1 0 0 0 1.405-.009l6.63-6.631A.251.251 0 0 1 8.515 17a.245.245 0 0 1 .177.075 10.081 10.081 0 0 0 6.5 2.92 1 1 0 0 0 1.061-1V9.266a.247.247 0 0 1 .073-.176Z"/>
                            </svg>
                        </button>
                        <input type="range" class="kg-video-volume-slider" max="100" value="100">
                    </div>
                </div>
            </div>
            <figcaption><p><span style="white-space: pre-wrap;">Orange = Balancing loops Blue = reinforcing loops</span></p></figcaption>
</figure><h3 id="42-loops-were-found">42 loops were found</h3><p>Let&apos;s examine some of the loops to see if they consist of actions or events that <strong><em>accelerate</em></strong> measures of accountability, transparency, reliability, safety &#x2013; or <strong><em>limit</em></strong> risk, harm, etc.</p><p>The first one has 9 factors causally linked, where each factor either increases or decreases the next. This is a &quot;B&quot; balancing loop.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/03/image-31.png" class="kg-image" alt="Responsible AI" loading="lazy" width="1482" height="1466" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-31.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-31.png 1000w, https://edanweis.co/content/images/2024/03/image-31.png 1482w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Finding &quot;cycles&quot; on our directed causal graph reveals reinforcing loops!</span></figcaption></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2024/03/image-14.png" class="kg-image" alt="Responsible AI" loading="lazy" width="1375" height="400" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-14.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-14.png 1000w, https://edanweis.co/content/images/2024/03/image-14.png 1375w" sizes="(min-width: 720px) 720px"></figure><p>Let&apos;s observe a few important points about this graph:</p><ol><li><strong>It&apos;s a loop</strong> because the path starts and finishes at the same factor, and each one is causally linked to the next &#x2014; a &quot;directed cycle&quot; in graph theory.</li><li><strong>It&apos;s balancing</strong> because the count of red/blue polarities (a subjective judgement about each link&apos;s effect) is odd &#x2014; e.g. an odd number of U-turns keeps you headed in the opposite direction.[[9]]</li><li><strong>Balancing loops are &quot;goal seeking&quot;</strong> because they counteract deviations from a target state. Increasing any of these factors will continue to drive the system towards that state.</li></ol><p>There are, however, a few problems with this loop:</p><ol><li><strong>Not all factors are described as sensible quantities</strong>. Better prompts may significantly improve LLM output and the resulting causal graph.</li><li><strong>What is driving the loop?</strong> It helps to view factors outside the loop:</li></ol><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/03/image-33.png" class="kg-image" alt="Responsible AI" loading="lazy" width="1522" height="1248" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-33.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-33.png 1000w, https://edanweis.co/content/images/2024/03/image-33.png 1522w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">An AI &quot;Digital Twin&quot; is the driving factor identified outside the feedback loop. </span></figcaption></figure><p>It would be exciting to uncover feedback loops that consist of factors from multiple RAI patterns. In the loop above, the factors that comprise it are described in a single RAI pattern: &quot;D.11. Digital Twin&quot;. Despite links between those factors and others outside of the Digital Twin pattern, none of them combined to form this particular loop. 
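The cycle-finding and classification steps described above can be sketched in a few lines of networkx (an assumption on my part; the post does not name its tooling). Cycles are enumerated on a signed digraph and each is classified by the parity of its negative links, which is the odd-number-of-U-turns rule. The factor names and polarities below are illustrative, not the actual RAI factors:

```python
import networkx as nx

# Toy signed causal digraph; each edge's "polarity" is +1 (increases the
# target factor) or -1 (decreases it). Names are illustrative only.
G = nx.DiGraph()
G.add_edge("transparency", "stakeholder trust", polarity=+1)
G.add_edge("stakeholder trust", "adoption", polarity=+1)
G.add_edge("adoption", "incident rate", polarity=+1)
G.add_edge("incident rate", "transparency", polarity=-1)

for cycle in nx.simple_cycles(G):
    # Pair each node with its successor, wrapping around to close the loop.
    edges = zip(cycle, cycle[1:] + cycle[:1])
    negatives = sum(G[u][v]["polarity"] < 0 for u, v in edges)
    # Odd count of negative links -> balancing ("B"); even -> reinforcing ("R").
    kind = "B (balancing)" if negatives % 2 else "R (reinforcing)"
    print(" -> ".join(cycle), "=>", kind)
```

An odd number of negative links flips the net effect of one trip around the loop, which is what makes it balancing rather than reinforcing.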
Finding these &quot;hybrid factor&quot; loops is an interesting direction for further research.<br></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/03/image-34.png" class="kg-image" alt="Responsible AI" loading="lazy" width="2000" height="1904" srcset="https://edanweis.co/content/images/size/w600/2024/03/image-34.png 600w, https://edanweis.co/content/images/size/w1000/2024/03/image-34.png 1000w, https://edanweis.co/content/images/size/w1600/2024/03/image-34.png 1600w, https://edanweis.co/content/images/2024/03/image-34.png 2128w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Feedback loops generated from causal factors in all RAI patterns. </span><b><strong style="white-space: pre-wrap;">Colors denote feedback loop type, not link polarity as before.</strong></b></figcaption></figure><h3 id="how-reliable-are-these-causal-graphs">How reliable are these causal graphs?</h3><p>The quality, coherence and external validity of the RAI patterns published by CSIRO Data61 provide a solid foundation for causal analysis, and I believe it is safe to assume there are causal relationships among them.</p><p>Systems modelling is an established field of research, and &quot;causal loop diagramming&quot;, on which my approach is based, has been used extensively in various industries: <a href="https://www.climateinteractive.org/en-roads/?ref=edanweis.co" rel="noreferrer">climate policy simulation</a> and <a href="https://doi.org/10.1016/j.eclinm.2020.100325?ref=edanweis.co" rel="noreferrer">COVID-19 policy</a> are recent examples.</p><p>The reliability of the AI-assisted and computational causal modelling presented here hinges on the validity of the computational approach. I believe it ultimately rests on the experience of the modeller, whose work stands to benefit from the greater scale and number of models that LLMs and graph analysis enable.</p><h3 id="how-useful-are-they">How useful are they?</h3><p>Insofar as the results are valid, causal graph modelling is a useful direction for operationalising research in responsible AI.</p><p>For developers and governing bodies of responsible AI, identifying which actions will maximise their efforts by taking advantage of virtuous or vicious cycles should support decision making. </p><p>In an upcoming article I will document the development of an app that improves the usability of RAI patterns.</p><p>Remember&#x2014;all models are wrong, but some are useful. Or as Picasso put it:</p><blockquote class="kg-blockquote-alt">We all know that art is not truth. Art is a lie that makes us realize truth, at least the truth that is given us to understand. The artist must know the manner whereby to convince others of the truthfulness of his lies. 
&#x2014; Pablo Picasso, 1923</blockquote><h3 id="further-research">Further research</h3><p>Some directions for further research are:</p><ol><li>Building a causal graph RAI tool/webapp</li><li>Detecting archetypes in causal graphs (see the images below)</li><li>Prompt-tuning for causal graphs</li><li>Causal graph-enabled RAG</li></ol><figure class="kg-card kg-gallery-card kg-width-wide kg-card-hascaption"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://edanweis.co/content/images/2024/03/Systems-Archetypes-I-TRSA01_p8-2.svg" width="720" height="805" loading="lazy" alt="Responsible AI"></div><div class="kg-gallery-image"><img src="https://edanweis.co/content/images/2024/03/Systems-Archetypes-I-TRSA01_p9-1.svg" width="720" height="894" loading="lazy" alt="Responsible AI"></div></div></div><figcaption><p><span style="white-space: pre-wrap;">Source: </span><a href="https://thesystemsthinker.com/wp-content/uploads/2016/03/Systems-Archetypes-I-TRSA01_pk.pdf?ref=edanweis.co" rel="noreferrer"><span style="white-space: pre-wrap;">Kim, D. H. (1994).&#xA0;</span><i><em class="italic" style="white-space: pre-wrap;">Systems archetypes</em></i><span style="white-space: pre-wrap;">. 
Pegasus Communications</span></a><span style="white-space: pre-wrap;">.</span></p></figcaption></figure><p></p><p><strong>Bibliography</strong></p><p><a href="https://books.google.com.au/books?id=gXniEAAAQBAJ&amp;pg=PT30&amp;lpg=PT30&amp;dq=%22Responsible+AI+is+the+practice+of+developing+and+using+AI+systems+in+a+way+that+provides+benefits+to+individuals,+groups,+and+wider+society,+while+minimizing+the+risk+of+negative+consequences%22&amp;source=bl&amp;ots=ihcaTz3hHm&amp;sig=ACfU3U1Il0wtE10ippANqNlgjNWoX2s9Qg&amp;hl=en&amp;sa=X&amp;redir_esc=y#v=onepage&amp;q=%22Responsible%20AI%20is%20the%20practice%20of%20developing%20and%20using%20AI%20systems%20in%20a%20way%20that%20provides%20benefits%20to%20individuals%2C%20groups%2C%20and%20wider%20society%2C%20while%20minimizing%20the%20risk%20of%20negative%20consequences%22&amp;f=false">Xu, X., Whittle, J., Zhu, L., &amp; Lu, Q. (2023). Responsible AI: Best Practices for Creating Trustworthy AI Systems. Pearson Education.</a></p><p>[[1]]: Lee, S. U., Perera, H., Xia, B., Liu, Y., Lu, Q., Zhu, L., ... &amp; Whittle, J. (2023). QB4AIRA: A Question Bank for AI Risk Assessment. <em>arXiv preprint arXiv:2305.09300</em>.</p><p>[[2]]: The team at Data61 CSIRO have published a bank of 293 questions for AI risk assessment: &quot;QB4AIRA&quot; [[1]] which aims to integrate with other AI risk assessment standards and frameworks. Although a Smart Risk Assessment Tool was developed based on the QB4AIRA, the challenge was to find relevant and stage-specific questions according to feedback from one of two testing groups (Ibid., Page 6.)</p><p>[[3]]: A similar has been published here&#x2014; Lu, Q., Zhu, L., Xu, X., Whittle, J., Douglas, D., &amp; Sanderson, C. (2022, May). Software engineering for responsible AI: An empirical study and operationalised patterns. In <em>Proceedings of the 44th International Conference on Software Engineering: Software Engineering in Practice</em> (pp. 241-242).</p><p>[[4]]: Jason Wei, et al. 
&quot;Chain-of-Thought Prompting Elicits Reasoning in Large Language Models.&quot; arXiv preprint arXiv:2201.11903 (2022).</p><p>[[5]]: Li, C., Wang, J., Zhang, Y., Zhu, K., Hou, W., Lian, J., ... &amp; Xie, X. (2023). Large language models understand and can be enhanced by emotional stimuli. arXiv preprint arXiv:2307.11760.</p><p>[[6]]: Xu, B., Yang, A., Lin, J., Wang, Q., Zhou, C., Zhang, Y., &amp; Mao, Z. (2023). ExpertPrompting: Instructing Large Language Models to be Distinguished Experts. arXiv preprint arXiv:2305.14688.</p><p>[[7]]: Liu, J., Liu, A., Lu, X., Welleck, S., West, P., Bras, R. L., ... &amp; Hajishirzi, H. (2021). Generated knowledge prompting for commonsense reasoning. arXiv preprint arXiv:2110.08387.</p><p>[[8]]: Zhou, D., Sch&#xE4;rli, N., Hou, L., Wei, J., Scales, N., Wang, X., ... &amp; Chi, E. (2022). Least-to-most prompting enables complex reasoning in large language models. arXiv preprint arXiv:2205.10625</p><p>[[9]]: Tip, T. (2011). Guidelines for drawing causal loop diagrams.&#xA0;<em>Systems Thinker</em>,&#xA0;<em>22</em>(1), 5-7. <a href="https://thesystemsthinker.com/wp-content/uploads/pdfs/220109pk.pdf?ref=edanweis.co">https://thesystemsthinker.com/wp-content/uploads/pdfs/220109pk.pdf</a></p>]]></content:encoded></item><item><title><![CDATA[Replication: AI research by CSIRO’s Data61]]></title><description><![CDATA[<p><a href="https://aiej.org/aiej/article/view/11/49?ref=edanweis.co">AI and Human Reasoning: Qualitative Research in the Age of Large Language Models</a></p><p><a href="https://aiej.org/aiej/article/view/11/49?ref=edanweis.co">Bano, M., Zowghi, D., &amp; Whittle, J. (2023). <em>The AI Ethics Journal</em>, <em>3</em>(1).</a></p><p>After reading &quot;AI and Human Reasoning&quot; (Bano et al. 
2023) I decided to replicate the study to investigate a critical omission</p>]]></description><link>https://edanweis.co/replication-csiro-ai-and-human-reasoning/</link><guid isPermaLink="false">65cf03b35166ca436683fdfc</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Tue, 20 Feb 2024 13:16:08 GMT</pubDate><media:content url="https://edanweis.co/content/images/2024/02/2024-02-18_09-25.png" medium="image"/><content:encoded><![CDATA[<img src="https://edanweis.co/content/images/2024/02/2024-02-18_09-25.png" alt="Replication: AI research by CSIRO&#x2019;s Data61"><p><a href="https://aiej.org/aiej/article/view/11/49?ref=edanweis.co">AI and Human Reasoning: Qualitative Research in the Age of Large Language Models</a></p><p><a href="https://aiej.org/aiej/article/view/11/49?ref=edanweis.co">Bano, M., Zowghi, D., &amp; Whittle, J. (2023). <em>The AI Ethics Journal</em>, <em>3</em>(1).</a></p><p>After reading &quot;AI and Human Reasoning&quot; (Bano et al. 2023) I decided to replicate the study to investigate a critical omission in its experimental design.  </p><p>The authors compare AI and Human responses when tasked with labelling text according to <em>Schwartz&#x2019;s theory for Human Values. </em>The aim was to understand how the reasoning abilities of LLMs compare to human comprehension in the context of qualitative research.</p><p>Humans and LLMs only agreed about 25% of the time when classifying a dataset of 50 Amazon Alexa product reviews. 
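That agreement figure is a simple proportion of matching labels. A minimal sketch of the comparison, using made-up labels rather than the study's data:

```python
# Hypothetical main-value labels assigned to the same reviews by a human
# rater and an LLM (illustrative only, not the study's annotations).
human = ["Achievement", "Hedonism", "Security", "Benevolence"]
llm   = ["Achievement", "Security", "Security", "Universalism"]

# Fraction of reviews where the two raters chose the same main value.
matches = sum(h == a for h, a in zip(human, llm))
agreement = matches / len(human)
print(f"agreement: {agreement:.0%}")  # agreement: 50%
```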
</p><div class="kg-card kg-header-card kg-width-full kg-size-small kg-style-light" data-kg-background-image style><h3 class="kg-header-card-subheader" id="promptfollowing-is-an-app-review-from-a-user-of-amazon-alexa-analyse-the-review-text-and-classify-it-against-schwartzs-theory-for-human-values-both-main-and-sub-values-provide-your-reason-on-why-you-classified-it-against-that-valueproduct-reviewalexa-helps-in-many-ways-and-plays-my-favorite-play-list-and-gives-me-updates-on-musicians-new-albumsresponsemain-value-achievementsubvalue-competencereason-the-values-alexas-ability-to-answer-difficult-questions"><b><strong style="white-space: pre-wrap;">PROMPT</strong></b><i><em class="italic" style="white-space: pre-wrap;">&quot;Following is an app review from a user of Amazon Alexa. Analyse the review text and classify it against Schwartz&#x2019;s theory for Human Values, both main and sub values. Provide your reason on why you classified it against that value.&quot;</em></i><span style="white-space: pre-wrap;">&#x2193;</span><b><strong style="white-space: pre-wrap;">PRODUCT REVIEW</strong></b><i><em class="italic" style="white-space: pre-wrap;">&quot;Alexa helps in many ways and plays my favorite play list and gives me updates on musicians new albums.&quot;</em></i><span style="white-space: pre-wrap;">&#x2193;</span><b><strong style="white-space: pre-wrap;">RESPONSE</strong></b><i><b><strong class="italic" style="white-space: pre-wrap;">Main Value: </strong></b></i><i><em class="italic" style="white-space: pre-wrap;">Achievement</em></i><i><b><strong class="italic" style="white-space: pre-wrap;">Sub-Value: </strong></b></i><i><em class="italic" style="white-space: pre-wrap;">Competence</em></i><i><b><strong class="italic" style="white-space: pre-wrap;">Reason: </strong></b></i><i><em class="italic" style="white-space: pre-wrap;">The values Alexa&apos;s ability to answer difficult questions</em></i></h3></div><h3 id="numerical-error">Numerical error</h3><p>A key figure 
in the article contained a simple numerical error:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/02/image.png" class="kg-image" alt="Replication: AI research by CSIRO&#x2019;s Data61" loading="lazy" width="850" height="591" srcset="https://edanweis.co/content/images/size/w600/2024/02/image.png 600w, https://edanweis.co/content/images/2024/02/image.png 850w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Agreements vs Disagreement Chart for main values. </span><a href="https://aiej.org/aiej/article/view/11/49?ref=edanweis.co"><span style="white-space: pre-wrap;">Bano, M., Zowghi, D., &amp; Whittle, J. (2023). </span><i><em class="italic" style="white-space: pre-wrap;">The AI Ethics Journal</em></i><span style="white-space: pre-wrap;">, </span><i><em class="italic" style="white-space: pre-wrap;">3</em></i><span style="white-space: pre-wrap;">(1).</span></a></figcaption></figure><p>The row <strong><em>ChatGPT + Human2 </em></strong>incorrectly sums to 60. All other rows sum to the sample size of 50.</p><p>The <a href="https://docs.google.com/spreadsheets/d/1iy5Rl0BvsH4DukEcuI2YQlOroYzX3LGf/edit?ref=edanweis.co#gid=1473701098">original data</a> from the experiment was published and below is my reproduction. 
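A row-sum sanity check catches this class of error in a few lines. The agree/disagree splits below are hypothetical; only the total of 60 mirrors the published miscount:

```python
SAMPLE_SIZE = 50  # number of Amazon Alexa reviews in the study

# (agree, disagree) counts per rater pair; the splits are hypothetical,
# but the row summing to 60 mirrors the miscount described above.
rows = {
    "ChatGPT + Human1": (13, 37),
    "ChatGPT + Human2": (22, 38),  # 22 + 38 = 60, not 50
    "Human1 + Human2":  (12, 38),
}

for pair, (agree, disagree) in rows.items():
    total = agree + disagree
    if total != SAMPLE_SIZE:
        print(f"{pair}: sums to {total}, expected {SAMPLE_SIZE}")
```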
Several off-by-one discrepancies remain that I have not verified manually.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/02/image-3.png" class="kg-image" alt="Replication: AI research by CSIRO&#x2019;s Data61" loading="lazy" width="1984" height="1292" srcset="https://edanweis.co/content/images/size/w600/2024/02/image-3.png 600w, https://edanweis.co/content/images/size/w1000/2024/02/image-3.png 1000w, https://edanweis.co/content/images/size/w1600/2024/02/image-3.png 1600w, https://edanweis.co/content/images/2024/02/image-3.png 1984w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">A reproduction of the results using the original data input into my own data pipeline.</span></figcaption></figure><p>How significant is the miscount? Judging by the authors&apos; remarks, it appears to have influenced the interpretation of the results:</p><blockquote>&quot;While AIs show varied levels of agreement with humans, it is noteworthy that ChatGPT has a significant agreement with Human2, suggesting that certain AI models might align more closely with certain human perspectives.&quot;</blockquote><h3 id="validity-problem">Validity problem</h3><p>The experiment doesn&apos;t control for the effect of different prompts on AI responses. This presents a problem of internal validity for the study because, as the authors acknowledge, prompts are a critical factor that can significantly influence the analysis of Large Language Models (LLMs):</p><blockquote>&quot;The composition and specificity of prompts can guide the models&#x2019; analysis and processing of the task, thus affecting the results. 
A well-structured, clear, and contextually rich prompt helps the LLMs focus on the essential aspects of the task, reducing the likelihood of errors&quot;</blockquote><p>As we will see, that approach is necessary but not sufficient for consistently good results.</p><p>This effect cannot be ignored because, under real-world conditions, researchers would likely refine their prompts, just as they would refine a keyword search with wildcards (*), AND/OR operators, alternative spellings, etc.</p><h3 id="what-effect-does-varying-prompts-have">What effect does varying prompts have?</h3><p>My experiment measured the effect of changing the prompt (without changing the model&#x2014;gpt-3.5-turbo-1106) and comparing results as in the original article. </p><p>Prompt changes improved results by up to 1.8&#xD7;, ranked by agreement with Human 1 and Human 2. </p><p>The prompt changes (in order of effectiveness):</p><ol><li>Provide examples of answers and Schwartz&apos;s Human Values</li><li>Zero-shot Chain-of-Thought prompt [<a href="https://arxiv.org/abs/2201.11903?ref=edanweis.co">1</a>] </li><li>LLM selects the best of 3 candidate answers</li><li>JSON output.</li></ol><p> (See <a href="https://roadmap.sh/prompt-engineering?ref=edanweis.co">roadmap.sh</a> and <a href="https://learnprompting.org/docs/intro?ref=edanweis.co">learnprompting.org</a> for an overview of techniques)</p><h3 id="results">Results</h3><p>Simply by changing the prompt, AI-generated responses improved by 1.8&#xD7; and 1.4&#xD7; when classifying the &quot;Main Value&quot; of Schwartz Human Values to reach agreement with Human 1 and 2, respectively.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/02/image-12.png" class="kg-image" alt="Replication: AI research by CSIRO&#x2019;s Data61" loading="lazy" width="2000" height="986" srcset="https://edanweis.co/content/images/size/w600/2024/02/image-12.png 600w, https://edanweis.co/content/images/size/w1000/2024/02/image-12.png 1000w, https://edanweis.co/content/images/size/w1600/2024/02/image-12.png 1600w, https://edanweis.co/content/images/2024/02/image-12.png 2110w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Improvement of responses after prompt changes compared to the original (highlighted)</span></figcaption></figure><p>To learn which reviews were resistant to prompt change (vertical white blocks or gaps), the agree/disagree outcome across all prompts was plotted as a heat map:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/02/image-8.png" class="kg-image" alt="Replication: AI research by CSIRO&#x2019;s Data61" loading="lazy" width="1983" height="1004" srcset="https://edanweis.co/content/images/size/w600/2024/02/image-8.png 600w, https://edanweis.co/content/images/size/w1000/2024/02/image-8.png 1000w, https://edanweis.co/content/images/size/w1600/2024/02/image-8.png 1600w, https://edanweis.co/content/images/2024/02/image-8.png 1983w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Some reviews were resistant to prompt change (white gaps) or stable (coloured blocks)</span></figcaption></figure><p>To indicate the contribution of &quot;reasoning&quot; behind improvements from prompt changes, vector word embeddings were computed from the various prompts and Human responses.</p><p>In the chart below, every dot represents a paragraph explaining the reasoning supporting the classification. 
Those paragraphs are converted into 768-dimensional vectors and then reduced to the two dimensions shown here:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2024/02/image-13.png" class="kg-image" alt="Replication: AI research by CSIRO&#x2019;s Data61" loading="lazy" width="1998" height="1226" srcset="https://edanweis.co/content/images/size/w600/2024/02/image-13.png 600w, https://edanweis.co/content/images/size/w1000/2024/02/image-13.png 1000w, https://edanweis.co/content/images/size/w1600/2024/02/image-13.png 1600w, https://edanweis.co/content/images/2024/02/image-13.png 1998w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Clusters of Human responses are not located near AI-generated ones.</span></figcaption></figure><p>There are three spatially distinct clusters, each with local variations in the semantics of the reasons.</p><p>Texts from all sources (human and AI) are co-located in each cluster, but human responses (blue/red) overlap less than AI-generated ones (purple/green). This is likely caused by the differing idiosyncratic phrasing of Human 1 and Human 2.</p><p>But it also suggests that the reasons given for AI-generated classifications are consistent (at least semantically) despite the vast difference between the prompts used to generate them. </p><p>These improvements and semantic relationships indicate that changes to prompts have an impact on the subjective judgement of AI-generated responses. </p><p>Researchers performing LLM-assisted classification can be expected to optimise their prompts. 
Any comparison between AI-generated and human classification should take this into account.</p><p>In the next article, I will delve into more advanced prompt engineering that will enable modifiable outputs to move value bias between Human 1 and Human 2.</p><hr><h3 id="appendix">APPENDIX </h3><p>Chain of Thought Prompts:</p><div class="kg-card kg-toggle-card" data-kg-toggle-state="close">
            <div class="kg-toggle-heading">
                <h4 class="kg-toggle-heading-text"><span style="white-space: pre-wrap;">CoT1</span></h4>
                <button class="kg-toggle-card-icon" aria-label="Expand toggle to read content">
                    <svg id="Regular" xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                        <path class="cls-1" d="M23.25,7.311,12.53,18.03a.749.749,0,0,1-1.06,0L.75,7.311"/>
                    </svg>
                </button>
            </div>
            <div class="kg-toggle-content"><p><span style="white-space: pre-wrap;">Classify the text according to only one value from Schwartz&apos;s theory for Human Values. Provide a Reason to help me understand your thought process. For example, {&quot;app_review&quot;: &quot;Adding photos would be nice if you could rotate photos&quot;, &quot;reason&quot;: &quot;User expects creativity in editing photos by Alexa&quot;, &quot;main_value&quot;: &quot;Creativity&quot;}. App Review: {text}. Now let&apos;s think this through step by step, write your answer in valid JSON with keys app_review, main_value, reason:</span></p></div>
        </div><div class="kg-card kg-toggle-card" data-kg-toggle-state="close">
            <div class="kg-toggle-heading">
                <h4 class="kg-toggle-heading-text"><span style="white-space: pre-wrap;">CoT2</span></h4>
                <button class="kg-toggle-card-icon" aria-label="Expand toggle to read content">
                    <svg id="Regular" xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                        <path class="cls-1" d="M23.25,7.311,12.53,18.03a.749.749,0,0,1-1.06,0L.75,7.311"/>
                    </svg>
                </button>
            </div>
            <div class="kg-toggle-content"><p><span style="white-space: pre-wrap;">Classify the text according to only one value from Schwartz&apos;s theory for Human Values. Use the following as a guide: &apos;Benevolence&apos; reflects the priority of caring for the welfare of related people and in-group members. &apos;Universalism&apos; refers to the understanding, appreciation, tolerance, and protection for the welfare of all people and for nature. &apos;Tradition&apos; expresses the respect, commitment, and acceptance of the customs and ideas that traditional culture or religion provides. &apos;Conformity&apos; expresses restraints of actions, inclinations, and impulses to upset others and compliance with social expectations or norms. &apos;Security&apos; represents the pursuit of personal safety and societal stability. &apos;Achievement&apos; refers to pursuing personal success through demonstrating performance and competence according to social standards. &apos;Hedonism&apos; represents the priority of pleasure, satisfaction, and sensuous gratification for oneself. &apos;Power&apos; reflects the importance of social status and prestige, dominance over others, and control of material resources. &apos;Stimulation&apos; can be defined as the pursuit of excitement, novelty, and challenge in life. &apos;Self-direction&apos; reflects the importance of autonomy of thought and action (i.e., choosing, creating, and exploring). Provide a Reason to help me understand your thought process. For example, {&quot;app_review&quot;: &quot;Adding photos would be nice if you could rotate photos&quot;, &quot;reason&quot;: &quot;User expects creativity in editing photos by Alexa&quot;, &quot;main_value&quot;: &quot;Creativity&quot;}. App Review: {text}. Now let&apos;s think this through step by step, write your answer in valid JSON with keys app_review, main_value, reason:</span></p></div>
        </div><div class="kg-card kg-toggle-card" data-kg-toggle-state="close">
            <div class="kg-toggle-heading">
                <h4 class="kg-toggle-heading-text"><span style="white-space: pre-wrap;">CoT3</span></h4>
                <button class="kg-toggle-card-icon" aria-label="Expand toggle to read content">
                    <svg id="Regular" xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                        <path class="cls-1" d="M23.25,7.311,12.53,18.03a.749.749,0,0,1-1.06,0L.75,7.311"/>
                    </svg>
                </button>
            </div>
            <div class="kg-toggle-content"><p><span style="white-space: pre-wrap;">As an expert sociologist, assign one main value from Schwartz&apos;s theory for Human Values to the following customer review (app_review). Schwartz&apos;s theory for Human Values:&#xA0;</span></p><p><span style="white-space: pre-wrap;">&apos;Benevolence&apos;: reflects the priority of caring for the welfare of related people and in-group members.</span></p><p><span style="white-space: pre-wrap;">&apos;Universalism&apos;: refers to the understanding, appreciation, tolerance and protection for the welfare of all people and for nature.</span></p><p><span style="white-space: pre-wrap;">&apos;Tradition&apos;: expresses the respect, commitment, and acceptance of the customs and ideas that traditional culture or religion provides.</span></p><p><span style="white-space: pre-wrap;">&apos;Conformity&apos;: expresses restraints of actions, inclinations, and impulses to upset others and compliance with social expectations or norms.</span></p><p><span style="white-space: pre-wrap;">&apos;Security&apos;: represents the pursuit of personal safety and societal stability.</span></p><p><span style="white-space: pre-wrap;">&apos;Achievement&apos;: refers to pursuing personal success through demonstrating performance and competence according to social standards.</span></p><p><span style="white-space: pre-wrap;">&apos;Hedonism&apos;: represents the priority of pleasure, satisfaction, and sensuous gratification for oneself.</span></p><p><span style="white-space: pre-wrap;">&apos;Power&apos;: reflects the importance of social status and prestige, dominance over others, and control of material resources.</span></p><p><span style="white-space: pre-wrap;">&apos;Stimulation&apos;: can be defined as pursuit of excitement, novelty, and challenge in life.</span></p><p><span style="white-space: pre-wrap;">&apos;Self-direction&apos;: reflects the importance of autonomy of thought and action (i.e., choosing, 
creating, and exploring).</span></p><p><span style="white-space: pre-wrap;">Now let&apos;s think this through step by step, write your answer in valid JSON with keys app_review, main_value, reason.&#xA0;</span></p><p><span style="white-space: pre-wrap;">app_review: {text}</span></p></div>
        </div><div class="kg-card kg-toggle-card" data-kg-toggle-state="close">
            <div class="kg-toggle-heading">
                <h4 class="kg-toggle-heading-text"><span style="white-space: pre-wrap;">CoT4</span></h4>
                <button class="kg-toggle-card-icon" aria-label="Expand toggle to read content">
                    <svg id="Regular" xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                        <path class="cls-1" d="M23.25,7.311,12.53,18.03a.749.749,0,0,1-1.06,0L.75,7.311"/>
                    </svg>
                </button>
            </div>
            <div class="kg-toggle-content"><p><span style="white-space: pre-wrap;">Insert the three most relevant Schwartz Human Values in a json list [] in sub_value key according to Schwartz&apos;s theory for Human Values you think app_review is expressing. Also provide 1 main_value. Use the following as a guide: &apos;Benevolence&apos; reflects the priority of caring for the welfare of related people and in-group members. &apos;Universalism&apos; refers to the understanding, appreciation, tolerance and protection for the welfare of all people and for nature. &apos;Tradition&apos; expresses the respect, commitment, and acceptance of the customs and ideas that traditional culture or religion provides. &apos;Conformity&apos; expresses restraints of actions, inclinations, and impulses to upset others and compliance with social expectations or norms. &apos;Security&apos; represents the pursuit of personal safety and societal stability. &apos;Achievement&apos; refers to pursuing personal success through demonstrating performance and competence according to social standards. &apos;Hedonism&apos; represents the priority of pleasure, satisfaction, and sensuous gratification for oneself. &apos;Power&apos; reflects the importance of social status and prestige, dominance over others, and control of material resources. &apos;Stimulation&apos; can be defined as pursuit of excitement, novelty, and challenge in life. &apos;Self-direction&apos; reflects the importance of autonomy of thought and action (i.e., choosing, creating, and exploring). 
sub_values examples: [Wisdom, Variety, Understanding, appreciation, Understanding, Trust, Tolerance, Success, Stimulation, Social order, Social justice, Sensuous gratification, Self-discipline, Security, Safety of Self, Safety, Rules, Restraint, Responsiveness, Respect for tradition, Respect, Resource Efficiency, Productivity, Privacy, Preventing harm, Politeness, Pleasure, Playfulness, Personal success, Personal safety, Peace, Order, Obedience, Novelty, Norms, Loyalty, Knowledge Justice, Interpersonal harmony, Intelligence, Humility, Honesty, Helping others, Helpful, Frustration, Freedom, Excitement, Equality, Enjoyment of life, Enjoyment, Enjoying life, Enjoying Life, Diversity, Dependability, Curiosity, Creativity, Control over one&apos;s life, Control, Concern for the welfare of others, Competence, Commitment, Comfort, Choosing own goals, Choice, Challenge, Caring for others, Caring, Care, Capabilities, Benevolence, Authority, Acceptance of others]. Provide a Reason to help me understand your thought process. For example, {&quot;app_review&quot;: &quot;I&apos;d enjoy and find this app very useful if it did WHAT it was supposed to WHEN it was supposed to&quot;, &quot;reason&quot;: &quot;Alexa app is not functioning as expected, impacting the user&apos;s productivity.&quot;, &quot;main_value&quot;: &quot;Achievement&quot;,&quot;sub_value&quot;: [&apos;Capability&apos;, &apos;Competence&apos;] }. App Review: {text}. Now let&apos;s think this through step by step, first observe the behavior, expectation and outcomes expressed by the user, then figure out the list of 3 main values write your answer in valid JSON with keys app_review, main_value, sub-values, reason:</span></p></div>
        </div>]]></content:encoded></item><item><title><![CDATA[Manipulable embeddings]]></title><description><![CDATA[<p></p><p>Vector embeddings are fascinating byproducts of neural networks&#x2013;parameters that the network learns during training on a particular task. </p><p>They&apos;ve been incredibly useful to me in various applications, from clustering real estate listings to categorising financial expenditures and even analysing baby names.</p><p>However, there&apos;s a</p>]]></description><link>https://edanweis.co/manipulable-embeddings-transformation/</link><guid isPermaLink="false">65155cdae230da5d0042ce26</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sat, 30 Sep 2023 08:02:09 GMT</pubDate><media:content url="https://edanweis.co/content/images/2024/03/2024-03-12_21-55.png" medium="image"/><content:encoded><![CDATA[<img src="https://edanweis.co/content/images/2024/03/2024-03-12_21-55.png" alt="Manipulable embeddings"><p></p><p>Vector embeddings are fascinating byproducts of neural networks&#x2013;parameters that the network learns during training on a particular task. </p><p>They&apos;ve been incredibly useful to me in various applications, from clustering real estate listings to categorising financial expenditures and even analysing baby names.</p><p>However, there&apos;s a catch</p><h3 id="%F0%9F%98%B1-embeddings-convey-only-intrinsic-meaning">&#x1F631; Embeddings convey only intrinsic meaning.</h3><p>When we transform data into vector embeddings, we receive high-dimensional output data in return, which we often reduce to three dimensions for visualization. 
</p><p>Yet, if we encode words like &quot;banana&quot; and &quot;lemon&quot; into an embedding space, we can&apos;t simply tweak their positions to emphasise qualities like &quot;yellowness&quot; or &quot;sweetness&quot; without adding additional context&#x2014;we might envision &quot;fruit&quot; positioned somewhere in between.</p><p>So embeddings are not semantically versatile; they are not intended to encapsulate meaning unrelated to the data.</p><p>This limitation makes sense because embeddings are the result of minimising a loss function that is specific to their training data and prediction tasks&#x2014;usually natural language processing.</p><h3 id="method-for-manipulating-embeddings-by-recursive-encode-reduce-transformation">Method for manipulating embeddings by recursive encode-reduce transformation  </h3><p>I&apos;ve been working on a method to manipulate embeddings by transforming them in an iterative sequence of aggregation, weighting and dimensional reduction.  </p><p>Here is a very complicated way of saying something very simple. Let:</p><ul><li>\(E_i\) be the vector embedding for the \(i\)-th input.</li><li>\(w_i\) be the weight (coefficient) assigned to the \(i\)-th embedding.</li><li>\(R(\cdot)\) represent the dimensional reduction operation (e.g., PaCMAP).</li><li>\(\oplus\) denote the concatenation operation.</li><li>\(\circ\) represent element-wise multiplication (weighting).</li></ul><p>The process can be expressed as a sequence of operations:</p><p><strong>Concatenation and Weighting</strong>:<br>\[X_1 = \sum_{i=1}^{n} w_i \cdot E_i\]<br>Where \(X_1\) is the weighted sum of the embeddings.</p><p><strong>Dimensional Reduction</strong>:<br>\[X_2 = R(X_1)\]<br>Where \(X_2\) is the result of applying the dimensional reduction operation to \(X_1\).</p><p><strong>Repeat Concatenation and Weighting</strong>:<br>\[X_3 = \sum_{i=1}^{n} w_i \cdot X_2\]<br>Where \(X_3\) is the weighted sum of the vectors obtained from the previous
step.</p><p><strong>Repeat Dimensional Reduction</strong>:<br>\[X_4 = R(X_3)\]<br>Where \(X_4\) is the result of applying the dimensional reduction operation to \(X_3\).</p><h3 id="validation">Validation</h3><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/image-1.png" class="kg-image" alt="Manipulable embeddings" loading="lazy" width="1562" height="1116" srcset="https://edanweis.co/content/images/size/w600/2023/09/image-1.png 600w, https://edanweis.co/content/images/size/w1000/2023/09/image-1.png 1000w, https://edanweis.co/content/images/2023/09/image-1.png 1562w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/image.png" class="kg-image" alt="Manipulable embeddings" loading="lazy" width="1570" height="1522" srcset="https://edanweis.co/content/images/size/w600/2023/09/image.png 600w, https://edanweis.co/content/images/size/w1000/2023/09/image.png 1000w, https://edanweis.co/content/images/2023/09/image.png 1570w" sizes="(min-width: 720px) 720px"></figure><p>    </p>]]></content:encoded></item><item><title><![CDATA[CIBI cup]]></title><description><![CDATA[Beautiful takeaway cups made re-usable. 
]]></description><link>https://edanweis.co/cibi-cup/</link><guid isPermaLink="false">651019cee230da5d0042ce0b</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sun, 24 Sep 2023 11:16:28 GMT</pubDate><media:content url="https://edanweis.co/content/images/2025/06/cibicup.png" medium="image"/><content:encoded><![CDATA[<img src="https://edanweis.co/content/images/2025/06/cibicup.png" alt="CIBI cup"><p>Thank you to Zenta, old friend.</p><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/cibicup.a9aa45e-2.gif" class="kg-image" alt="CIBI cup" loading="lazy" width="1012" height="210" srcset="https://edanweis.co/content/images/size/w600/2023/09/cibicup.a9aa45e-2.gif 600w, https://edanweis.co/content/images/size/w1000/2023/09/cibicup.a9aa45e-2.gif 1000w, https://edanweis.co/content/images/2023/09/cibicup.a9aa45e-2.gif 1012w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/IMG_20181202_180707.jpg" class="kg-image" alt="CIBI cup" loading="lazy" width="1766" height="1750" srcset="https://edanweis.co/content/images/size/w600/2023/09/IMG_20181202_180707.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/IMG_20181202_180707.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/IMG_20181202_180707.jpg 1600w, https://edanweis.co/content/images/2023/09/IMG_20181202_180707.jpg 1766w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/IMG_20181202_180759.jpg" class="kg-image" alt="CIBI cup" loading="lazy" width="2000" height="2667" srcset="https://edanweis.co/content/images/size/w600/2023/09/IMG_20181202_180759.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/IMG_20181202_180759.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/IMG_20181202_180759.jpg 1600w, 
https://edanweis.co/content/images/2023/09/IMG_20181202_180759.jpg 2000w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/IMG_20181202_181212.jpg" class="kg-image" alt="CIBI cup" loading="lazy" width="2000" height="1755" srcset="https://edanweis.co/content/images/size/w600/2023/09/IMG_20181202_181212.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/IMG_20181202_181212.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/IMG_20181202_181212.jpg 1600w, https://edanweis.co/content/images/2023/09/IMG_20181202_181212.jpg 2000w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/IMG_20181202_181420.jpg" class="kg-image" alt="CIBI cup" loading="lazy" width="2000" height="1366" srcset="https://edanweis.co/content/images/size/w600/2023/09/IMG_20181202_181420.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/IMG_20181202_181420.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/IMG_20181202_181420.jpg 1600w, https://edanweis.co/content/images/2023/09/IMG_20181202_181420.jpg 2000w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/IMG_20181202_182138.jpg" class="kg-image" alt="CIBI cup" loading="lazy" width="2000" height="948" srcset="https://edanweis.co/content/images/size/w600/2023/09/IMG_20181202_182138.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/IMG_20181202_182138.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/IMG_20181202_182138.jpg 1600w, https://edanweis.co/content/images/2023/09/IMG_20181202_182138.jpg 2000w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/image--4-.png" class="kg-image" alt="CIBI cup" loading="lazy" width="1406" height="1333" 
srcset="https://edanweis.co/content/images/size/w600/2023/09/image--4-.png 600w, https://edanweis.co/content/images/size/w1000/2023/09/image--4-.png 1000w, https://edanweis.co/content/images/2023/09/image--4-.png 1406w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2025/06/image-1.png" class="kg-image" alt="CIBI cup" loading="lazy" width="2000" height="2008" srcset="https://edanweis.co/content/images/size/w600/2025/06/image-1.png 600w, https://edanweis.co/content/images/size/w1000/2025/06/image-1.png 1000w, https://edanweis.co/content/images/size/w1600/2025/06/image-1.png 1600w, https://edanweis.co/content/images/2025/06/image-1.png 2000w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">too many parts</span></figcaption></figure>]]></content:encoded></item><item><title><![CDATA[Clay]]></title><description><![CDATA[<figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://edanweis.co/content/images/2023/09/IMG_20221230_111642.jpg" width="2000" height="900" loading="lazy" alt srcset="https://edanweis.co/content/images/size/w600/2023/09/IMG_20221230_111642.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/IMG_20221230_111642.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/IMG_20221230_111642.jpg 1600w, https://edanweis.co/content/images/size/w2400/2023/09/IMG_20221230_111642.jpg 2400w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://edanweis.co/content/images/2023/09/SAVE_20230924_205357.jpg" width="2000" height="900" loading="lazy" alt srcset="https://edanweis.co/content/images/size/w600/2023/09/SAVE_20230924_205357.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/SAVE_20230924_205357.jpg 1000w, 
https://edanweis.co/content/images/size/w1600/2023/09/SAVE_20230924_205357.jpg 1600w, https://edanweis.co/content/images/2023/09/SAVE_20230924_205357.jpg 2000w" sizes="(min-width: 720px) 720px"></div></div></div></figure>]]></description><link>https://edanweis.co/clay/</link><guid isPermaLink="false">64d4df1719f1a1108ee2626b</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sun, 24 Sep 2023 10:45:51 GMT</pubDate><media:content url="https://edanweis.co/content/images/2023/09/SAVE_20230924_205357-1.jpg" medium="image"/><content:encoded><![CDATA[<figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://edanweis.co/content/images/2023/09/IMG_20221230_111642.jpg" width="2000" height="900" loading="lazy" alt="Clay" srcset="https://edanweis.co/content/images/size/w600/2023/09/IMG_20221230_111642.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/IMG_20221230_111642.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/IMG_20221230_111642.jpg 1600w, https://edanweis.co/content/images/size/w2400/2023/09/IMG_20221230_111642.jpg 2400w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://edanweis.co/content/images/2023/09/SAVE_20230924_205357.jpg" width="2000" height="900" loading="lazy" alt="Clay" srcset="https://edanweis.co/content/images/size/w600/2023/09/SAVE_20230924_205357.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/SAVE_20230924_205357.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/SAVE_20230924_205357.jpg 1600w, https://edanweis.co/content/images/2023/09/SAVE_20230924_205357.jpg 2000w" sizes="(min-width: 720px) 720px"></div></div></div></figure>]]></content:encoded></item><item><title><![CDATA[Toothbrush]]></title><description><![CDATA[Low waste intensity. 
Pull bristles, then cut.]]></description><link>https://edanweis.co/toothbrush/</link><guid isPermaLink="false">650ff5e4e230da5d0042cdb1</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sun, 24 Sep 2023 08:45:27 GMT</pubDate><media:content url="https://edanweis.co/content/images/2023/09/2-1.jpg" medium="image"/><content:encoded><![CDATA[<figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/2.jpg" class="kg-image" alt="Toothbrush" loading="lazy" width="1920" height="1080" srcset="https://edanweis.co/content/images/size/w600/2023/09/2.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/2.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/2.jpg 1600w, https://edanweis.co/content/images/2023/09/2.jpg 1920w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/1.jpg" class="kg-image" alt="Toothbrush" loading="lazy" width="1920" height="1080" srcset="https://edanweis.co/content/images/size/w600/2023/09/1.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/1.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/1.jpg 1600w, https://edanweis.co/content/images/2023/09/1.jpg 1920w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2023/09/9.4.3-dark.jpg" class="kg-image" alt="Toothbrush" loading="lazy" width="2000" height="1092" srcset="https://edanweis.co/content/images/size/w600/2023/09/9.4.3-dark.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/9.4.3-dark.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/9.4.3-dark.jpg 1600w, https://edanweis.co/content/images/size/w2400/2023/09/9.4.3-dark.jpg 2400w" sizes="(min-width: 720px) 720px"><figcaption>The first giftable toothbrush?</figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img 
src="https://edanweis.co/content/images/2023/09/p1_1.7.1.jpg" class="kg-image" alt="Toothbrush" loading="lazy" width="1920" height="1080" srcset="https://edanweis.co/content/images/size/w600/2023/09/p1_1.7.1.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/p1_1.7.1.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/p1_1.7.1.jpg 1600w, https://edanweis.co/content/images/2023/09/p1_1.7.1.jpg 1920w" sizes="(min-width: 720px) 720px"><figcaption>First prototype porcelain slip-cast toothbrush</figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2023/09/IMG_20190228_081142.jpg" class="kg-image" alt="Toothbrush" loading="lazy" width="2000" height="1761" srcset="https://edanweis.co/content/images/size/w600/2023/09/IMG_20190228_081142.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/IMG_20190228_081142.jpg 1000w, https://edanweis.co/content/images/size/w1600/2023/09/IMG_20190228_081142.jpg 1600w, https://edanweis.co/content/images/size/w2400/2023/09/IMG_20190228_081142.jpg 2400w" sizes="(min-width: 720px) 720px"><figcaption>MSLA resin prototype</figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://edanweis.co/content/images/2023/09/unnamed.png" class="kg-image" alt="Toothbrush" loading="lazy" width="1938" height="1587" srcset="https://edanweis.co/content/images/size/w600/2023/09/unnamed.png 600w, https://edanweis.co/content/images/size/w1000/2023/09/unnamed.png 1000w, https://edanweis.co/content/images/size/w1600/2023/09/unnamed.png 1600w, https://edanweis.co/content/images/2023/09/unnamed.png 1938w" sizes="(min-width: 720px) 720px"><figcaption>Morphing from a hexagonal mesh to separated circular holes, whilst conserving cross-sectional surface area.</figcaption></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/voronoibrush1.png" class="kg-image" alt="Toothbrush"
loading="lazy" width="1126" height="1303" srcset="https://edanweis.co/content/images/size/w600/2023/09/voronoibrush1.png 600w, https://edanweis.co/content/images/size/w1000/2023/09/voronoibrush1.png 1000w, https://edanweis.co/content/images/2023/09/voronoibrush1.png 1126w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/09/Capture.jpg" class="kg-image" alt="Toothbrush" loading="lazy" width="1486" height="1161" srcset="https://edanweis.co/content/images/size/w600/2023/09/Capture.jpg 600w, https://edanweis.co/content/images/size/w1000/2023/09/Capture.jpg 1000w, https://edanweis.co/content/images/2023/09/Capture.jpg 1486w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-video-card kg-width-wide"><div class="kg-video-container"><video src="https://edanweis.co/content/media/2023/09/VID_20190813_222236.mp4" poster="https://img.spacergif.org/v1/1920x1080/0a/spacer.png" width="1920" height="1080" loop autoplay muted playsinline preload="metadata" style="background: transparent url(&apos;https://edanweis.co/content/images/2023/09/media-thumbnail-ember315.jpg&apos;) 50% 50% / cover no-repeat;"></video><div class="kg-video-overlay"><button class="kg-video-large-play-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/></svg></button></div><div class="kg-video-player-container kg-video-hide"><div class="kg-video-player"><button class="kg-video-play-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/></svg></button><button class="kg-video-pause-icon kg-video-hide"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><rect x="3" y="1" width="7" height="22" rx="1.5" 
ry="1.5"/><rect x="14" y="1" width="7" height="22" rx="1.5" ry="1.5"/></svg></button><span class="kg-video-current-time">0:00</span><div class="kg-video-time">/<span class="kg-video-duration"></span></div><input type="range" class="kg-video-seek-slider" max="100" value="0"><button class="kg-video-playback-rate">1&#xD7;</button><button class="kg-video-unmute-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M15.189 2.021a9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h1.794a.249.249 0 0 1 .221.133 9.73 9.73 0 0 0 7.924 4.85h.06a1 1 0 0 0 1-1V3.02a1 1 0 0 0-1.06-.998Z"/></svg></button><button class="kg-video-mute-icon kg-video-hide"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M16.177 4.3a.248.248 0 0 0 .073-.176v-1.1a1 1 0 0 0-1.061-1 9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h.114a.251.251 0 0 0 .177-.073ZM23.707 1.706A1 1 0 0 0 22.293.292l-22 22a1 1 0 0 0 0 1.414l.009.009a1 1 0 0 0 1.405-.009l6.63-6.631A.251.251 0 0 1 8.515 17a.245.245 0 0 1 .177.075 10.081 10.081 0 0 0 6.5 2.92 1 1 0 0 0 1.061-1V9.266a.247.247 0 0 1 .073-.176Z"/></svg></button><input type="range" class="kg-video-volume-slider" max="100" value="100"></div></div></div></figure><figure class="kg-card kg-video-card kg-width-wide kg-card-hascaption"><div class="kg-video-container"><video src="https://edanweis.co/content/media/2023/09/Genetic-algorithm-toothbrush.mp4" poster="https://img.spacergif.org/v1/3840x2160/0a/spacer.png" width="3840" height="2160" loop autoplay muted playsinline preload="metadata" style="background: transparent url(&apos;https://edanweis.co/content/images/2023/09/media-thumbnail-ember297.jpg&apos;) 50% 50% / cover no-repeat;"></video><div class="kg-video-overlay"><button class="kg-video-large-play-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 
2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/></svg></button></div><div class="kg-video-player-container kg-video-hide"><div class="kg-video-player"><button class="kg-video-play-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/></svg></button><button class="kg-video-pause-icon kg-video-hide"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><rect x="3" y="1" width="7" height="22" rx="1.5" ry="1.5"/><rect x="14" y="1" width="7" height="22" rx="1.5" ry="1.5"/></svg></button><span class="kg-video-current-time">0:00</span><div class="kg-video-time">/<span class="kg-video-duration"></span></div><input type="range" class="kg-video-seek-slider" max="100" value="0"><button class="kg-video-playback-rate">1&#xD7;</button><button class="kg-video-unmute-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M15.189 2.021a9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h1.794a.249.249 0 0 1 .221.133 9.73 9.73 0 0 0 7.924 4.85h.06a1 1 0 0 0 1-1V3.02a1 1 0 0 0-1.06-.998Z"/></svg></button><button class="kg-video-mute-icon kg-video-hide"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M16.177 4.3a.248.248 0 0 0 .073-.176v-1.1a1 1 0 0 0-1.061-1 9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h.114a.251.251 0 0 0 .177-.073ZM23.707 1.706A1 1 0 0 0 22.293.292l-22 22a1 1 0 0 0 0 1.414l.009.009a1 1 0 0 0 1.405-.009l6.63-6.631A.251.251 0 0 1 8.515 17a.245.245 0 0 1 .177.075 10.081 10.081 0 0 0 6.5 2.92 1 1 0 0 0 1.061-1V9.266a.247.247 0 0 1 .073-.176Z"/></svg></button><input type="range" class="kg-video-volume-slider" max="100" value="100"></div></div></div><figcaption>Using a genetic algorithm to minimise the variation between surface areas whilst maximising distance between 
points.</figcaption></figure><figure class="kg-card kg-video-card kg-width-wide kg-card-hascaption"><div class="kg-video-container"><video src="https://edanweis.co/content/media/2023/09/ejector-pins.mp4" poster="https://img.spacergif.org/v1/1920x1080/0a/spacer.png" width="1920" height="1080" loop autoplay muted playsinline preload="metadata" style="background: transparent url(&apos;https://edanweis.co/content/images/2023/09/media-thumbnail-ember327.jpg&apos;) 50% 50% / cover no-repeat;"></video><div class="kg-video-overlay"><button class="kg-video-large-play-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/></svg></button></div><div class="kg-video-player-container kg-video-hide"><div class="kg-video-player"><button class="kg-video-play-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/></svg></button><button class="kg-video-pause-icon kg-video-hide"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><rect x="3" y="1" width="7" height="22" rx="1.5" ry="1.5"/><rect x="14" y="1" width="7" height="22" rx="1.5" ry="1.5"/></svg></button><span class="kg-video-current-time">0:00</span><div class="kg-video-time">/<span class="kg-video-duration"></span></div><input type="range" class="kg-video-seek-slider" max="100" value="0"><button class="kg-video-playback-rate">1&#xD7;</button><button class="kg-video-unmute-icon"><svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M15.189 2.021a9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h1.794a.249.249 0 0 1 .221.133 9.73 9.73 0 0 0 7.924 4.85h.06a1 1 0 0 0 1-1V3.02a1 1 0 0 0-1.06-.998Z"/></svg></button><button class="kg-video-mute-icon kg-video-hide"><svg 
xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24"><path d="M16.177 4.3a.248.248 0 0 0 .073-.176v-1.1a1 1 0 0 0-1.061-1 9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h.114a.251.251 0 0 0 .177-.073ZM23.707 1.706A1 1 0 0 0 22.293.292l-22 22a1 1 0 0 0 0 1.414l.009.009a1 1 0 0 0 1.405-.009l6.63-6.631A.251.251 0 0 1 8.515 17a.245.245 0 0 1 .177.075 10.081 10.081 0 0 0 6.5 2.92 1 1 0 0 0 1.061-1V9.266a.247.247 0 0 1 .073-.176Z"/></svg></button><input type="range" class="kg-video-volume-slider" max="100" value="100"></div></div></div><figcaption>First CNC machined aluminium mold-maker for slip casting mold creation</figcaption></figure>]]></content:encoded></item><item><title><![CDATA[Adelaide Hills Wine Region]]></title><description><![CDATA[An interactive map of the Adelaide Hills wineries using geospatial and interactive 3D elements. ]]></description><link>https://edanweis.co/ahwr/</link><guid isPermaLink="false">64d86378ad83631f67e07a35</guid><category><![CDATA[commercial]]></category><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sun, 13 Aug 2023 05:01:14 GMT</pubDate><media:content url="https://edanweis.co/content/images/2023/08/ahwr.png" medium="image"/><content:encoded><![CDATA[<img src="https://edanweis.co/content/images/2023/08/ahwr.png" alt="Adelaide Hills Wine Region"><p><strong>Employer:</strong> Urban &amp; Public</p><p>An interactive map of the Adelaide Hills wineries using geospatial and interactive 3D elements. 
</p><p> <br></p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/fY7LA4S5d28?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen title="Adelaide Hills Wine Region | Urban&amp;Public"></iframe></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/05/image-16.png" class="kg-image" alt="Adelaide Hills Wine Region" loading="lazy" width="1694" height="1200" srcset="https://edanweis.co/content/images/size/w600/2023/05/image-16.png 600w, https://edanweis.co/content/images/size/w1000/2023/05/image-16.png 1000w, https://edanweis.co/content/images/size/w1600/2023/05/image-16.png 1600w, https://edanweis.co/content/images/2023/05/image-16.png 1694w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/05/image-12.png" class="kg-image" alt="Adelaide Hills Wine Region" loading="lazy" width="1468" height="1022" srcset="https://edanweis.co/content/images/size/w600/2023/05/image-12.png 600w, https://edanweis.co/content/images/size/w1000/2023/05/image-12.png 1000w, https://edanweis.co/content/images/2023/05/image-12.png 1468w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/05/image-13.png" class="kg-image" alt="Adelaide Hills Wine Region" loading="lazy" width="1460" height="1008" srcset="https://edanweis.co/content/images/size/w600/2023/05/image-13.png 600w, https://edanweis.co/content/images/size/w1000/2023/05/image-13.png 1000w, https://edanweis.co/content/images/2023/05/image-13.png 1460w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/05/image-14.png" class="kg-image" alt="Adelaide Hills Wine Region" loading="lazy" width="1456" 
height="1020" srcset="https://edanweis.co/content/images/size/w600/2023/05/image-14.png 600w, https://edanweis.co/content/images/size/w1000/2023/05/image-14.png 1000w, https://edanweis.co/content/images/2023/05/image-14.png 1456w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/05/image-15.png" class="kg-image" alt="Adelaide Hills Wine Region" loading="lazy" width="1442" height="1008" srcset="https://edanweis.co/content/images/size/w600/2023/05/image-15.png 600w, https://edanweis.co/content/images/size/w1000/2023/05/image-15.png 1000w, https://edanweis.co/content/images/2023/05/image-15.png 1442w" sizes="(min-width: 720px) 720px"></figure>]]></description><link>https://edanweis.co/ahwr/</link><guid isPermaLink="false">64d86378ad83631f67e07a35</guid><category><![CDATA[commercial]]></category><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sun, 13 Aug 2023 05:01:14 GMT</pubDate><media:content url="https://edanweis.co/content/images/2023/08/ahwr.png" medium="image"/><content:encoded><![CDATA[<img src="https://edanweis.co/content/images/2023/08/ahwr.png" alt="Adelaide Hills Wine Region"><p><strong>Employer:</strong> Urban &amp; Public</p><p>An interactive map of the Adelaide Hills wineries using geospatial and interactive 3D elements.
GMT</pubDate><media:content url="https://edanweis.co/content/images/2023/08/InventionLabALLASPECT_03-01-1.png" medium="image"/><content:encoded/></item><item><title><![CDATA[Knowledge Platform]]></title><link>https://edanweis.co/knowledge-platform/</link><guid isPermaLink="false">64d635abad83631f67e078c7</guid><category><![CDATA[commercial]]></category><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Sat, 12 Aug 2023 00:14:16 GMT</pubDate><media:content url="https://edanweis.co/content/images/2023/08/2023-08-13_14-58.png" medium="image"/><content:encoded/></item><item><title><![CDATA[Network visualisation & analysis]]></title><description><![CDATA[<p>Images of various social and large bipartite network analyses of projects and botanical data.</p><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/08/2023-08-11_23-59.png" class="kg-image" alt loading="lazy" width="2000" height="1696" srcset="https://edanweis.co/content/images/size/w600/2023/08/2023-08-11_23-59.png 600w, https://edanweis.co/content/images/size/w1000/2023/08/2023-08-11_23-59.png 1000w, https://edanweis.co/content/images/size/w1600/2023/08/2023-08-11_23-59.png 1600w, https://edanweis.co/content/images/2023/08/2023-08-11_23-59.png 2330w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/08/2023-08-12_00-02.png" class="kg-image" alt loading="lazy" width="2000" height="1578" srcset="https://edanweis.co/content/images/size/w600/2023/08/2023-08-12_00-02.png 600w, https://edanweis.co/content/images/size/w1000/2023/08/2023-08-12_00-02.png 1000w, https://edanweis.co/content/images/size/w1600/2023/08/2023-08-12_00-02.png 1600w, https://edanweis.co/content/images/2023/08/2023-08-12_00-02.png 2236w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/08/2023-08-12_00-02_1.png" class="kg-image" alt loading="lazy" 
width="2000" height="1426" srcset="https://edanweis.co/content/images/size/w600/2023/08/2023-08-12_00-02_1.png 600w, https://edanweis.co/content/images/size/w1000/2023/08/2023-08-12_00-02_1.png 1000w, https://edanweis.co/content/images/size/w1600/2023/08/2023-08-12_00-02_1.png 1600w, https://edanweis.co/content/images/size/w2400/2023/08/2023-08-12_00-02_1.png 2400w" sizes="(min-width: 720px) 720px"></figure>]]></description><link>https://edanweis.co/untitled/</link><guid isPermaLink="false">64d63ea6ad83631f67e078d7</guid><dc:creator><![CDATA[edanweis.co]]></dc:creator><pubDate>Fri, 11 Aug 2023 14:04:55 GMT</pubDate><media:content url="https://edanweis.co/content/images/2023/08/2023-08-12_00-04.png" medium="image"/><content:encoded><![CDATA[<img src="https://edanweis.co/content/images/2023/08/2023-08-12_00-04.png" alt="Network visualisation &amp; analysis"><p>Images of various social and large bipartite network analyses of projects and botanical data.</p><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/08/2023-08-11_23-59.png" class="kg-image" alt="Network visualisation &amp; analysis" loading="lazy" width="2000" height="1696" srcset="https://edanweis.co/content/images/size/w600/2023/08/2023-08-11_23-59.png 600w, https://edanweis.co/content/images/size/w1000/2023/08/2023-08-11_23-59.png 1000w, https://edanweis.co/content/images/size/w1600/2023/08/2023-08-11_23-59.png 1600w, https://edanweis.co/content/images/2023/08/2023-08-11_23-59.png 2330w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/08/2023-08-12_00-02.png" class="kg-image" alt="Network visualisation &amp; analysis" loading="lazy" width="2000" height="1578" srcset="https://edanweis.co/content/images/size/w600/2023/08/2023-08-12_00-02.png 600w, https://edanweis.co/content/images/size/w1000/2023/08/2023-08-12_00-02.png 1000w, 
https://edanweis.co/content/images/size/w1600/2023/08/2023-08-12_00-02.png 1600w, https://edanweis.co/content/images/2023/08/2023-08-12_00-02.png 2236w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://edanweis.co/content/images/2023/08/2023-08-12_00-02_1.png" class="kg-image" alt="Network visualisation &amp; analysis" loading="lazy" width="2000" height="1426" srcset="https://edanweis.co/content/images/size/w600/2023/08/2023-08-12_00-02_1.png 600w, https://edanweis.co/content/images/size/w1000/2023/08/2023-08-12_00-02_1.png 1000w, https://edanweis.co/content/images/size/w1600/2023/08/2023-08-12_00-02_1.png 1600w, https://edanweis.co/content/images/size/w2400/2023/08/2023-08-12_00-02_1.png 2400w" sizes="(min-width: 720px) 720px"></figure>]]></content:encoded></item></channel></rss>