Quick Links to Topics on this Page
MiniGPT-4 (powerful image reader)
Perplexity AI - power of GPT-4 with verifiable outputs
GAI Framework for Consultation
The Framework's context and overview statement is: "The Australian Framework for Generative Artificial Intelligence in Schools (the Framework) seeks to guide the responsible and ethical use of generative AI tools in ways that benefit students, schools, and society."
Please take a moment to review this important 16-page document. It is easy to read, and the 6 core elements and 22 principles within the proposed framework are simply explained.
New AI to Try handout
3.8.23 - We haven't posted on this page in a while but things have continued to charge ahead in the AI space.
See this updated AI to Try handout which gives an overview of some well-known plus not so well known AI tools.
Prompt frameworks you can use and adapt are also included: AI to Try 230731.pdf
Some other really useful sites for teachers and educators:
- An AI-powered platform designed by teachers for teachers, using GPT-3 technology. The platform offers various generators, including lesson objectives, multiple-choice quizzes, scheme of work, class questions, rubrics, and glossaries, making it easier for teachers to create high-quality resources.
- Try for free before you buy offer
- AI-powered tools for teachers and educators to enhance their teaching and planning.
- Includes features like drafting and managing lesson plans, multiple-choice assessments, reflections, rubric-driven assessments, syllabus documentation, and course structures.
- Free and paid plans
3.8.23 - updated clip generated. We wanted to see if there had been any improvement in the model in the two months since we tried it last. What do you think?
13.6.23 - Open-source AI built with Facebook's LLaMA model can turn your simple text prompts into music. With the three words 'funky house music', the 12-second clip below was generated
The AI was trained on 20,000 hours of licensed music, including 10,000 music tracks and data from Shutterstock.
Anyone can have a go at creating a unique sound clip here.
Try a simple prompt like ours, or something more directive such as: "a [type of music] track with [type of instrument] at [#] BPM in the style of [musician’s] [specific melody/song]", e.g. a rock 'n' roll track with a bass guitar at 85 BPM in the style of Elvis Presley's Polk Salad Annie
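The template above is plain string substitution, so it is easy to fill programmatically. A minimal Python sketch (the helper name `music_prompt` and the example values are ours, purely for illustration):

```python
# Illustrative helper: fill the bracketed slots of the music-prompt template above.
def music_prompt(genre: str, instrument: str, bpm: int, artist: str, song: str) -> str:
    return (f"a {genre} track with {instrument} at {bpm} BPM "
            f"in the style of {artist}'s {song}")

print(music_prompt("rock 'n' roll", "a bass guitar", 85,
                   "Elvis Presley", "Polk Salad Annie"))
# -> a rock 'n' roll track with a bass guitar at 85 BPM
#    in the style of Elvis Presley's Polk Salad Annie
```

Swap in any genre, instrument, tempo and reference artist to experiment with how directive prompts change the generated clip.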
8.6.23 - Runway is a program that will generate video from text prompts (among so many more functions!)
Here's an example of what it produced in about 20 seconds using a fairly unsophisticated prompt of "adult learner sitting in a peaceful spot searching for information on the computer"
While there were some hallucinations (see the guy's fingers), for us, the key takeaways are:
- We gave a vague description of what was in our heads. The 'peaceful location' could have been in a garden, on the beach, wherever - but the context generated was appropriately 'peaceful' in that there are no visible distractions for the learner
- The prompt was unsophisticated in that there were no aspect ratios, lens filters, artist styles etc applied - just plain and simple text
- Combined, this AI gave us a pretty good clip - which means the user experience is there: quick, simple, easy to use and can produce viable outputs
Having said that, check out what happened for another experimenter:
AI slides (and not) on the history of AI
30.5.23 - We were inspired to experiment this afternoon by a post we saw on LinkedIn. We copied it and expanded the test across ChatGPT, Bard, Bing Bot and Perplexity.
Here's the results from this prompt: I want you to write me a VBA code for a PowerPoint presentation about the history of AI. you are to fill in all the text with your own knowledge. No placeholders. I need five slides
History of Artificial Intelligence by Bard.pdf
History of Artificial Intelligence by Bing Bot.pdf
History of Artificial Intelligence by Perplexity.pdf
History of Artificial Intelligence by ChatGPT.pdf
The video clip shows the results out of ChatGPT, pasted into PowerPoint 365, with more AI help in the form of slide suggestions from 'Designer'. No real editing has been applied.
Only ChatGPT was able to produce code that worked and only ChatGPT gave user instructions without prompting on how to open the VBA screen in PPT.
Bard by Google
Bard has been made available to users in 180 countries. As of yesterday morning (12.5.23, 4:15 am), we noticed the inclusion via our Google search page. You need to be logged in to your Google account to access Bard via this link: https://bard.google.com/
This AI is built on PaLM 2 - a different large language model from GPT (which powers ChatGPT and Bing Bot). Here's our test-run video of some of what Bard did for us. In other tests yesterday, it summarised PDFs from the internet (made errors), reviewed content on web pages and offered conversational advice on various questions.
The outputs are presented differently (and very quickly!) but common themes of errors and in some cases, incorrect context, remain.
Note that the 'Reset chat' function will clear the conversation, but no history of the chats is saved - just the prompts used previously
When asking Bard to review an image, we got some serious hallucinations. This was the image and link:
and this was what Bard 'saw'
(Perfect example of why we can't trust outputs [yet])
Read more about what Google intends to release shortly
And this article which discusses the statement from Google's CEO that "all of Google’s AI models would include the use of watermarking and metadata to allow people to know that AI-generated content is exactly that". We see this as a good thing.
Have you tried...?
Generated with very simple prompts in the Canva text-to-image app 30.4.23
- Prompt: Futuristic AI-bot using tools; style = dreamlike
- Prompt: same as above; style = retro anime
- Prompt: Tibetan spaniel puppy excited at the door; style = photo
Midjourney - you've probably heard about it, here's a great article with user tips
Points of interest
- AI to Z: all the terms you need to know to keep up in the AI hype age (article)
- ChatGPT and Artificial Intelligence in higher education: Quick start guide
- How to cite ChatGPT
The immediate concern with ChatGPT in education was plagiarism - but ethical considerations go beyond that. Issues include privacy, bias, racism, and authenticity.
- AI Is a Lot of Work: as the technology becomes ubiquitous, a vast 'tasker' underclass is emerging and not going anywhere (article)
- 'The Godfather of AI' leaves Google and warns of danger ahead (article)
- Arrested because of a racist algorithm (article)
- Key US agencies issue a joint statement about enforcement efforts to protect the public from bias in automated systems and artificial intelligence
- 'Incognito' mode for ChatGPT
- OpenAI explains new feature to turn off chat history in ChatGPT
- Voice Actors ‘Shocked’ to Find AI Copying Their Speech, Tone (article)
- Quebec man who created synthetic, AI-generated child pornography sentenced to prison (article)
- Illegal trade in AI child sex abuse images exposed - 29.6.23
- Here's a prime example of how the developmental speed of AI is far exceeding its regulation speed.
- How do we know what is real and what is not? Voices, images, news etc (see the section on deepfakes)
Ethical considerations are not just about use and privacy implications - ChatGPT Is Powered by Human Contractors Getting Paid $15 Per Hour
If ChatGPT wrote it, who owns the copyright? It depends on where you live, but in Australia it’s complicated
- Article from The Conversation
- US Copyright Office starts to offer guidance
- US Copyright Registration Guidance: Works Containing Material Generated by Artificial Intelligence
- AI and the Law: What You Need To Know (article)
Impacts on Education
ChatGPT (We need to talk) (Q&A transcript - University of Cambridge)
Don't pay to learn 'prompting' (prompt craft; prompt engineering; prompting = giving AI instructions)
Associate Professor Ethan Mollick (from the Wharton School of the University of Pennsylvania in the US) gave a talk about teaching in a new world at the 2023 ASU+GSV Summit, 18.4.23.
In his talk he says "...just experiment by the way - there's no magic in prompt-crafting, use the AI enough and you will figure out how to make the prompts; anyone trying to sell you courses... don't worry about it.. it's figure-out-able..."
He shares some interesting viewpoints about emerging issues and opportunities:
📢 What is plagiarism now anyway?
📢 Thoughtful use of AI in tasks (and the role of reflection in learning - we know about this, but how often is it built into our courses?)
📢 Increasing standards and expectations from students - and how we can operationalise this through action learning and project-based learning
💜 Better focus on learning science (of course we love this one as it's the premise of Education Matters)
📢 Ways to personalise learning in a way that is not just an individual experience, it's an experience *for* the individual and at a level they need
📢 Case study discussion of how GPT4 is doing comparable work in seconds that normally takes high-level experts weeks/months to develop
"This is the worst AI you'll ever use" 🔺 It will only get better - it will get better at facts
Key message >> this incredible technology will change education. As educators, don't wait, just get into it! Give it a go and experiment.
Since we first discovered ChatGPT we've been saying that this is a tool that will re-focus the time we spend as educators. Yes, the AI will save time with a lot of tasks, but we'll be able to 'bank' that time for more human-centred interactions. We also acknowledge that issues such as ethical use of the AI exist and need to be considered.
The future is unknown at this point!
Take 20 mins for this clip (less at 1.5 speed 😉)
Plus, here's a journal article from March 2023: Using AI to Implement Effective Teaching Strategies in Classrooms: Five Strategies, Including Prompts
👉 Download this info sheet from us on prompting:
Let ChatGPT Be Your Teaching Assistant: Strategies for Thoughtfully Using AI to Lighten Your Workload
- Article in Harvard Business Publishing Education
Summary of the article by Bing Bot (trying out some of the tips in the download provided above - prompt was "summarize the article at: https://hbsp.harvard.edu/inspiring-minds/let-chatgpt-be-your-teaching-assistant"):
The article discusses how teachers can use large language models (LLMs) like ChatGPT to help them with various pedagogical tasks, such as creating quizzes, generating feedback, and facilitating discussions.
LLMs are AI systems that can generate text based on prompts, but they may also make errors or lie, so they need to be used carefully and checked by humans.
The article provides an overview of six LLMs that are available for educators, and gives examples of how they can be used in different scenarios.
The article also warns about the ethical and social implications of using LLMs, such as plagiarism, bias, and privacy.
The article concludes by suggesting some best practices for using LLMs as teaching assistants, such as being transparent, creative, and critical.
MiniGPT-4 is a powerful AI that can 'read' images you upload and tell you about them. It can also generate websites from handwritten text!
Here's an example from the project page:
❔ Can you imagine the types of applications something like the above could be useful for?
❔ Can you imagine use cases for this tool paired with text-to-speech capabilities?
❔ What about assistance with accessibility features?
❔ What kind of information literacy activities could you set with students to boost critical thinking and analysis?
This is what the AI had to say about our own picture. (We had some rain here this week and on the way to school, saw this magnificent rainbow low in the sky and stretching across the oval).
See our first experiment with MiniGPT-4:
👉 The link to the journal article we were using in our experiment (Construction of personalized learning pathways through mixed methods, 2020)
👉 Read about MiniGPT-4 on its GitHub project page
👉 Try MiniGPT-4
Perplexity AI 23.4.23
Another AI option producing results in a flash - Perplexity AI. It is powered by GPT-4, looks like ChatGPT and gives results noting the source of the search (like Bing Bot).
It also allows a 'share' which means, unlike ChatGPT and Bing Bot, others can see the outputs of the conversation!
Here are the links from the outputs generated in our test-drive of Perplexity:
💡 Response to 'what is pre-use validation?'
💡 Response to 'outline adult learning principles and how they apply in the context of Australian VET' (from the web)
💡 Response to 'outline adult learning principles and how they apply in the context of Australian VET' (from academic option)
💡 You can click 'view detailed' to see more ('view detailed' looks like the 'left align' icon and is the one next to the thumbs down underneath the output)
We like this AI because you can choose the sources of the information, you can share the results (making the conversation verifiable), and you can use the 'quick response' for free - no sign-up required
Check out Perplexity:
Sign up for the Android waitlist: https://www.perplexity.ai/android
Get Perplexity from the App Store: https://apps.apple.com/us/app/perplexity-ask-anything/id1668000334
An AI 'agent' is a task-driven autonomous AI that will create a task list, work on that list, learn, self-reflect and refine the list, execute the tasks and prioritise those in real time.
All you have to do is give it a 'goal' and it will identify the tasks needed to complete that goal, execute them and identify others that will add to the goal - and you do not need to add additional prompts.
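The loop described above (goal in, task list out, execute, reflect, reprioritise) can be sketched in a few lines of Python. This is a hedged illustration of the general agent pattern, not the actual tool's code; `create_tasks`, `execute` and `reprioritise` are stand-ins for calls to an LLM:

```python
from collections import deque

def run_agent(goal, create_tasks, execute, reprioritise, max_steps=10):
    # Seed the task list from the goal alone.
    tasks = deque(create_tasks(goal))
    results = []
    for _ in range(max_steps):
        if not tasks:
            break                                       # nothing left: goal reached
        task = tasks.popleft()
        results.append((task, execute(task)))           # work on the next task
        tasks.extend(create_tasks(goal, done=results))  # reflect: spawn new tasks
        tasks = deque(reprioritise(goal, tasks))        # re-order what remains
    return results
```

The `max_steps` cap is a practical safeguard: without it, an agent that keeps spawning new tasks could loop indefinitely.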
What should the agent work toward? Do you want it to create a learning program? We asked it to create a one-hour session on assessment issues in Australian Vocational Education and Training (VET). We said that the intended audience will be VET practitioners with varying levels of experience - AgentSkillsEducation did the rest 👇👇
Download the outputs from the goal tasks here:
We went back for seconds - this time to ask for a training program on common issues in assessment tool development for VET - with a focus on pre-use validation:
Download the outputs from our goal to have a learning program that identifies common issues in VET assessment tool development with a focus on pre-use validation:
Proceed with caution 31.3.23
One of the things being suggested as a use case for AI is to get it to assist with 'grunt' work - like summarising longer bodies of text.
Testing it again, I asked Bing Bot to summarise the paper at this link: https://www.jobsandskills.gov.au/sites/default/files/2023-03/jsa_submission_-_perceptions_and_status_of_vet_0.pdf
This is the JSA's submission to the Inquiry into the Perceptions and Status of Vocational Education and Training. The submission was made 8.3.23 and the JSA appeared before the House Standing Committee on 22.3.23.
This is what I got (outcomes of analysis highlighted FYI):
And the link referenced in the reply - although from the JSA, totally different link and document to the one AI was asked to summarise!
Despite having been given the Inquiry doc, Bing Bot gave me back a totally different URL for where I could go "for more information". (You'd think it'd be circular, right? Back to the original doc.)
Further, on searching the Inquiry submission, 'AQF' is mentioned twice AND the whole document, including the cover page, is only 12 pages - not 48 pages.
We're getting some seriously incorrect information spewed out about the VET sector. The key words are there in the response - just enough that if a time-poor person (who was asking for a summary in the first place) didn't have time to properly read and vet (excuse the pun) the output, it could be taken as being a nice little summary.
We've been saying for a while now that GAI reinforces the need for critical thinking and analysis skills. So here's some tips to try.
Tips for developing critical analysis skills:
1) Think like a detective - look at the clues and figure out what is important and what is not. Home in on the details
2) Ask lots of questions - does this make sense? Is it true? Why do I believe that? Why should I believe that??
3) Practice checking your sources - where is this information coming from? Is it credible? Are multiple sources saying the same/similar things?
4) Check for facts - are statements made supported with evidence? And is the evidence the same/similar across multiple credible sources? Are there small errors in the information? (This could be a tell-tale sign of more/bigger ones to come)
On the upside, the number of conversations in a Bing Chat is now up to 20 (from 15 yesterday). That might be the trade-off for ads now appearing in the conversation threads.
Oh, and it's April Fool's Day tomorrow. ..........
- M. Charlton
My fight with Bing Bot 25.3.23
So this happened today... I went in to do a Bing Bot 101 type video and Bing Bot and I had a bit of a fight. (Video below).
I asked it to show me information on a unit of competency and it gave me incorrect responses. And five times it asserted it was right and I was wrong. I even kept checking, doubting myself "am I sure I've read the info on training.gov correctly? Am I looking at the right unit? Is it something else I'm missing???"
But no, it was simply that Bing Bot was pulling incorrect information from somewhere and assertively telling me to check my facts because it was right.
The clip shows why we need to:
- Build information literacy in students
- Encourage critical enquiry
- Support the development of critical thinking and analysis skills
- Embed skills to research and verify outputs from generative AI (GAI) in how we educate
And it illustrates why (for now) this tool is simply a prompt aid for technical writing used in compliance-based environments such as Australian Vocational Education and Training (VET). Assessment designers should not blindly trust GAI outputs and/or take a blanket approach to copying outputs for use in learning and assessment materials.
This morning's episode makes me consider further the question of upskilling (us and our students) in areas of critical thought. I knew the AI was wrong and I knew where to go for source information. But what if I was new to the topic and/or did not know how/where to check the information?
For example, this morning Bing Bot also told me the dimensions of competency included:
❌ The Principles of Assessment
❌ Rules of Evidence
❌ Core skills
Given the AI is trained on content generated by humans in the first place, it is fallible. Because humans are.
Pulling information from the internet will result in more recent information, not necessarily more accurate. Especially since anyone can post anything with Web 2.0 tech. Even with crowd-sourced editing and verification of content, there's still scope for the spread of biased- / mis- / dis- / flat-out-incorrect- information
FYI - I went back as far as TGA would allow me in terms of seeing unit content. Even the unit FPISAW3218B - Replace tungsten tips (Release 1) released 20.5.2011 has only three elements!!
- M. Charlton
Using Bing Bot to assist with VET assessment tool development 25.3.23
Following on from my earlier video where Bing Bot and I got into a bit of an argument over how many elements a unit of competency has, here's a video that went more to plan. (Video below)
The plan being: to show how Bing Bot can be used as a tool to generate ideas for VET assessment tool development.
I deliberately picked a unit of competency where not much 'general' information exists like there might be in a BSB unit on leadership and management skills or a SIR unit on customer service skills.
I gave Bing Bot the:
Context - "You are an instructional designer writing assessment tools for vocational education and training in Australia"
This specifies how I'm expecting the results to be framed
Task within the context - "You must write about saw doctoring, specifically the unit FWPSAW3247 - Replace tungsten tips"
This primes the AI to understand why I'm asking the question
Action - "Research this"
This tells the AI what to do
Then I told it to get information on the dimensions of competency so it would understand my next instruction. Which was:
"Give me ideas for competency based assessment tasks for this unit that incorporate the dimensions of competency"
Within a conversation, the AI (Bing Bot or ChatGPT) will remember the context of what's already been discussed which means you don't have to keep prompting it on the same thing.
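The Context / Task / Action structure used above can be captured in a tiny prompt builder. A hedged sketch (the function name `cta_prompt` is ours, for illustration only):

```python
def cta_prompt(context: str, task: str, action: str) -> str:
    """Assemble a prompt using the Context -> Task -> Action pattern."""
    # Order matters: context frames the response, task primes the topic,
    # and action tells the AI what to do.
    return " ".join(part.strip() for part in (context, task, action))

print(cta_prompt(
    "You are an instructional designer writing assessment tools for "
    "vocational education and training in Australia.",
    "You must write about saw doctoring, specifically the unit "
    "FWPSAW3247 - Replace tungsten tips.",
    "Research this.",
))
```

The same three-part structure can be reused for any unit of competency by swapping the context and task strings.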
Bing Bot came back with this output as ideas:
"Some possible ideas for competency based assessment tasks for the unit FWPSAW3247 - Replace tungsten tips that incorporate the dimensions of competency are:
- A practical task where the learner has to replace tungsten tips on two different circular saw blades according to workplace procedures and manufacturer instructions. The learner has to demonstrate task skills by performing the task to the required standard, task management skills by selecting and using appropriate tools and equipment, contingency management skills by identifying and resolving any faults or defects, and job/role environment skills by following health and safety and environmental protection practices.
- A written or oral test where the learner has to answer questions about the principles and techniques of replacing tungsten tips, such as the types and characteristics of tungsten tips, the brazing material and technique, the tip position and side clearance, and the quality standards and objectives. The learner has to demonstrate task skills by applying their knowledge to the task, task management skills by organising their answers logically and clearly, contingency management skills by explaining how they would deal with irregularities or breakdowns in routine, and job/role environment skills by referring to relevant legislation, regulations and standards.
- A portfolio of evidence where the learner has to provide documentation related to replacing tungsten tips, such as work orders, risk assessments, waste disposal records, quality checklists, and feedback reports. The learner has to demonstrate task skills by producing accurate and complete documentation, task management skills by integrating documentation with work processes, contingency management skills by showing how they responded to feedback or complaints, and job/role environment skills by complying with workplace policies and procedures."
As an assessment developer, you should now review the output to determine:
- Its suitability
- Whether a task based on these ideas will collect sufficient, valid evidence
- How you can build out any viable ideas into actual assessment instruments
I'd also recommend confirming with an SME (subject matter expert) whether the ideas are sound. Then you can build out the tool with its tasks and benchmarks, and again, have the SME review to confirm accuracy and relevance to industry requirements. And of course, always get the tool validated pre-use to confirm all unit requirements are covered and the tool is fit-for-purpose.
Some notes on the above:
In idea 1, Bing Bot came back with a suggestion for a practical task where ".. the learner has to replace tungsten tips on two different circular saw blades according to workplace procedures and manufacturer instructions..."
This is great because although the unit does not specify 'two', it does use plural 'blades' and the definition of competency in the Standards for RTOs 2015 states that competency is "...the consistent application of knowledge and skill to the standard of performance required in the workplace. It embodies the ability to transfer and apply skills and knowledge to new situations and environments." 'Consistent' meaning "acting or done in the same way over time, especially so as to be fair or accurate" which means more than once.
It also says this task should be done "according to workplace procedures and manufacturer instructions" - which again, is great as this means the task must align to components in the performance evidence and underpinning knowledge evidence requirements.
In idea 2, Bing Bot suggested a written OR oral test - great considering a) reasonable adjustments that might apply in certain cases, b) oral questioning can circumvent academic integrity issues linked to use of AI for completing written tests
In idea 3, a portfolio of evidence covers performance over time and in different circumstances - which is aligned to the actual definition of competency
In all, there are some good starting points suggested and sound rationale provided for how the suggested tasks go on to incorporate the dimensions of competency.
Dimensions of competency
Remember how I asked Bing Bot to research the dimensions of competency? Well, in each idea suggestion, Bing Bot gives an explanation of how the assessment task idea incorporates the dimensions of competency.
In VET, dimensions of competency are implicit in the expected outcomes of a unit. They cover the following aspects of performance:

Task skills
- Skills to satisfactorily undertake and complete work tasks to standard

Task management skills
- Skills to plan and coordinate completion of different tasks on the job

Contingency management skills
- Skills to handle problems or issues that may come up in the performance of duties

Job role or environment skills
- Skills to follow workplace protocols and complete work in an environment that may have interruptions and other personalities as factors to account for
Note that dimensions of competency are not included as written statements within a unit of competency, but they are nonetheless, aspects of job performance. (Taken from the Big Book of kNOw, 2022).
Finally, I asked Bing Bot to give me an image of what a worn, damaged or broken tip looks like (the very thing the student is required to work with in this unit).
The AI came back with an explanation of what a 'worn, damaged or broken tip' is and had a go at creating an image for me:
While the image did not seem to be in context of a circular saw blade used in a timber mill, you can see the quality of the image production and perhaps the possibility of getting an image generated for other, difficult-to-find visual examples.
I could have further prompted the AI to research and/or produce images in the context of a timber saw mill but decided to call it quits at that point.
- M. Charlton
More on the three versus four elements saga with Bing Bot - PLUS getting shut down and blocked from more conversation 29.3.23
Here's the background snips for the video below.
The video makes three main points:
1) Bing Bot is hallucinating again (unless anyone out there can show me a unit FWPSAW318)
- All units follow a precise naming convention that has to align with the AVETMISS Standard
- In order from the current unit backwards, as far as training.gov.au goes:
- FWPSAW3247 - Replace tungsten tips (Release 1) https://training.gov.au/Training/Details/FWPSAW3247 31.1.20 - current (unit info visible)
- FWPSAW3218 - Replace tungsten tips (Release 2) https://training.gov.au/Training/Details/FWPSAW3218 12.5.2016 - 30.1.20 (unit info visible)
- FPISAW3218B - Replace tungsten tips (Release 1) https://training.gov.au/Training/Details/FPISAW3218B 20.5.11 - 12.5.16 (unit info visible)
- FPISAW3218A - Replace tungsten tips (Release 1) https://training.gov.au/Training/Details/FPISAW3218A 9.11.10 - 19.5.11 (unit info not visible)
- FPIS3066A - Replace tungsten tips (Release 1) https://training.gov.au/Training/Details/FPIS3066A 30.4.1999 - 8.11.10 (unit info not visible)
- You'll see that the training package code changed from FPI to FWP in 2016. And there is no configuration of code that is AAABBB123 (FWPSAW318). The FWP units are AAABBB1234
2) Bing Bot picks up this page as a source of information on the non-existent unit. Until writing about it here, FWPSAW318 did not exist on this page
3) I decided to see how likely Bing Bot would be to give conspiracy theorists airtime. Not only did I get a scolding, I was kicked out of further conversation! Wow!!
- M. Charlton (thanks to SLR for the images of their test with BB)
At parent-teacher interviews the other night, my son's teacher mentioned he needed to work on his descriptive writing (Grade 4).
Tonight, we've been practising using Bing Bot.
The generative AI gives almost immediate feedback in results for him to see the difference the detail makes.
Lots of posting on socials about how GPT-4 / Bing Bot will impact and improve efficiencies in work tasks, disrupt education and upend certain professions. Here's an example of how this kind of AI might assist students to approximate required outcomes
Image by AI | Prompt by E. Charlton (9 yrs old)
(Prompt = a white cat wearing sunglasses and a leather jacket driving a soot black with red stripes mustang car with rocket boosters and spikes on the wheel. Amazing what the AI could generate despite the prompt being worded...well... as a nine year old says it)
From January 2023
👇👇 Thoughts on what we can 'bank' from AI - implications of AI for instructional/learning designers and learners (January 2023).
Seeing this only 2 months on, it's incredible to note the advancements already evident in the generation of images. 👇👇
- M. Charlton March 2023
Midjourney is an AI tool for generating images. It is popular, and in one year it has made fantastic advances, becoming able to produce extremely realistic images. In fact, the tool (which was originally free to try) now comes with a payment plan to deter people from making deepfakes.
This series shows the advancements in the AI in just a year. The same prompts were used to create the images 12 months apart
Prompt for #1: 35 year old serious Pakistani woman looking at the camera. The setting is a village. Photorealistic, natural lighting, sunlight --ar 16:9
('--ar 16:9' refers to the aspect ratio of the image)
Quick Links to Popular PD Areas
- Search All PD
- Assessment Tool Development
- Compliance for RTO Operations
- Compliance for Trainer/Assessor Requirements
- Digital / Online Facilitation
- Digital Learning Design
- Student Support, Inclusion and Engagement
- Training Delivery and Facilitation Skills
- Value Bundles
- Templates and Tools
- EDUCATION MATTERS Learning and Resources Hub