
ConferenceCall 2023 05 31: Difference between revisions

Ontolog Forum

== Agenda ==
* Part I by [[JohnSowa|John Sowa]]
# Strengths and limitations of GPT
# Large Language Models (LLMs)
# From perception to cognition
* [https://bit.ly/3N7pIBG Slides]
* Part II by [[ArunMajumdar|Arun Majumdar]]: Permion technology and applications.
* [https://bit.ly/3qdWvM5 Video Recording]


== Conference Call Information ==
* Chat room: http://webconf.soaphub.org/conf/room/ontology_summit_2023


== Participants ==
There were 40 attendees, including
* Anatol Reibold
* [[ArunMajumdar|Arun Majumdar]]
* [[BobbinTeegarden|Bobbin Teegarden]]
* [[DavidEddy|David Eddy]]
* [[GaryBergCross|Gary Berg-Cross]]
* [[JohnSowa|John Sowa]]
* Jack Park
* [[JamesOverton|James Overton]]
* [[JanetSinger|Janet Singer]]
* [[KenBaclawski|Ken Baclawski]]
* Ludmila Malahov
* Marc-Antoine Parent
* [[MarciaZeng|Marcia Zeng]]
* [[MikeBennett|Mike Bennett]]
* Phil Jackson
* [[RamSriram|Ram D. Sriram]]
* [[ToddSchneider|Todd Schneider]]


== Discussion ==


== Resources ==
* [https://bit.ly/3N7pIBG Slides]
* [https://bit.ly/3qdWvM5 Video Recording]


[[Category:Icom_conf_Conference]]
[[Category:Occurrence| ]]

Revision as of 18:53, 31 May 2023

Session: GPT
Duration: 1.5 hours
Date/Time: 31 May 2023 16:00 GMT (9:00am PDT / 12:00pm EDT / 4:00pm GMT / 5:00pm BST)
Convener: Ken Baclawski

Special Session of the Ontolog Forum on GPT

Evaluating and Reasoning with and about GPT

Large Language Models (LLMs) are derived from large volumes of texts stored on the WWW, together with more texts acquired as the systems are used. GPT and related systems use the mathematical methods of tensor calculus to process LLMs in a wide range of AI applications. But the LLM methods are purely verbal: their only connection to the world and to human ways of thinking and acting is through the texts that people produce. Although quite useful, LLMs by themselves cannot support important AI methods of perception, action, reasoning, and cognition. For more general, precise, and reliable methods, they must be integrated with a broader range of AI technology.

Agenda

  1. Strengths and limitations of GPT
  2. Large Language Models (LLMs)
  3. From perception to cognition

Conference Call Information

  • Date: Wednesday, 31 May 2023
  • Start Time: 9:00am PDT / 12:00pm EDT / 6:00pm CEST / 5:00pm BST / 1600 UTC
  • Expected Call Duration: 1.5 hours
  • Video Conference URL
    • Conference ID: 837 8041 8377
    • Passcode: 323309
  • Chat Room: http://webconf.soaphub.org/conf/room/ontology_summit_2023

The unabbreviated URLs are:
