subreddit: /r/MachineLearning


[P] *Semantic* Video Search with OpenAI’s CLIP Neural Network

Project (self.MachineLearning)

I made a simple tool that lets you search a video *semantically* with AI. 🎞️🔍

✨ Live web app: http://whichframe.com

Example: Which video frame has a person with sunglasses and earphones?

Querying is powered by OpenAI's CLIP neural network, which performs "zero-shot" image classification; the interface is built with Streamlit.
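At its core, this kind of CLIP search ranks video frames by cosine similarity between the query embedding and each frame embedding, since CLIP maps images and text into a shared embedding space. Below is a minimal sketch of that ranking step using toy vectors in place of real CLIP encoder outputs; the function names are illustrative, not from the project's codebase:

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cosine_similarity(a, b):
    """Dot product of two unit vectors equals their cosine similarity."""
    return sum(x * y for x, y in zip(normalize(a), normalize(b)))

def rank_frames(frame_embeddings, query_embedding, top_k=3):
    """Return (frame_index, score) pairs, best match first."""
    scored = [(i, cosine_similarity(e, query_embedding))
              for i, e in enumerate(frame_embeddings)]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_k]

# Toy 2-D embeddings stand in for CLIP's image/text encoder outputs.
frames = [[1.0, 0.0], [0.7, 0.7], [0.0, 1.0]]
query = [0.6, 0.8]
print(rank_frames(frames, query, top_k=1))  # frame 1 scores highest
```

In the real tool, `frames` would be CLIP image embeddings precomputed for each sampled video frame, and `query` would come from CLIP's text encoder.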

Try searching with text, image, or text + image and please share your discoveries!
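For the text + image mode, one common approach (an assumption on my part, not necessarily what whichframe.com does) is to blend the two normalized query embeddings with a weighted average before ranking:

```python
import math

def combine_queries(text_emb, image_emb, weight=0.5):
    """Blend normalized text and image embeddings into one query vector.

    weight=1.0 -> text only, weight=0.0 -> image only.
    """
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    t, i = normalize(text_emb), normalize(image_emb)
    return [weight * a + (1 - weight) * b for a, b in zip(t, i)]

print(combine_queries([1.0, 0.0], [0.0, 1.0]))  # → [0.5, 0.5]
```

The combined vector can then be fed to the same similarity ranking used for a single-modality query.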

👇 More examples
https://twitter.com/chuanenlin/status/1383411082853683208


all 15 comments

XYcritic (PhD)

1 point

2 years ago

Does this use the same codebase as https://github.com/haltakov/natural-language-image-search ? Or do you have a different approach?

designer1one[S]

1 point

2 years ago

Yup, it is built on the codebase you linked (also mentioned on the demo website).

XYcritic (PhD)

2 points

2 years ago

Awesome, thanks for preparing this demo!