Google’s Gemini 3 and OpenAI’s ChatGPT were tested across three creative tasks (infographic design, portrait transformation and an India map graphic) to assess speed, visual quality and accuracy in real-world scenarios. Find out which AI tool produces better graphics.

Google’s newly released Gemini 3 models have quickly risen to the top of major AI performance leaderboards, drawing strong industry reactions after an impressive early demo. One of the most notable reactions came from Salesforce chief executive Marc Benioff, who described Gemini 3 as an “extraordinary step forward” and said that after spending just two hours with the model, he did not expect to return to ChatGPT.
To understand how the model performs in practical, everyday scenarios, both Gemini 3 and ChatGPT were tested across three creative and visual tasks involving infographics, image transformation and geographical labelling. The results offer a clear look at how both AI tools handle simple prompts as well as more challenging factual graphics.
Test 1: Infographic creation
Prompt:
Create a clean infographic with five sections on the topic “Climate-friendly habits for daily life”. Provide short titles, one-line descriptions and a simple visual layout with suggested icon styles.
How they performed
Both Gemini 3 and OpenAI's ChatGPT successfully generated the required infographic. However, their approaches differed significantly in speed and visual polish.
- Gemini 3: Produced the infographic in under one minute. The icons, layout and text appeared neat, structured and visually refined.
- ChatGPT: Took four to five minutes to complete the graphic. While the content was correct, the visual quality was less polished, with weaker icons and a less cohesive layout.
Test 2: Image-to-portrait transformation
Prompt:
Take the user’s uploaded image and convert it into a classic black-and-white studio portrait in a tuxedo with a bow tie, Rembrandt lighting, a dark velvet backdrop, a vintage camera aesthetic and accurate facial features.
How they performed
This test highlighted a major difference in image-handling ability.
- Gemini 3: Produced the stylised portrait within a minute, accurately preserving the user’s facial features and achieving the requested classic black-and-white studio look.
- ChatGPT: Failed to match the user’s face, generating a portrait of an entirely different person. The final output did not meet the prompt’s requirement for accurate facial identity.
Test 3: India map with labels and descriptions
Prompt:
Create a graphic showing a simple map of India with all states and union territories labelled, along with a one-line description highlighting a unique cultural, geographical or economic fact for each region.
How they performed
Both models struggled considerably with this prompt.
- Gemini 3: Produced a graphic but made errors in labels and factual descriptions.
- ChatGPT: Also generated incorrect labels, mismatched content and inconsistent geographical placement.
This test revealed that both tools currently face limitations when required to generate complex, fact-heavy graphics.
Final results: Side-by-side comparison
| Test | Gemini 3 | ChatGPT |
| --- | --- | --- |
| Test 1: Infographic creation | Faster output, cleaner layout, stronger icons | Slower output, weaker visuals |
| Test 2: Portrait transformation | Accurate result with correct facial features | Failed to maintain identity |
| Test 3: India map graphic | Incorrect labels and factual errors | Incorrect labels and factual errors |
Verdict
Across simple creative tasks, both AI tools can perform effectively, but when visuals become more demanding, differences emerge. Gemini 3 delivered faster, more refined results in the first two tests, while both systems struggled with accuracy in the final, more complex geography challenge.
Based on these evaluations, Gemini 3 outperformed ChatGPT in two out of the three tests, though both require improvements in handling detailed factual graphics.
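For readers who want to try a comparison like this outside the chat apps, a similar prompt can also be sent to both companies’ public image APIs and the outputs compared side by side. The snippet below is a minimal sketch, not the setup used for these tests: it assumes the openai and google-genai Python SDKs, API keys set in the environment, and placeholder model names (the image-capable Gemini model in particular changes frequently).

```python
# Minimal sketch of scripting Test 1 against the public APIs instead of the chat apps.
# Assumptions: the `openai` and `google-genai` SDKs are installed, API keys are in the
# environment, and the model names below are placeholders rather than the exact models
# behind the article's tests.
from openai import OpenAI            # pip install openai
from google import genai             # pip install google-genai
from google.genai import types

PROMPT = (
    "Create a clean infographic with five sections on the topic "
    "'Climate-friendly habits for daily life'. Provide short titles, "
    "one-line descriptions and a simple visual layout with suggested icon styles."
)


def generate_with_chatgpt(prompt: str) -> None:
    """Ask OpenAI's Images API for the infographic and print the result URL."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    result = client.images.generate(model="dall-e-3", prompt=prompt, size="1024x1024")
    print("OpenAI image URL:", result.data[0].url)


def generate_with_gemini(prompt: str) -> None:
    """Ask an image-capable Gemini model for the infographic and save the bytes."""
    client = genai.Client()  # reads GEMINI_API_KEY / GOOGLE_API_KEY from the environment
    response = client.models.generate_content(
        model="gemini-2.0-flash-exp",  # placeholder; substitute the current image model
        contents=prompt,
        config=types.GenerateContentConfig(response_modalities=["TEXT", "IMAGE"]),
    )
    for part in response.candidates[0].content.parts:
        if part.inline_data:  # image parts carry raw bytes inline
            with open("gemini_infographic.png", "wb") as f:
                f.write(part.inline_data.data)
            print("Saved gemini_infographic.png")


if __name__ == "__main__":
    generate_with_chatgpt(PROMPT)
    generate_with_gemini(PROMPT)
```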
