*AI Summary*
The most appropriate group to review this material would be *Institutional Equity Analysts and Portfolio Managers* specializing in the Technology and Communication Services sectors. These professionals analyze earnings calls to assess financial health, evaluate management’s execution of strategic pivots (such as AI integration), and update valuation models based on Capital Expenditure (CapEx) guidance and margin trends.
Following is a high-fidelity summary of the Alphabet Q1 2025 Earnings Call from the perspective of a *Senior Equity Research Analyst.*
### *Executive Abstract*
Alphabet’s Q1 2025 results reflect a robust growth trajectory, characterized by a 12% increase in consolidated revenue ($90.2 billion) and significant margin expansion to 33.9%. The core Search business remains resilient with 10% growth, bolstered by the successful integration of AI Overviews, which now reaches 1.5 billion monthly users. Google Cloud continues its rapid ascent, posting 28% revenue growth and nearly doubling its operating margin year-over-year to 17.8%, driven by intense demand for AI infrastructure and the Gemini 2.5 model suite. While the company faces accelerating depreciation headwinds due to a projected $75 billion annual CapEx, management’s commitment to "durably re-engineering" the cost base and a new $70 billion share buyback program signal a disciplined approach to balancing aggressive AI investment with shareholder returns.
### *Q1 2025 Alphabet Earnings Analysis: Key Takeaways*
* *[05:22] AI Infrastructure and Model Dominance:* Alphabet highlighted the rollout of Gemini 2.5 Pro and Flash. The CEO noted that active users of the Gemini API and AI Studio grew over 200% since the start of the year. The infrastructure is supported by "Ironwood" (7th-gen TPU), which offers 10x the compute power of previous versions, alongside a strategic partnership with NVIDIA for Blackwell and Vera Rubin GPUs.
* *[08:52] Search Evolution and AI Overviews:* AI Overviews has officially reached 1.5 billion monthly users. Early data for "AI Mode" (a Labs experiment) indicates that queries are 2x longer than traditional search, suggesting a shift toward more complex, multi-modal, and nuanced user intent.
* *[11:04] Google Cloud Acceleration:* Cloud revenue hit $12.3 billion (+28% YoY). Operating income rose to $2.2 billion, reflecting an 8.4 percentage point margin expansion. Demand for AI training and inference currently exceeds capacity, prompting high utilization rates of the Vertex AI platform.
* *[11:45] YouTube and Subscription Milestones:* YouTube Music and Premium surpassed 125 million subscribers. Total Alphabet paid subscriptions reached 270 million. YouTube remains the #1 streaming platform in the U.S. by watch time for the second consecutive year.
* *[12:15] Waymo Operational Scaling:* Waymo is now facilitating over 250,000 paid passenger trips per week (a 5x YoY increase). Management emphasized a partnership-heavy model (e.g., Uber, Moove) to scale autonomous ride-hailing into new markets like Atlanta, Miami, and Washington, D.C.
* *[13:58] Ad Revenue Vertical Performance:* Search and Other revenue (+10%) was driven primarily by Financial Services (specifically Insurance) and Retail. YouTube ad revenue (+10%) saw a balanced contribution from direct response and brand advertising.
* *[15:58] Financial Performance and Capital Allocation* (a back-of-envelope check of these figures follows this list):
* *Revenue:* $90.2B (+12% YoY; +14% Constant Currency).
* *Operating Margin:* 33.9% (up from 31.6% YoY).
* *Capital Returns:* Announced a new $70 billion share repurchase authorization and a 5% increase in the quarterly dividend.
* *[19:10] CapEx and Depreciation Outlook:* Alphabet maintained its 2025 CapEx guidance of approximately $75 billion. CFO Anat Ashkenazi warned that depreciation growth—which was 31% in Q1—will accelerate throughout the year as new technical infrastructure is placed into service.
* *[22:38] Efficiency and Cost Re-engineering:* Management continues to focus on "moderating headcount growth" and optimizing real estate. Internal AI integration is reportedly improving productivity; specifically, 30% of new code checked in at Google is now AI-suggested.
* *[25:12] M&A Activity:* The company confirmed its intent to acquire Wiz, a cloud security platform, to bolster its multi-cloud security offerings and cybersecurity investigative workflows.
* *[36:20] Competitive Positioning (Q&A):* In response to analyst queries regarding the "Gemini vs. ChatGPT" DAU gap, CEO Pichai emphasized that Alphabet's primary AI touchpoint is through its 1.5 billion AI Overview users, arguing that embedding AI into existing high-traffic products is their primary driver for mass adoption.
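As flagged in the Financial Performance item above, the headline figures hang together arithmetically. A minimal back-of-envelope check using only the numbers reported in this summary; derived values are approximations, not disclosed figures:

```python
# Back-of-envelope check of the headline Q1 2025 figures cited above.
# Inputs are the reported numbers; everything derived is an approximation.

revenue_q1_2025 = 90.2          # $B, consolidated revenue (+12% YoY)
yoy_growth = 0.12
operating_margin = 0.339        # 33.9% reported
operating_margin_prior = 0.316  # 31.6% a year earlier

# Implied prior-year revenue and current operating income
revenue_q1_2024 = revenue_q1_2025 / (1 + yoy_growth)   # ~$80.5B implied
operating_income = revenue_q1_2025 * operating_margin  # ~$30.6B (reported: $31B)

# Consolidated margin expansion in percentage points
margin_expansion_pts = (operating_margin - operating_margin_prior) * 100  # ~2.3 pts

# Cloud segment: margin expansion from 9.4% to 17.8% on $12.3B of revenue
cloud_margin_expansion = (0.178 - 0.094) * 100  # ~8.4 pts
cloud_operating_income = 12.3 * 0.178           # ~$2.2B

print(f"Implied Q1'24 revenue: ${revenue_q1_2024:.1f}B")
print(f"Operating income: ~${operating_income:.1f}B, margin expansion: {margin_expansion_pts:.1f} pts")
print(f"Cloud: ~${cloud_operating_income:.1f}B operating income, +{cloud_margin_expansion:.1f} pts margin")
```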
AI-generated summary created with gemini-3-flash-preview for free via RocketRecap-dot-com. (Input: 23,049 tokens, Output: 1,107 tokens, Est. cost: $0.0148).
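As an aside, the cost estimate in the attribution line follows directly from the token counts. A minimal sketch assuming hypothetical per-million-token prices; the actual rates used by the service are not stated here:

```python
# Rough reconstruction of the per-call cost estimate from the token counts above.
# The per-million-token prices are assumptions for illustration only,
# not the service's actual pricing.

input_tokens = 23_049
output_tokens = 1_107

price_per_m_input = 0.50   # $ per 1M input tokens (assumed)
price_per_m_output = 3.00  # $ per 1M output tokens (assumed)

cost = (input_tokens * price_per_m_input + output_tokens * price_per_m_output) / 1_000_000
print(f"Estimated cost: ${cost:.4f}")  # prints ~$0.0148 under these assumed rates
```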
Below, I will provide input for an example video (comprising a title, description, and transcript, in that order) and the corresponding abstract and summary I expect. Afterward, I will provide a new transcript that I want summarized in the same format.
**Please give an abstract of the transcript and then summarize the transcript in a self-contained bullet list format.** Include starting timestamps, important details, and key takeaways.
Example Input:
Fluidigm Polaris Part 2- illuminator and camera
mikeselectricstuff
131K subscribers
5,857 views Aug 26, 2024
Fluidigm Polaris part 1 : • Fluidigm Polaris (Part 1) - Biotech g...
Ebay listings: https://www.ebay.co.uk/usr/mikeselect...
Merch https://mikeselectricstuff.creator-sp...
40 Comments
@robertwatsonbath
6 hours ago
Thanks Mike. Ooof! - with the level of bodgery going on around 15:48 I think shame would have made me do a board re spin, out of my own pocket if I had to.
1
Reply
@Muonium1
9 hours ago
The green LED looks different from the others and uses phosphor conversion because of the "green gap" problem where green InGaN emitters suffer efficiency droop at high currents. Phosphide based emitters don't start becoming efficient until around 600nm so also can't be used for high power green emitters. See the paper and plot by Matthias Auf der Maur in his 2015 paper on alloy fluctuations in InGaN as the cause of reduced external quantum efficiency at longer (green) wavelengths.
4
Reply
1 reply
@tafsirnahian669
10 hours ago (edited)
Can this be used as an astrophotography camera?
Reply
mikeselectricstuff
·
1 reply
@mikeselectricstuff
6 hours ago
Yes, but may need a shutter to avoid light during readout
Reply
@2010craggy
11 hours ago
Narrowband filters we use in Astronomy (Astrophotography) are sided- they work best passing light in one direction so I guess the arrows on the filter frames indicate which way round to install them in the filter wheel.
1
Reply
@vitukz
12 hours ago
A mate with Channel @extractions&ire could use it
2
Reply
@RobertGallop
19 hours ago
That LED module says it can go up to 28 amps!!! 21 amps for 100%. You should see what it does at 20 amps!
Reply
@Prophes0r
19 hours ago
I had an "Oh SHIT!" moment when I realized that the weird trapezoidal shape of that light guide was for keystone correction of the light source.
Very clever.
6
Reply
@OneBiOzZ
20 hours ago
given the cost of the CCD you think they could have run another PCB for it
9
Reply
@tekvax01
21 hours ago
$20 thousand dollars per minute of run time!
1
Reply
@tekvax01
22 hours ago
"We spared no expense!" John Hammond Jurassic Park.
*(that's why this thing costs the same as a 50-seat Greyhound Bus coach!)
Reply
@florianf4257
22 hours ago
The smearing on the image could be due to the fact that you don't use a shutter, so you see brighter stripes under bright areas of the image as you still iluminate these pixels while the sensor data ist shifted out towards the top. I experienced this effect back at university with a LN-Cooled CCD for Spectroscopy. The stripes disapeared as soon as you used the shutter instead of disabling it in the open position (but fokussing at 100ms integration time and continuous readout with a focal plane shutter isn't much fun).
12
Reply
mikeselectricstuff
·
1 reply
@mikeselectricstuff
12 hours ago
I didn't think of that, but makes sense
2
Reply
@douro20
22 hours ago (edited)
The red LED reminds me of one from Roithner Lasertechnik. I have a Symbol 2D scanner which uses two very bright LEDs from that company, one red and one red-orange. The red-orange is behind a lens which focuses it into an extremely narrow beam.
1
Reply
@RicoElectrico
23 hours ago
PFG is Pulse Flush Gate according to the datasheet.
Reply
@dcallan812
23 hours ago
Very interesting. 2x
Reply
@littleboot_
1 day ago
Cool interesting device
Reply
@dav1dbone
1 day ago
I've stripped large projectors, looks similar, wonder if some of those castings are a magnesium alloy?
Reply
@kevywevvy8833
1 day ago
ironic that some of those Phlatlight modules are used in some of the cheapest disco lights.
1
Reply
1 reply
@bill6255
1 day ago
Great vid - gets right into subject in title, its packed with information, wraps up quickly. Should get a YT award! imho
3
Reply
@JAKOB1977
1 day ago (edited)
The whole sensor module incl. a 5 grand 50mpix sensor for 49 £.. highest bid atm
Though also a limited CCD sensor, but for the right buyer its a steal at these relative low sums.
Architecture: Full Frame CCD (Square Pixels)
Total Number of Pixels: 8304 (H) × 6220 (V) = 51.6 Mp
Number of Effective Pixels: 8208 (H) × 6164 (V) = 50.5 Mp
Number of Active Pixels: 8176 (H) × 6132 (V) = 50.1 Mp
Pixel Size: 6.0 µm (H) × 6.0 µm (V)
Active Image Size: 49.1 mm (H) × 36.8 mm (V), 61.3 mm (Diagonal), 645 1.1x Optical Format
Aspect Ratio: 4:3
Horizontal Outputs: 4
Saturation Signal: 40.3 ke−
Output Sensitivity: 31 µV/e−
Quantum Efficiency (Peak): KAF−50100−CAA 22%, 22%, 16% (R, G, B); KAF−50100−AAA 25%; KAF−50100−ABA (with Lens) 62%
Read Noise (f = 18 MHz): 12.5 e−
Dark Signal (T = 60°C): 42 pA/cm²
Dark Current Doubling Temperature: 5.7°C
Dynamic Range (f = 18 MHz): 70.2 dB
Estimated Linear Dynamic Range (f = 18 MHz): 69.3 dB
Charge Transfer Efficiency: Horizontal 0.999995, Vertical 0.999999
Blooming Protection (4 ms Exposure Time): 800X Saturation Exposure
Maximum Data Rate: 18 MHz
Package: Ceramic PGA
Cover Glass: MAR Coated, 2 Sides, or Clear Glass
Features: TRUESENSE Transparent Gate Electrode for High Sensitivity; Ultra-High Resolution; Broad Dynamic Range; Low Noise Architecture; Large Active Imaging Area
Applications: Digitization, Mapping/Aerial, Photography, Scientific
Thx for the tear down Mike, always a joy
Reply
@martinalooksatthings
1 day ago
15:49 that is some great bodging on of caps, they really didn't want to respin that PCB huh
8
Reply
@RhythmGamer
1 day ago
Was depressed today and then a new mike video dropped and now I’m genuinely happy to get my tear down fix
1
Reply
@dine9093
1 day ago (edited)
Did you transfrom into Mr Blobby for a moment there?
2
Reply
@NickNorton
1 day ago
Thanks Mike. Your videos are always interesting.
5
Reply
@KeritechElectronics
1 day ago
Heavy optics indeed... Spare no expense, cost no object. Splendid build quality. The CCD is a thing of beauty!
1
Reply
@YSoreil
1 day ago
The pricing on that sensor is about right, I looked in to these many years ago when they were still in production since it's the only large sensor you could actually buy. Really cool to see one in the wild.
2
Reply
@snik2pl
1 day ago
That leds look like from led projector
Reply
@vincei4252
1 day ago
TDI = Time Domain Integration ?
1
Reply
@wolpumba4099
1 day ago (edited)
Maybe the camera should not be illuminated during readout.
From the datasheet of the sensor (Onsemi): saturation 40300 electrons, read noise 12.5 electrons per pixel @ 18MHz (quite bad). quantum efficiency 62% (if it has micro lenses), frame rate 1 Hz. lateral overflow drain to prevent blooming protects against 800x (factor increases linearly with exposure time) saturation exposure (32e6 electrons per pixel at 4ms exposure time), microlens has +/- 20 degree acceptance angle
i guess it would be good for astrophotography
4
Reply
@txm100
1 day ago (edited)
Babe wake up a new mikeselectricstuff has dropped!
9
Reply
@vincei4252
1 day ago
That looks like a finger-lakes filter wheel, however, for astronomy they'd never use such a large stepper.
1
Reply
@MRooodddvvv
1 day ago
yaaaaay ! more overcomplicated optical stuff !
4
Reply
1 reply
@NoPegs
1 day ago
He lives!
11
Reply
1 reply
Transcript
0:00
so I've stripped all the bits of the
0:01
optical system so basically we've got
0:03
the uh the camera
0:05
itself which is mounted on this uh very
0:09
complex
0:10
adjustment thing which obviously to set
0:13
you the various tilt and uh alignment
0:15
stuff then there's two of these massive
0:18
lenses I've taken one of these apart I
0:20
think there's something like about eight
0:22
or nine Optical elements in here these
0:25
don't seem to do a great deal in terms
0:26
of electr magnification they're obiously
0:28
just about getting the image to where it
0:29
uh where it needs to be just so that
0:33
goes like that then this Optical block I
0:36
originally thought this was made of some
0:37
s crazy heavy material but it's just
0:39
really the sum of all these Optical bits
0:41
are just ridiculously heavy those lenses
0:43
are about 4 kilos each and then there's
0:45
this very heavy very solid um piece that
0:47
goes in the middle and this is so this
0:49
is the filter wheel assembly with a
0:51
hilariously oversized steper
0:53
motor driving this wheel with these very
0:57
large narrow band filters so we've got
1:00
various different shades of uh
1:03
filters there five Al together that
1:06
one's actually just showing up a silver
1:07
that's actually a a red but fairly low
1:10
transmission orangey red blue green
1:15
there's an excess cover on this side so
1:16
the filters can be accessed and changed
1:19
without taking anything else apart even
1:21
this is like ridiculous it's like solid
1:23
aluminium this is just basically a cover
1:25
the actual wavelengths of these are um
1:27
488 525 570 630 and 700 NM not sure what
1:32
the suffix on that perhaps that's the uh
1:34
the width of the spectral line say these
1:37
are very narrow band filters most of
1:39
them are you very little light through
1:41
so it's still very tight narrow band to
1:43
match the um fluoresence of the dies
1:45
they're using in the biochemical process
1:48
and obviously to reject the light that's
1:49
being fired at it from that Illuminator
1:51
box and then there's a there's a second
1:53
one of these lenses then the actual sort
1:55
of samples below that so uh very serious
1:58
amount of very uh chunky heavy Optics
2:01
okay let's take a look at this light
2:02
source made by company Lumen Dynamics
2:04
who are now part of
2:06
excelitas self-contained unit power
2:08
connector USB and this which one of the
2:11
Cable Bundle said was a TTL interface
2:14
USB wasn't used in uh the fluid
2:17
application output here and I think this
2:19
is an input for um light feedback I
2:21
don't if it's regulated or just a measur
2:23
measurement facility and the uh fiber
2:27
assembly
2:29
Square Inlet there and then there's two
2:32
outputs which have uh lens assemblies
2:35
and this small one which goes back into
2:37
that small Port just Loops out of here
2:40
straight back in So on this side we've
2:42
got the electronics which look pretty
2:44
straightforward we've got a bit of power
2:45
supply stuff over here and we've got
2:48
separate drivers for each wavelength now
2:50
interesting this is clearly been very
2:52
specifically made for this application
2:54
you I was half expecting like say some
2:56
generic drivers that could be used for a
2:58
number of different things but actually
3:00
literally specified the exact wavelength
3:02
on the PCB there is provision here for
3:04
385 NM which isn't populated but this is
3:07
clearly been designed very specifically
3:09
so these four drivers look the same but
3:10
then there's two higher power ones for
3:12
575 and
3:14
520 a slightly bigger heat sink on this
3:16
575 section there a p 24 which is
3:20
providing USB interface USB isolator the
3:23
USB interface just presents as a comport
3:26
I did have a quick look but I didn't
3:27
actually get anything sensible um I did
3:29
dump the Pi code out and there's a few
3:31
you a few sort of commands that you
3:32
could see in text but I didn't actually
3:34
manage to get it working properly I
3:36
found some software for related version
3:38
but it didn't seem to want to talk to it
3:39
but um I say that wasn't used for the
3:41
original application it might be quite
3:42
interesting to get try and get the Run
3:44
hours count out of it and the TTL
3:46
interface looks fairly straightforward
3:48
we've got positions for six opto
3:50
isolators but only five five are
3:52
installed so that corresponds with the
3:54
unused thing so I think this hopefully
3:56
should be as simple as just providing a
3:57
ttrl signal for each color to uh enable
4:00
it a big heat sink here which is there I
4:03
think there's like a big S of metal
4:04
plate through the middle of this that
4:05
all the leads are mounted on the other
4:07
side so this is heat sinking it with a
4:09
air flow from a uh just a fan in here
4:13
obviously don't have the air flow
4:14
anywhere near the Optics so conduction
4:17
cool through to this plate that's then
4:18
uh air cooled got some pots which are
4:21
presumably power
4:22
adjustments okay let's take a look at
4:24
the other side which is uh much more
4:27
interesting see we've got some uh very
4:31
uh neatly Twisted cable assemblies there
4:35
a bunch of leads so we've got one here
4:37
475 up here 430 NM 630 575 and 520
4:44
filters and dcro mirrors a quick way to
4:48
see what's white is if we just shine
4:49
some white light through
4:51
here not sure how it is is to see on the
4:54
camera but shining white light we do
4:55
actually get a bit of red a bit of blue
4:57
some yellow here so the obstacle path
5:00
575 it goes sort of here bounces off
5:03
this mirror and goes out the 520 goes
5:07
sort of down here across here and up
5:09
there 630 goes basically straight
5:13
through
5:15
430 goes across there down there along
5:17
there and the 475 goes down here and
5:20
left this is the light sensing thing
5:22
think here there's just a um I think
5:24
there a photo diode or other sensor
5:26
haven't actually taken that off and
5:28
everything's fixed down to this chunk of
5:31
aluminium which acts as the heat
5:32
spreader that then conducts the heat to
5:33
the back side for the heat
5:35
sink and the actual lead packages all
5:38
look fairly similar except for this one
5:41
on the 575 which looks quite a bit more
5:44
substantial big spay
5:46
Terminals and the interface for this
5:48
turned out to be extremely simple it's
5:50
literally a 5V TTL level to enable each
5:54
color doesn't seem to be any tensity
5:56
control but there are some additional
5:58
pins on that connector that weren't used
5:59
in the through time thing so maybe
6:01
there's some extra lines that control
6:02
that I couldn't find any data on this uh
6:05
unit and the um their current product
6:07
range is quite significantly different
6:09
so we've got the uh blue these
6:13
might may well be saturating the camera
6:16
so they might look a bit weird so that's
6:17
the 430
6:18
blue the 575
6:24
yellow uh
6:26
475 light blue
6:29
the uh 520
6:31
green and the uh 630 red now one
6:36
interesting thing I noticed for the
6:39
575 it's actually it's actually using a
6:42
white lead and then filtering it rather
6:44
than using all the other ones are using
6:46
leads which are the fundamental colors
6:47
but uh this is actually doing white and
6:50
it's a combination of this filter and
6:52
the dichroic mirrors that are turning to
6:55
Yellow if we take the filter out and a
6:57
lot of the a lot of the um blue content
7:00
is going this way the red is going
7:02
straight through these two mirrors so
7:05
this is clearly not reflecting much of
7:08
that so we end up with the yellow coming
7:10
out of uh out of there which is a fairly
7:14
light yellow color which you don't
7:16
really see from high intensity leads so
7:19
that's clearly why they've used the
7:20
white to uh do this power consumption of
7:23
the white is pretty high so going up to
7:25
about 2 and 1 half amps on that color
7:27
whereas most of the other colors are
7:28
only drawing half an amp or so at 24
7:30
volts the uh the green is up to about
7:32
1.2 but say this thing is uh much
7:35
brighter and if you actually run all the
7:38
colors at the same time you get a fairly
7:41
reasonable um looking white coming out
7:43
of it and one thing you might just be
7:45
out to notice is there is some sort
7:46
color banding around here that's not
7:49
getting uh everything s completely
7:51
concentric and I think that's where this
7:53
fiber optic thing comes
7:58
in I'll
8:00
get a couple of Fairly accurately shaped
8:04
very sort of uniform color and looking
8:06
at What's um inside here we've basically
8:09
just got this Square Rod so this is
8:12
clearly yeah the lights just bouncing
8:13
off all the all the various sides to um
8:16
get a nice uniform illumination uh this
8:19
back bit looks like it's all potted so
8:21
nothing I really do to get in there I
8:24
think this is fiber so I have come
8:26
across um cables like this which are
8:27
liquid fill but just looking through the
8:30
end of this it's probably a bit hard to
8:31
see it does look like there fiber ends
8:34
going going on there and so there's this
8:36
feedback thing which is just obviously
8:39
compensating for the any light losses
8:41
through here to get an accurate
8:43
representation of uh the light that's
8:45
been launched out of these two
8:47
fibers and you see uh
8:49
these have got this sort of trapezium
8:54
shape light guides again it's like a
8:56
sort of acrylic or glass light guide
9:00
guess projected just to make the right
9:03
rectangular
9:04
shape and look at this Center assembly
9:07
um the light output doesn't uh change
9:10
whether you feed this in or not so it's
9:11
clear not doing any internal Clos Loop
9:14
control obviously there may well be some
9:16
facility for it to do that but it's not
9:17
being used in this
9:19
application and so this output just
9:21
produces a voltage on the uh outle
9:24
connector proportional to the amount of
9:26
light that's present so there's a little
9:28
diffuser in the back there
9:30
and then there's just some kind of uh
9:33
Optical sensor looks like a
9:35
chip looking at the lead it's a very
9:37
small package on the PCB with this lens
9:40
assembly over the top and these look
9:43
like they're actually on a copper
9:44
Metalized PCB for maximum thermal
9:47
performance and yeah it's a very small
9:49
package looks like it's a ceramic
9:51
package and there's a thermister there
9:53
for temperature monitoring this is the
9:56
475 blue one this is the 520 need to
9:59
Green which is uh rather different OB
10:02
it's a much bigger D with lots of bond
10:04
wise but also this looks like it's using
10:05
a phosphor if I shine a blue light at it
10:08
lights up green so this is actually a
10:10
phosphor conversion green lead which
10:12
I've I've come across before they want
10:15
that specific wavelength so they may be
10:17
easier to tune a phosphor than tune the
10:20
um semiconductor material to get the uh
10:23
right right wavelength from the lead
10:24
directly uh red 630 similar size to the
10:28
blue one or does seem to have a uh a
10:31
lens on top of it there is a sort of red
10:33
coloring to
10:35
the die but that doesn't appear to be
10:38
fluorescent as far as I can
10:39
tell and the white one again a little
10:41
bit different sort of much higher
10:43
current
10:46
connectors a makeer name on that
10:48
connector flot light not sure if that's
10:52
the connector or the lead
10:54
itself and obviously with the phosphor
10:56
and I'd imagine that phosphor may well
10:58
be tuned to get the maximum to the uh 5
11:01
cenm and actually this white one looks
11:04
like a St fairly standard product I just
11:06
found it in Mouse made by luminous
11:09
devices in fact actually I think all
11:11
these are based on various luminous
11:13
devices modules and they're you take
11:17
looks like they taking the nearest
11:18
wavelength and then just using these
11:19
filters to clean it up to get a precise
11:22
uh spectral line out of it so quite a
11:25
nice neat and um extreme
11:30
bright light source uh sure I've got any
11:33
particular use for it so I think this
11:35
might end up on
11:36
eBay but uh very pretty to look out and
11:40
without the uh risk of burning your eyes
11:43
out like you do with lasers so I thought
11:45
it would be interesting to try and
11:46
figure out the runtime of this things
11:48
like this we usually keep some sort
11:49
record of runtime cuz leads degrade over
11:51
time I couldn't get any software to work
11:52
through the USB face but then had a
11:54
thought probably going to be writing the
11:55
runtime periodically to the e s prom so
11:58
I just just scope up that and noticed it
12:00
was doing right every 5 minutes so I
12:02
just ran it for a while periodically
12:04
reading the E squ I just held the pick
12:05
in in reset and um put clip over to read
12:07
the square prom and found it was writing
12:10
one location per color every 5 minutes
12:12
so if one color was on it would write
12:14
that location every 5 minutes and just
12:16
increment it by one so after doing a few
12:18
tests with different colors of different
12:19
time periods it looked extremely
12:21
straightforward it's like a four bite
12:22
count for each color looking at the
12:24
original data that was in it all the
12:26
colors apart from Green were reading
12:28
zero and the green was reading four
12:30
indicating a total 20 minutes run time
12:32
ever if it was turned on run for a short
12:34
time then turned off that might not have
12:36
been counted but even so indicates this
12:37
thing wasn't used a great deal the whole
12:40
s process of doing a run can be several
12:42
hours but it'll only be doing probably
12:43
the Imaging at the end of that so you
12:46
wouldn't expect to be running for a long
12:47
time but say a single color for 20
12:50
minutes over its whole lifetime does
12:52
seem a little bit on the low side okay
12:55
let's look at the camera un fortunately
12:57
I managed to not record any sound when I
12:58
did this it's also a couple of months
13:00
ago so there's going to be a few details
13:02
that I've forgotten so I'm just going to
13:04
dub this over the original footage so um
13:07
take the lid off see this massive great
13:10
heat sink so this is a pel cool camera
13:12
we've got this blower fan producing a
13:14
fair amount of air flow through
13:16
it the connector here there's the ccds
13:19
mounted on the board on the
13:24
right this unplugs so we've got a bit of
13:27
power supply stuff on here
13:29
USB interface I think that's the Cyprus
13:32
microcontroller High speeded USB
13:34
interface there's a zyink spon fpga some
13:40
RAM and there's a couple of ATD
13:42
converters can't quite read what those
13:45
those are but anal
13:47
devices um little bit of bodgery around
13:51
here extra decoupling obviously they
13:53
have having some noise issues this is
13:55
around the ram chip quite a lot of extra
13:57
capacitors been added there
13:59
uh there's a couple of amplifiers prior
14:01
to the HD converter buffers or Andor
14:05
amplifiers taking the CCD
14:08
signal um bit more power spy stuff here
14:11
this is probably all to do with
14:12
generating the various CCD bias voltages
14:14
they uh need quite a lot of exotic
14:18
voltages next board down is just a
14:20
shield and an interconnect
14:24
boardly shielding the power supply stuff
14:26
from some the more sensitive an log
14:28
stuff
14:31
and this is the bottom board which is
14:32
just all power supply
14:34
stuff as you can see tons of capacitors
14:37
or Transformer in
14:42
there and this is the CCD which is a uh
14:47
very impressive thing this is a kf50 100
14:50
originally by true sense then codec
14:53
there ON
14:54
Semiconductor it's 50 megapixels uh the
14:58
only price I could find was this one
15:00
5,000 bucks and the architecture you can
15:03
see there actually two separate halves
15:04
which explains the Dual AZ converters
15:06
and two amplifiers it's literally split
15:08
down the middle and duplicated so it's
15:10
outputting two streams in parallel just
15:13
to keep the bandwidth sensible and it's
15:15
got this amazing um diffraction effects
15:18
it's got micro lenses over the pixel so
15:20
there's there's a bit more Optics going
15:22
on than on a normal
15:25
sensor few more bodges on the CCD board
15:28
including this wire which isn't really
15:29
tacked down very well which is a bit uh
15:32
bit of a mess quite a few bits around
15:34
this board where they've uh tacked
15:36
various bits on which is not super
15:38
impressive looks like CCD drivers on the
15:40
left with those 3 ohm um damping
15:43
resistors on the
15:47
output get a few more little bodges
15:50
around here some of
15:52
the and there's this separator the
15:54
silica gel to keep the moisture down but
15:56
there's this separator that actually
15:58
appears to be cut from piece of
15:59
antistatic
16:04
bag and this sort of thermal block on
16:06
top of this stack of three pel Cola
16:12
modules so as with any Stacks they get
16:16
um larger as they go back towards the
16:18
heat sink because each P's got to not
16:20
only take the heat from the previous but
16:21
also the waste heat which is quite
16:27
significant you see a little temperature
16:29
sensor here that copper block which
16:32
makes contact with the back of the
16:37
CCD and this's the back of the
16:40
pelas this then contacts the heat sink
16:44
on the uh rear there a few thermal pads
16:46
as well for some of the other power
16:47
components on this
16:51
PCB okay I've connected this uh camera
16:54
up I found some drivers on the disc that
16:56
seem to work under Windows 7 couldn't
16:58
get to install under Windows 11 though
17:01
um in the absence of any sort of lens or
17:03
being bothered to the proper amount I've
17:04
just put some f over it and put a little
17:06
pin in there to make a pinhole lens and
17:08
software gives a few options I'm not
17:11
entirely sure what all these are there's
17:12
obviously a clock frequency 22 MHz low
17:15
gain and with PFG no idea what that is
17:19
something something game programmable
17:20
Something game perhaps ver exposure
17:23
types I think focus is just like a
17:25
continuous grab until you tell it to
17:27
stop not entirely sure all these options
17:30
are obviously exposure time uh triggers
17:33
there ex external hardware trigger inut
17:35
you just trigger using a um thing on
17:37
screen so the resolution is 8176 by
17:40
6132 and you can actually bin those
17:42
where you combine multiple pixels to get
17:46
increased gain at the expense of lower
17:48
resolution down this is a 10sec exposure
17:51
obviously of the pin hole it's very uh
17:53
intensitive so we just stand still now
17:56
downloading it there's the uh exposure
17:59
so when it's
18:01
um there's a little status thing down
18:03
here so that tells you the um exposure
18:07
[Applause]
18:09
time it's this is just it
18:15
downloading um it is quite I'm seeing
18:18
quite a lot like smearing I think that I
18:20
don't know whether that's just due to
18:21
pixels overloading or something else I
18:24
mean yeah it's not it's not um out of
18:26
the question that there's something not
18:27
totally right about this camera
18:28
certainly was bodge wise on there um I
18:31
don't I'd imagine a camera like this
18:32
it's got a fairly narrow range of
18:34
intensities that it's happy with I'm not
18:36
going to spend a great deal of time on
18:38
this if you're interested in this camera
18:40
maybe for astronomy or something and
18:42
happy to sort of take the risk of it may
18:44
not be uh perfect I'll um I think I'll
18:47
stick this on eBay along with the
18:48
Illuminator I'll put a link down in the
18:50
description to the listing take your
18:52
chances to grab a bargain so for example
18:54
here we see this vertical streaking so
18:56
I'm not sure how normal that is this is
18:58
on fairly bright scene looking out the
19:02
window if I cut the exposure time down
19:04
on that it's now 1 second
19:07
exposure again most of the image
19:09
disappears again this is looks like it's
19:11
possibly over still overloading here go
19:14
that go down to say say quarter a
19:16
second so again I think there might be
19:19
some Auto gain control going on here um
19:21
this is with the PFG option let's try
19:23
turning that off and see what
19:25
happens so I'm not sure this is actually
19:27
more streaking or which just it's
19:29
cranked up the gain all the dis display
19:31
gray scale to show what um you know the
19:33
range of things that it's captured
19:36
there's one of one of 12 things in the
19:38
software there's um you can see of you
19:40
can't seem to read out the temperature
19:42
of the pelta cooler but you can set the
19:44
temperature and if you said it's a
19:46
different temperature you see the power
19:48
consumption jump up running the cooler
19:50
to get the temperature you requested but
19:52
I can't see anything anywhere that tells
19:54
you whether the cool is at the at the
19:56
temperature other than the power
19:57
consumption going down and there's no
19:59
temperature read out
20:03
here and just some yeah this is just
20:05
sort of very basic software I'm sure
20:07
there's like an API for more
20:09
sophisticated
20:10
applications but so if you know anything
20:12
more about these cameras please um stick
20:14
in the
20:15
comments um incidentally when I was
20:18
editing I didn't notice there was a bent
20:19
pin on the um CCD but I did fix that
20:22
before doing these tests and also
20:24
reactivated the um silica gel desicant
20:26
cuz I noticed it was uh I was getting
20:28
bit of condensation on the window but um
20:31
yeah so a couple of uh interesting but
20:34
maybe not particularly uh useful pieces
20:37
of Kit except for someone that's got a
20:38
very specific use so um I'll stick a
20:42
I'll stick these on eBay put a link in
20:44
the description and say hopefully
20:45
someone could actually make some uh good
20:47
use of these things
Example Output:
**Abstract:**
This video presents Part 2 of a teardown focusing on the optical components of a Fluidigm Polaris biotechnology instrument, specifically the multi-wavelength illuminator and the high-resolution CCD camera.
The Lumen Dynamics illuminator unit is examined in detail, revealing its construction using multiple high-power LEDs (430nm, 475nm, 520nm, 575nm, 630nm) combined via dichroic mirrors and filters. A square fiber optic rod is used to homogenize the light. A notable finding is the use of a phosphor-converted white LED filtered to achieve the 575nm output. The unit features simple TTL activation for each color, conduction cooling, and internal homogenization optics. Analysis of its EEPROM suggests extremely low operational runtime.
The camera module teardown showcases a 50 Megapixel ON Semiconductor KAF-50100 CCD sensor with micro-lenses, cooled by a multi-stage Peltier stack. The control electronics include an FPGA and a USB interface. Significant post-manufacturing modifications ("bodges") are observed on the camera's circuit boards. Basic functional testing using vendor software and a pinhole lens confirms image capture but reveals prominent vertical streaking artifacts, the cause of which remains uncertain (potential overload, readout artifact, or fault).
**Exploring the Fluidigm Polaris: A Detailed Look at its High-End Optics and Camera System**
* **0:00 High-End Optics:** The system utilizes heavy, high-quality lenses and mirrors for precise imaging, weighing around 4 kilos each.
* **0:49 Narrow Band Filters:** A filter wheel with five narrow band filters (488, 525, 570, 630, and 700 nm) ensures accurate fluorescence detection and rejection of excitation light.
* **2:01 Customizable Illumination:** The Lumen Dynamics light source offers five individually controllable LED wavelengths (430, 475, 520, 575, 630 nm) with varying power outputs. The 575nm yellow LED is uniquely achieved using a white LED with filtering.
* **3:45 TTL Control:** The light source is controlled via a simple TTL interface, enabling easy on/off switching for each LED color.
* **12:55 Sophisticated Camera:** The system includes a 50-megapixel ON Semiconductor KAF-50100 CCD camera (originally a Kodak/TrueSense part) with a Peltier cooling system for reduced noise.
* **14:54 High-Speed Data Transfer:** The camera features dual analog-to-digital converters to manage the high data throughput of the 50-megapixel sensor, which is effectively two 25-megapixel sensors operating in parallel (a rough consistency check of these figures follows this list).
* **18:11 Possible Issues:** The video creator noted some potential issues with the camera, including image smearing.
* **18:11 Limited Dynamic Range:** The camera's sensor has a limited dynamic range, making it potentially challenging to capture scenes with a wide range of brightness levels.
* **11:45 Low Runtime:** Internal data suggests the system has seen minimal usage, with only 20 minutes of recorded runtime for the green LED.
* **20:38 Availability on eBay:** Both the illuminator and camera are expected to be listed for sale on eBay.
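As flagged in the High-Speed Data Transfer item, the sensor figures quoted in the example are mutually consistent. A minimal back-of-envelope sketch using the datasheet numbers cited earlier; the number of active outputs and the neglect of readout overheads are assumptions, so the readout time is only a rough bound:

```python
import math

# Rough consistency checks on the KAF-50100 figures quoted in the example above.
# Back-of-envelope numbers only, not datasheet-verified derivations.

saturation_e = 40_300       # full-well capacity, electrons
read_noise_e = 12.5         # read noise, electrons (at 18 MHz)
total_pixels = 8304 * 6220  # ~51.6 Mp including dark/buffer pixels
pixel_clock_hz = 18e6       # maximum data rate per output
outputs_used = 2            # the teardown describes two parallel output streams (assumed here)

# Dynamic range from full well and read noise
dynamic_range_db = 20 * math.log10(saturation_e / read_noise_e)  # ~70.2 dB, matching the spec

# Minimum readout time, ignoring line/frame overheads; with all 4 outputs this
# drops to ~0.7 s, consistent with the ~1 fps figure mentioned in the comments
# once overheads are added.
readout_s = total_pixels / (outputs_used * pixel_clock_hz)  # ~1.4 s

print(f"Dynamic range: {dynamic_range_db:.1f} dB")
print(f"Minimum readout time with {outputs_used} outputs: {readout_s:.2f} s")
```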
Here is the real transcript. What would be a good group of people to review this topic? Please provide a summary like they would:
2025 Q1 Earnings Call
April 24, 2025 1:30 PM | US/Pacific
Transcript
This transcript is provided for the convenience of investors only, for a full recording please see the Q1 2025 Earnings Call webcast.
Operator: Welcome, everyone. Thank you for standing by for the Alphabet First Quarter 2025 Earnings conference call.
At this time, all participants are in a listen-only mode. After the speaker presentations, there will be a question-and-answer session. To ask a question during the session, you will need to press *1 on your telephone.
I would now like to hand the conference over to your speaker today, Jim Friedland, Senior Director of Investor Relations. Please go ahead.
Jim Friedland, Senior Director, Investor Relations: Thank you. Good afternoon, everyone, and welcome to Alphabet’s First Quarter 2025 Earnings Conference Call. With us today are Sundar Pichai, Philipp Schindler and Anat Ashkenazi.
Now, I’ll quickly cover the Safe Harbor.
Some of the statements that we make today regarding our business, operations, and financial performance may be considered forward-looking. Such statements are based on current expectations and assumptions that are subject to a number of risks and uncertainties.
Actual results could differ materially. Please refer to our Forms 10-K and 10-Q, including the risk factors. We undertake no obligation to update any forward-looking statement.
During this call, we will present both GAAP and non-GAAP financial measures. A reconciliation of non-GAAP to GAAP measures is included in today’s earnings press release, which is distributed and available to the public through our Investor Relations website located at abc.xyz/investor.
Our comments will be on year-over-year comparisons unless we state otherwise.
And now, I’ll turn the call over to Sundar.
Sundar Pichai, CEO, Alphabet and Google: Thanks, Jim. Good afternoon, everyone.
We’re pleased with our strong results this quarter. We continued to see healthy growth and momentum across the business, including AI powering new features.
In Search, we saw continued double-digit revenue growth. AI Overviews is going very well with over 1.5 billion users per month, and we are excited by the early positive reaction to AI Mode. There’s a lot more to come ahead.
In Subscriptions, we surpassed 270 million paid subscriptions, with YouTube and Google One as key drivers.
And Cloud grew rapidly with significant demand for our solutions, and you saw our leadership in AI at Cloud Next across infrastructure, agents, and more.
Our differentiated, full-stack approach to AI continues to be central to our growth. This quarter was super exciting as we rolled out Gemini 2.5, our most intelligent AI model, which is achieving breakthroughs in performance, and it’s widely recognized as the best model in the industry. That’s an extraordinary foundation for our future innovation. And we are focused on bringing this to people and customers everywhere.
Looking ahead to I/O, Brandcast, and Google Marketing Live, I can’t wait for our teams to showcase the innovations they have been working on.
Turning to our AI progress this quarter, which continues to enable significant growth opportunities. The elements of the AI stack I’ve previously mentioned are AI infrastructure; world-class research, including models and tooling; and our products and platforms.
Starting with AI infrastructure, our long-term investments in our global network have positioned us well. Google’s network is robust and resilient, supported by over two-million miles of fiber and thirty-three subsea cables. Complementing this, we offer the industry’s widest range of TPUs and GPUs, and continue to invest in next-generation capabilities.
Ironwood, our seventh-generation TPU and most powerful to date, is the first designed specifically for inference at scale. It delivers more than 10X improvement in compute power over our recent high-performance TPU, while being nearly twice as power efficient.
Our strong relationship with NVIDIA continues to be a key advantage for us and our customers. We were the first cloud provider to offer NVIDIA’s groundbreaking B200 and GB200 Blackwell GPUs, and will be offering their next-generation Vera Rubin GPUs.
Second, this infrastructure powers our world-class research, including our industry-leading models. We released Gemini 2.5 Pro last month, receiving extremely positive feedback from both developers and consumers.
2.5 Pro is state-of-the-art on a wide range of benchmarks and debuted at number one on the Chatbot Arena by a significant margin.
2.5 Pro achieved big leaps in reasoning, coding, science and math capabilities, opening up new possibilities for developers and customers. Active users in AI Studio and Gemini API have grown over 200% since the beginning of the year.
And last week, we introduced 2.5 Flash, which enables developers to optimize quality and cost.
Our latest image and video generation models, Imagen 3 and Veo 2, are rolling out broadly and are powering incredible creativity.
Turning to open models, we launched Gemma 3 last month, delivering state-of-the-art performance for its size. Gemma models have been downloaded more than 140 million times.
Lastly, we are developing AI models in new areas where there’s enormous opportunity. For example, our new Gemini Robotics models.
And in health, we launched AI Co-Scientist, a multi-agent AI research system, while AlphaFold has now been used by over 2.5 million researchers.
Third, turning to products and platforms. All 15 of our products with a half a billion users now use Gemini models. Android and Pixel are two examples of how we are putting the best AI in people’s hands, making it super easy to use AI for a wide range of tasks, just by using their camera, voice, or taking a screenshot.
We are upgrading Google Assistant on mobile devices to Gemini; and later this year, we’ll upgrade tablets, cars and devices that connect to your phone, such as headphones and watches.
The Pixel 9a launched to very strong reviews, providing the best of Google’s AI offerings, like Gemini Live and AI-powered camera features.
And Gemini Live camera and screen sharing is now rolling out to all Android devices, including Pixel and Samsung S25.
Now, moving on to key highlights from across Search, Cloud, YouTube and Waymo.
First, Search. AI is one of the most revolutionary technologies for enabling and expanding our information mission. And for Search, we see it growing the number and types of questions we can answer. We’re already seeing this with AI Overviews, which now has more than 1.5 billion users every month.
Nearly a year after we launched AI Overviews in the U.S., we continue to see that usage growth is increasing as people learn that Search is more useful for more of their queries. So we’re leaning in heavily here, continuing to roll the feature out in new countries, to more users, and to more queries.
Building on the positive feedback for AI Overviews, in March we released AI Mode, an experiment in Labs. It expands what AI Overviews can do with more advanced reasoning, thinking, and multimodal capabilities to help with questions that need further exploration and comparisons. On average, AI Mode queries are twice as long as traditional Search queries.
We are getting really positive feedback from early users about its design, fast response time, and ability to understand complex, nuanced questions.
We also continue to see significant growth in multimodal queries. Circle to Search is now available on more than 250 million devices, with usage increasing nearly 40% this quarter. And monthly visual searches with Lens have increased by five billion since October.
Moving on to Cloud. At Cloud Next, we announced major innovations, and over 500 companies shared the business results they are achieving by working with us.
We provide leading cost, performance, and reliability for AI training and inference. This enables us to deliver the best value for AI leaders, like Anyscale and Contextual AI, as well as global brands like Verizon. And for highly sensitive data and regulatory requirements, Google Distributed Cloud and our Sovereign AI make Gemini available on premises or in country.
Our Vertex AI Platform makes over 200 foundation models available, helping customers, like Loews, integrate AI.
We offer industry-leading models, including Gemini 2.5 Pro, 2.5 Flash, Imagen 3, Veo 2, Chirp, and Lyria. Plus open-source and third-party models like Llama 4 and Anthropic.
We are the leading cloud solution for companies looking to the new era of AI agents; a big opportunity. Our Agent Development Kit is a new open-source framework to simplify the process of building sophisticated AI agents and multi-agent systems.
And Agent Designer is a low-code tool to build AI agents and automate tasks in over 100 enterprise applications and systems.
We are putting AI agents in the hands of employees at major global companies, like KPMG. With Google Agentspace, employees can find and synthesize information from within their organization, converse with AI agents, and take action with their enterprise applications. It combines enterprise search, conversational AI or chat and access to Gemini and third-party agents.
We also offer prepackaged agents across customer engagement, coding, creativity and more that are helping to provide conversational customer experiences, accelerate software development, and improve decision-making.
And, of course, Google Workspace. It delivers more than 2 billion AI assists monthly, including summarizing Gmail and refining Docs.
Lastly, our cybersecurity products are helping organizations detect, investigate, and respond to cybersecurity threats. Our expertise, coupled with integrated Gemini AI advances, detects malware, prioritizes threats, and speeds up investigative workflows.
This quarter, we were excited to announce our intent to acquire Wiz, a leading cloud security platform that protects all major clouds and code environments. Together we can make it easier and faster for organizations of all types and sizes to protect themselves end-to-end and across all major clouds. We think this will help spur more multi-cloud computing, something customers want.
Next, YouTube. Yesterday marked a historic milestone: the 20th anniversary of the first video uploaded to YouTube. From that single 19-second upload the platform has grown into a global phenomenon, fundamentally changing how billions of people create, share, and experience content.
Through all this growth, subscriptions are now a big part of the business. We continue to diversify subscription options, recently expanding our Premium Lite pilot to the U.S, giving users a new way to enjoy most videos on YouTube ad-free.
TV is the primary device for YouTube viewing in the U.S. According to Nielsen, YouTube has been number one in streaming watch time in the U.S. for the last two years. And YouTube now has over one billion monthly active podcast users.
YouTube Music and Premium reached over 125 million subscribers, including trials, globally.
And finally, Waymo is now safely serving over a quarter of a million paid passenger trips each week. That’s up 5X from a year ago.
This past quarter, Waymo opened up paid service in Silicon Valley. Through our partnership with Uber, we expanded in Austin and are preparing for our public launch in Atlanta later this summer. We recently announced Washington, D.C., as a future ride-hailing city, going live in 2026, alongside Miami. Waymo continues progressing on two important capabilities for riders; airport access and freeway driving.
Thanks to all of our employees for their work this quarter. It was a great start to the year, and Q2 will be even more exciting.
With that, Philipp, over to you.
Philipp Schindler, SVP and CBO, Google: Thanks, Sundar, and hello, everyone.
I’ll quickly cover performance for the quarter, and then frame the rest of my remarks around the progress we are delivering across Search, Ads, YouTube and Partnerships.
Google Services revenues were $77 billion for the quarter, up 10% year-on-year, driven by strong growth in Search and YouTube, partially offset by a year-on-year decline in Network revenues.
To add some further color to the performance, the 10% increase in Search and Other revenues was led by Financial Services, primarily due to strength in Insurance, followed by Retail.
YouTube saw similar performance across verticals. Its 10% growth in advertising revenues was driven by direct response, followed by brand.
So, let’s start with Search, where we’ve seen robust growth in revenues. All around the world, over 2 billion people use Search every day to find information, compare products or shop. And there are more than 5 trillion searches on Google annually.
We’ve continued our efforts to help more people ask entirely new questions, bringing more opportunities for businesses to connect with consumers, and as we’ve mentioned before, with the launch of AI Overviews, the volume of commercial queries has increased.
Q1 marked our largest expansion to date for AI Overviews, both in terms of launching to new users and providing responses for more questions. The feature is now available in more than 15 languages across 140 countries.
For AI Overviews, overall, we continue to see monetization at approximately the same rate, which gives us a strong base on which we can innovate even more.
Turning to visual queries. On the last Earnings Call, I mentioned the success we’re seeing with Lens, where shoppers use their camera or images to quickly find information in ways they couldn’t before. In Q1, the number of people shopping on Lens grew by over 10%, and the majority of Lens queries are incremental.
Sundar mentioned the significant growth we are also seeing with Circle to Search, as multimodality continues to drive queries across Search.
Moving to Ads. More businesses, big and small, are adopting AI-powered campaigns, and the deployment of AI across our Ads business is driving results for our customers and for our business.
Throughout 2024, we launched several features that leverage LLMs to enhance advertiser value, and we’re seeing this work pay off. The combination of these launches now allows us to match ads to more relevant search queries. And this helps advertisers reach customers in searches where we would not previously have shown their ads.
Focusing on our customers, we continue to solve advertisers’ pain points and find opportunities to help them create, distribute and measure more performant ads; infusing AI at every step of the marketing process.
On Audience insights, we released new asset-audience recommendations which tell businesses the themes that resonate most with their top audiences.
On creatives, advertisers can now generate a broader variety of lifestyle imagery, customized to their business to better engage their customers, and use them across PMax, Demand Gen, Display, and Apps campaigns. Additionally in PMax, advertisers can automatically source images from their landing pages and crop them, increasing the variety of their assets.
On media buying, advertisers continue to see how AI-powered campaigns help them find new customers.
In Demand Gen, advertisers can more precisely manage ad placements across YouTube, Gmail, Discover and Google Display Network globally, and understand which assets work best at a channel level.
Thanks to dozens of AI-powered improvements launched in 2024, businesses using Demand Gen now see an average 26% year-on-year increase in conversions per dollar spent for goals like purchases and leads. And when using Demand Gen with product feed, on average they see more than double the conversion per dollar spent year-over-year.
As an example, Royal Canin combined Demand Gen and PMax campaigns to find more customers for its cat and dog food products. The integration resulted in a 2.7 times higher conversion rate, a 70% lower cost per acquisition for purchasers, and increased the value per user by 8%.
Turning to YouTube, where we saw strong growth in revenues across Ads and Subscriptions. This week, we’re celebrating YouTube’s 20th anniversary. We’re proud of its leadership as a streaming destination where people come to watch everything they love; from live sports and creator-produced content, to Shorts and podcasts.
Creators are what drives viewership, and, on average, they upload 20 million videos a day to YouTube.
Our biggest creators generate a level of fandom and viewer engagement around large cultural moments on YouTube that brands can’t find anywhere else. During March Madness, brands aligned not only with clips and highlights from the game, but also with the creators who drive basketball culture, like Jesser and The Ringer’s J. Kyle Mann. In Q1, the growth of our reservation-based ads business more than doubled year-over-year.
Brands and creators continue to use the opportunities that collaborations and partnerships offer. Toyota worked with Zach King, the king of short magical videos with over 42 million followers, to take over his channel. The Creator Takeover, and accompanying Creator Ad, lifted Toyota’s brand awareness by 25%, compared to a control group, and 9% compared to the Toyota brand ad.
Looking at Shorts, engaged views grew by over 20% in the first quarter. We continue to be pleased with the progress we are making globally in Shorts monetization, relative to in-stream viewing, and are particularly encouraged by the trend in the U.S.
As always, I’ll wrap up with the strong momentum we are seeing in Partnerships, where our customers increasingly recognize the strength and breadth of what Google has to offer.
For instance, Roblox is partnering with Google Ad Manager to bring immersive ads to gamers. Gen Z gamers are Roblox’ biggest users, and thanks to our partnership, advertisers will be able to reach this audience with ads that blend seamlessly into the gaming experience.
We also launched a YouTube Shorts Effect to help people release iconic Roblox heads and inspire fans to create content at scale.
In closing, I’d like to thank Googlers everywhere for their contributions and commitment to our success, and to our customers and partners for their continued trust.
Anat, over to you.
Anat Ashkenazi, SVP and CFO, Alphabet and Google: Thank you, Philipp.
My comments will focus on year-over-year comparisons for the first quarter, unless I state otherwise.
I will start with results at the Alphabet level and will then cover our segment results. I’ll end with some commentary on our outlook for the second quarter and 2025.
We had another strong quarter in Q1. Consolidated revenues of $90.2 billion increased by 12%, or 14% in constant currency. Search and YouTube advertising; Subscriptions, Platforms and Devices; and Google Cloud each had double-digit revenue growth this quarter, reflecting strong momentum across the business.
Total Cost of Revenues was $36.4 billion, up 8%.
TAC was $13.7 billion, up 6%. We continue to see a revenue mix shift with Google Search growth at double-digit levels, while Network revenues, which have a much higher TAC rate, declined.
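To make the mix-shift point concrete, here is a rough sketch of how a blended TAC rate falls when lower-TAC Search grows while higher-TAC Network declines; the per-channel TAC rates are assumptions for illustration only, since Alphabet does not disclose them.

```python
# Hypothetical illustration of the TAC mix-shift effect.
# Per-channel TAC rates are assumed for illustration, not disclosed figures.

def blended_tac_rate(revenues_bn, tac_rates):
    """Weighted-average TAC rate across ad revenue streams."""
    total_tac = sum(revenues_bn[k] * tac_rates[k] for k in revenues_bn)
    return total_tac / sum(revenues_bn.values())

tac_rates = {"search_and_other": 0.20, "network": 0.70}   # assumptions

prior_year = {"search_and_other": 46.2, "network": 7.4}   # $B, approx. Q1 2024
this_year = {"search_and_other": 50.7, "network": 7.3}    # $B, Q1 2025 as reported

print(f"Prior-year blended TAC rate: {blended_tac_rate(prior_year, tac_rates):.1%}")
print(f"Current blended TAC rate:    {blended_tac_rate(this_year, tac_rates):.1%}")
# The blended rate declines as the lower-TAC Search mix grows faster than Network.
```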
Other Cost of Revenues was $22.6 billion, up 9%, with the increase primarily driven by content acquisition costs, largely for YouTube, followed by depreciation and other technical infrastructure operations costs.
Total operating expenses increased 9% to $23.3 billion.
R&D investments increased by 14%, primarily driven by increases in compensation and depreciation expenses.
Sales and Marketing expenses decreased 4%, primarily reflecting a decline in compensation expenses.
G&A expenses increased by 17%, reflecting the impact of charges for legal and other matters.
Operating income increased 20% this quarter to $31 billion. And operating margin increased to 33.9%, representing 2.3 points of margin expansion. Operating margin benefited from healthy revenue growth, a moderated pace of compensation growth, and a favorable mix shift towards lower TAC advertising revenues; partially offset by a year-on-year increase in depreciation expenses of just over $1 billion.
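As a quick arithmetic check on the rounded figures above (a back-of-the-envelope sketch, not a reconciliation of the filing):

```python
# Back-of-the-envelope check of the consolidated margin figures quoted in the call.
revenue_bn = 90.2
operating_margin = 0.339
margin_expansion_pts = 0.023

implied_operating_income = revenue_bn * operating_margin
implied_prior_margin = operating_margin - margin_expansion_pts

print(f"Implied operating income: ${implied_operating_income:.1f}B")  # ~$30.6B, "$31 billion" rounded
print(f"Implied prior-year margin: {implied_prior_margin:.1%}")       # ~31.6%
```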
Other income and expenses was $11.2 billion, primarily due to an unrealized gain on our non-marketable equity securities related to our investment in a private company, which we noted in our 10-K as a subsequent event.
Net income increased 46% to $34.5 billion. And Earnings per share increased 49% to $2.81.
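The gap between net income growth and EPS growth is consistent with a smaller diluted share count from buybacks; here is a minimal sketch using only the growth rates quoted above.

```python
# EPS grows faster than net income when the diluted share count shrinks.
net_income_growth = 0.46
eps_growth = 0.49

# EPS = net income / shares, so (1 + eps_growth) = (1 + ni_growth) / (1 + share_change)
implied_share_change = (1 + net_income_growth) / (1 + eps_growth) - 1
print(f"Implied change in diluted shares: {implied_share_change:.1%}")  # roughly -2%
```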
We delivered Free Cash Flow of $19 billion in the first quarter, and $74.9 billion for the trailing twelve months. We ended the quarter with $95 billion in cash and marketable securities.
Turning to segment results, Google Services revenues increased 10% to $77.3 billion, reflecting strength in Google Search and YouTube advertising and subscriptions.
Google Search and Other advertising revenues increased by 10% to $50.7 billion. The robust performance of Search was, once again, broad-based across verticals, led by Financial Services due primarily to strength in Insurance, followed by Retail.
YouTube advertising revenues increased 10% to $8.9 billion, driven by direct response advertising, followed by brand.
Network advertising [revenues] of $7.3 billion were down 2%.
Subscriptions, Platforms and Devices revenues increased 19% to $10.4 billion, primarily reflecting growth in subscription revenues. This growth was primarily driven by YouTube subscription offerings, followed by Google One, with growth in the number of subscribers being the biggest driver of revenue growth.
Google Services operating income increased 17% to $32.7 billion, and operating margin increased from 39.6% to 42.3%.
Turning to the Google Cloud segment, which continued to deliver very strong results this quarter, revenue increased by 28% to $12.3 billion in the first quarter, reflecting growth in GCP across Core and AI products at a rate that was much higher than Cloud’s overall revenue growth rate.
Growth in Google Workspace was primarily driven by an increase in average revenue per seat.
Google Cloud operating income increased to $2.2 billion, and operating margin increased from 9.4% to 17.8%. As we scale our fleet, we continue to focus on driving improvements in productivity, efficiency, and utilization to offset the growth in expenses, particularly from higher depreciation.
As to Other Bets, for the first quarter, revenues were $450 million, and operating loss was $1.2 billion. The year-on-year decline in revenue and increase in operating loss primarily reflect the milestone payment received in the first quarter of 2024 for one of our Other Bets.
With respect to CapEx, our reported CapEx in the first quarter was $17.2 billion, primarily reflecting investment in our technical infrastructure, with the largest component being investment in servers, followed by data centers, to support the growth of our business across Google Services, Google Cloud, and Google DeepMind.
In Q1, we returned value to shareholders in the form of $15.1 billion in share repurchases and $2.4 billion in dividend payments.
As we announced today, our Board of Directors declared a 5% increase in our quarterly dividend, and also approved a new $70 billion share repurchase authorization.
Turning to our outlook, I would like to provide some commentary on several factors that will impact our business performance in the second quarter and the remainder of 2025.
First, in terms of revenue, I’ll highlight a couple of items that we mentioned last quarter that will have an impact on second quarter and 2025 revenue.
First, in Google Services, advertising revenue in 2025 will be impacted by lapping the strength we experienced in the Financial Services vertical throughout 2024.
Second, in Cloud, we’re in a tight demand/supply environment. And given that revenues are correlated with the timing of deployment of new capacity, we could see variability in Cloud revenue growth rates, depending on capacity deployment each quarter. We expect relatively higher capacity deployment towards the end of 2025.
Moving to investments, starting with our expectation for CapEx for the full year 2025. We still expect to invest approximately $75 billion in CapEx this year. The expected CapEx investment level may fluctuate from quarter to quarter, due to the impact of changes in the timing of deliveries and construction schedules.
In terms of expenses, first, as I mentioned on our previous earnings call, the significant increase in our investments in CapEx over the past few years will continue to put pressure on the P&L, primarily in the form of higher depreciation.
In the first quarter, we saw 31% year-on-year growth in depreciation from the increase in technical infrastructure assets placed in service. Given the increase in CapEx investments over the past few years, we expect the growth rate in depreciation to accelerate throughout 2025.
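To see mechanically why depreciation growth accelerates after several years of rising CapEx, here is a deliberately simplified straight-line model; the six-year useful life, the 90% depreciable share, same-year in-service timing, and the rounded CapEx path are all illustrative assumptions, not company disclosures.

```python
# Simplified illustration of why depreciation keeps accelerating after CapEx steps up.
# Assumptions (illustrative only): straight-line depreciation, 6-year useful life,
# 90% of each year's CapEx is depreciable, assets placed in service the same year.

USEFUL_LIFE = 6
DEPRECIABLE_SHARE = 0.9

capex_bn = {2018: 25.1, 2019: 23.5, 2020: 22.3, 2021: 24.6,
            2022: 31.5, 2023: 32.3, 2024: 52.5, 2025: 75.0}  # approximate / planned

def depreciation(year):
    """Sum the straight-line charge from every CapEx cohort still in its useful life."""
    return sum(c * DEPRECIABLE_SHARE / USEFUL_LIFE
               for y, c in capex_bn.items()
               if 0 <= year - y < USEFUL_LIFE)

for year in (2024, 2025):
    growth = depreciation(year) / depreciation(year - 1) - 1
    print(f"{year}: ~${depreciation(year):.0f}B depreciation, {growth:+.0%} vs prior year")
# The 2025 growth rate exceeds 2024's because the $52.5B and $75B cohorts stack on top
# of charges still running from earlier years.
```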
Second, as we’ve previously said, we expect some headcount growth in 2025 in key investment areas. As we’ve disclosed previously, due to a shift in timing of our annual employee stock-based compensation award beginning in 2023, our first quarter stock-based comp expense is relatively lower compared to the remaining quarters of the year.
In conclusion, as you heard from Sundar and Philipp, we’re pleased with the progress we’re making across the organization, the results for the quarter and the opportunities ahead.
Our success as a company is grounded in our experience driving advancements in deep computer science that enables us to create innovative new products and services for users, businesses and partners around the world.
We have a strong track record of incubating and then building these offerings into new profitable businesses for Alphabet.
As we announced last quarter, YouTube and Cloud exited 2024 at a combined annual run rate of $110 billion.
And as you heard from Sundar earlier, Waymo is continuing to progress in building on its impressive technological achievements to scale rapidly and develop a sustainable business model.
Thank you. Sundar, Philipp, and I will now take your questions.
Operator: Thank you. As a reminder, to ask a question, you will need to press *1 on your telephone. To prevent any background noise, we ask that you please mute your line once your question has been stated.
And our first question comes from Brian Nowak from Morgan Stanley. Your line is now open.
Brian Nowak (Morgan Stanley): Great. Thanks for taking my questions. I have two.
The first one is sort of the macro advertising backdrop. Anat, I know it’s April 24th, and you call out some factors thinking about the second quarter. Any other factors you’re seeing in advertising verticals or regions or categories that could be showing any signs of weakness quarter-to-date we should think through? Any other changes from typical seasonality in 2Q25 versus prior quarters?
And the second one, Philipp, I think I heard you mention how the volume of commercial queries has increased. Maybe can you just walk us through which of the products are driving that increase in commercial queries? And as you sort of think about the pipeline of Search products, are there any others that you’re particularly excited about to continue to drive further commercial query growth throughout ’25-'26? Thanks.
Philipp Schindler, SVP and CBO, Google: So let me take the first one as well. We saw broad-based strength across ad verticals in Q1, and we saw it -- I’ll give you a bit of vertical color here.
Search was led again by Finance, due primarily to ongoing strength in Insurance. Retail, Healthcare, and Travel were actually also sizable contributors here to growth.
With regard to Q2, we’re only a few weeks in, so it’s really too early to comment. I mean, we’re obviously not immune to the macro environment, but we wouldn’t want to speculate about potential impacts beyond noting that the changes to the de minimis exemption will obviously cause a slight headwind to our ads business in 2025, primarily from APAC-based retailers.
And maybe to zoom out, I would say we have a lot of experience in managing through uncertain times, and we focus on helping our customers by providing deep insights into changing consumer behavior that is relevant to their business. Examples are auction dynamics, query trend insights on topics like replacement purchases and so on. So, we have a lot of experience in this area.
On the commercial query side, look, AI Overviews continue to drive higher satisfaction and Search usage. And as I noted, Q1 was really our largest expansion to date for AI Overviews, both in terms of launching to new users and providing responses for more questions. That’s really the core of the answer: AI Overviews sits at the center of your question here.
And when it comes to other products, look, I don’t want to speculate on this, but we’re happy with what we’re seeing here on AI Overviews. And I’m confident we can expand this to more products over time.
Operator: Thank you. Your next question is from Doug Anmuth from JPMorgan. Your line is now open.
Doug Anmuth (JPMorgan): Thanks for taking my questions.
Philipp, maybe just to go back to AI Overviews for a moment. Can you just tell us how we should think about the 1.5 billion AI Overviews users, just in terms of breadth of rollout? I know you’re saying monetization at approximately the same rate. But what does that mean in terms of click-through rates and conversion?
And then, Anat, just curious if there have been any changes to Google’s approach to durably re-engineering the cost base since you’ve joined? And if macro weakens and you see more of a slowdown, would you expect to find additional opportunities to cut back more on costs? Thank you.
Philipp Schindler, SVP and CBO, Google: Look, on ads in AI Overviews: late last year, we launched them within AI Overviews on mobile in the U.S., and this builds on our previous rollout of ads above and below the Overviews, so that was the change we made.
As I talked about before, for AI Overviews overall, we see monetization at approximately the same rate, which gives us a strong base on which we can innovate even more, so I’m very happy with this. I don’t think this is the moment to go into the details of click-through rates and conversions and so on.
But overall, we’re happy with what we’re seeing.
Anat Ashkenazi, SVP and CFO, Alphabet and Google: And to your question on our approach to productivity and efficiency, it hasn’t really changed. I’ve mentioned my approach, and our approach as a company, at the end of 2024, and we’re still focused on driving efficiency and productivity throughout the organization, both in our operating expenses and in our CapEx. I’ve mentioned some of these during my prepared remarks.
But certainly, this helps us as we think about the investments we need to make in innovation to drive a sustainable long-term growth profile for the company; we’re able to repurpose some of these efficiencies into those investments.
In addition, as you think about the increase in CapEx we’ve seen over the past several years, and what we’re investing this year, that will put additional pressure on the income statement in the form of depreciation, so we’re working hard to offset some of those headwinds.
And within the CapEx investments themselves, the $75 billion, we’re looking at how we make sure every dollar is used efficiently. We have a highly rigorous process to determine the demand behind it, and then the allocation of the compute associated with our technical infrastructure investments, ensuring that we’re utilizing it appropriately and that we’re highly efficient with everything we’re doing.
You’ve seen some of the announcements and some of the changes, but we’re focusing on continuing to moderate the pace of compensation growth, looking at our real estate footprint, and again the buildout and utilization of our technical infrastructure across the business.
Doug Anmuth (JPMorgan): Great. Thank you both.
Operator: Thank you. Our next question comes from Eric Sheridan from Goldman Sachs. Your line is now open.
Eric Sheridan (Goldman Sachs): Thank you for taking the questions.
First, maybe for Sundar. When you look across the consumer AI landscape today, how are you thinking about continuing to drive differentiation for Gemini as a platform through the lens of usage, utility, or putting product innovation at the forefront of driving consumer habits?
And then the second one, maybe for Anat. If the macro environment were to change and become more downwardly volatile, how should investors think about the investments that are must-make this year, or almost fixed in nature, versus where there might be more flexibility to alter the investment priorities of the company if the macro environment were to worsen? Thank you so much.
Sundar Pichai, CEO, Alphabet and Google: Thanks, Eric.
Obviously, it’s an exciting moment on the AI front. I think the foundation for everything is the frontier model progress we are seeing, particularly with 2.5 Pro and Flash; I think we’re well-positioned. We are seeing tremendous reception from developers, enterprises, and consumers, too.
And obviously, we are delivering consumer AI experiences across our product portfolio. The primary way people experience it is in Search, with AI Overviews and, in very early days, AI Mode, which will be a consumer AI-forward experience, and we are already seeing very positive feedback. People are typing in roughly 2X longer queries compared to traditional Search. So there’s a lot of excitement there.
And in the Gemini app, which you asked about, we’ve really seen increased momentum, particularly over the last few weeks as we have rolled out the newer models, and we’re seeing users respond really well to all the innovation. Gemini Live, which is based on Project Astra, has been very well-received. DeepResearch, based on 2.5 Pro, is SOTA, and that’s been well-received. And with Canvas, we have had a lot of traction as well.
And so, we are definitely investing more. We have recently organized ourselves better to capitalize on this momentum, and I’m excited about our roadmap there.
Anat Ashkenazi, SVP and CFO, Alphabet and Google: And on the investments this year and overall should there be any macroeconomic changes. As I said, we’re still planning to invest approximately $75 billion in CapEx this year. We do see a tremendous opportunity ahead of us across the organization, whether it’s to support Google Services, Google Cloud and Google DeepMind.
Recall I’ve stated on the Q4 call that we exited the year in Cloud specifically, with more customer demand than we had capacity. And that was the case this quarter as well. So, we want to make sure we ramp up to support customer needs and customer demands.
Having said that, we’re investing in long term, and we’re investing in innovation. That’s the essence of our business, and we want to do it in a responsible fashion.
So, over the past couple of years you’ve seen us drive efficiency and productivity throughout the business, we’re continuing to do this, and you’re seeing it in our results.
We’ve announced things such as consolidation of teams, which helps not just with cost but with velocity and speed; we’re able to get things to market faster, so that’s one of the areas we’re focused on. You’ve heard from Sundar over the last couple of calls about the rapid pace of innovation we’re bringing to the marketplace.
So, the way we’re doing this across the business to drive productivity and efficiency should help us have a more resilient organization, irrespective of macroeconomic conditions.
But certainly, we don’t ignore that. We always look at what’s happening outside the organization as well as inside, but invest appropriately to drive both the short-term growth as well as the long-term growth.
Eric Sheridan (Goldman Sachs): Thank you.
Operator: Thank you. Our next question comes from Ross Sandler from Barclays. Your line is now open.
Ross Sandler (Barclays): Great, thanks. One for Sundar; one for Philipp.
Sundar, it was disclosed this week in the trial that’s going on that Gemini has 35 million DAUs. And just curious, that number obviously trails ChatGPT by a pretty wide margin. Could you just talk about the strategy to get that DAU figure much higher that you guys are deploying?
And then, Philipp, just curious to hear what you’re seeing on the brand advertising side at YouTube in 1Q and into early 2Q. Are brands holding up relatively well, like direct response, or are they starting to react to some of these macro jitters that we’re all experiencing? Any thoughts there? Thank you very much.
Sundar Pichai, CEO, Alphabet and Google: Thanks, Ross. I think I touched upon this in Eric’s question as well. But definitely, I think there’s been a lot of momentum in terms of the product features we’ve been introducing. And we are definitely seeing reception, including increased adoption and usage, based on those features. So, I think we are in a good, positive cycle.
The recent advances on the model frontier, by many metrics, I think we have the best model out there now, and I think that’s going to drive increased adoption as well.
And, again, I would reiterate: we have 1.5 billion users interacting with AI through AI Overviews in a deep, very engaged way.
Obviously, we are innovating with AI Mode. And we have a very exciting roadmap ahead with the Gemini app as well. So, across the board, super, super excited about what’s ahead.
Philipp Schindler, SVP and CBO, Google: And on your brand question, look, brand, and by the way, also direct response had very solid growth in Q1.
Brand advertisers really enjoyed cultural moments we had, like Coachella, for example, or March Madness. We had strong contributions overall from Finance and Retail verticals in Q1 on this side as well.
The operating metrics for YouTube were strong in Q1. Watch time growth remains robust, particularly in key monetization opportunity areas, such as Shorts and Living Room.
It’s also, by the way, nice to see the strong position of our creators who obviously benefit from the brand piece, which gives us a lot of confidence when we look at it more closely.
And on the Q2 side, I think I mentioned it’s too early to really comment on that.
Operator: Thank you. Your next question is from Mark Shmulik from Bernstein. Your line is now open.
Mark Shmulik (Bernstein): Great. Thanks for taking my questions.
Sundar, appreciate the color on Gemini deployment across kind of the 15 products with half a billion users or more. It would be great to hear more about where you’re seeing the most usage and deployment of GenAI internally at Google. Perhaps, what are the capabilities? Are they in a place today in terms of either supplementing or augmenting the workforce?
And then just to build on the earlier AI Mode-type questions, I appreciate that AI Mode has 2X longer queries than traditional Search. But is there any color you can share, perhaps, on how AI Mode’s behavior differs from how consumers are using the Gemini app? Thank you.
Sundar Pichai, CEO, Alphabet and Google: Look, internally, there has been an extraordinary amount of focus and excitement, both because the early use cases have been transformative in nature and because it still feels like early days, with a long way to go.
Obviously, I had mentioned a few months ago, in terms of how we are using AI for coding, we are continuing to make a lot of progress there in terms of people using coding suggestions. I think, the last time I said a number, it was, like, 25% of code that’s checked in involves people accepting AI-suggested solutions. That number is well over 30% now.
But more importantly, we have deployed deeper flows, and particularly with the newer models, I think we are working on early agentic workflows and how we can get those coding experiences to be much deeper.
We are deploying it across all parts of the company. Our customer service teams are really leading the way there. We have both dramatically enhanced our user experience and made it much more efficient to do so. And we are actually bringing all our learnings and expertise in our solutions, through Cloud, to our other customers.
But beyond that, all the way from the Finance team preparing for this earnings call, to everything, it’s deeply embedded in everything we do. But I still see it as early days, and there’s going to be a lot more to do.
On AI Mode, look, I think we are just leaning in on the early positive feedback as we scale up AI Overviews. It’s been one of our most positive launches, but it’s been clear people have wanted even more of it.
And so with AI Mode, we are bringing our state-of-the-art Gemini models right into Search. I mentioned people typing longer queries; there are a lot more complex, nuanced questions. People are following through more. People are appreciating the clean design, the fast response time, and the fact that they can be much more open-ended and undertake more complicated tasks. Product comparisons, for example, have been a positive one, as have exploring how-tos and planning a trip.
So, those are the kinds of early feedback we are seeing, and I think we are, obviously, really focused on improving the product across all of AI Mode, AI Overviews, and the Gemini app. And we are seeing positive user traction as well.
Operator: Thank you. Your next question is from Mark Mahaney from Evercore. Your line is now open.
Mark Mahaney (Evercore): Okay, thanks. One for Anat and one for Sundar.
Anat, getting back to a question I think that Doug was asking earlier, you just put up record-high or multiyear record-high margins for both Google Services and for Google Cloud.
You talked, though, about depreciation expenses accelerating rapidly throughout the year because of all the investments you’ve warned people about.
Back in the September quarter, you seemed relatively confident that you had enough levers to kind of offset kind of rising infrastructure costs. Was that still your -- six months later, is that still your view that you’ve got enough levers, that even with the rising infrastructure costs, there’s enough in there to kind of counterbalance that?
And then just briefly on Waymo, the numbers continue to rise aggressively, Sundar. On the long-term business model for Waymo, is there a reason to make a decision on that soon? Or have you already made the decision on whether this is a long-term licensing model, or whether you really want to run this as a standalone ride-sharing, delivery, autonomous vehicle business? Thank you very much.
Anat Ashkenazi, SVP and CFO, Alphabet and Google: Thanks.
On your first question on profitability, and what levers we have and whether we still have levers to pull: first, I think every organization can always push a little further. I don’t view productivity or efficiency goals as an episodic, project-based effort but rather a continuous effort, where when you get to a certain place you push a little further.
Having said that, we do have significant investments we’re making across the organization, and we have been making them for the past several quarters. And we’ve been able to do it because we were able to find efficiency to fund those investments across the organization, those are for products and services that are going to drive long-term growth for the company.
So, while we’re trying to offset as much of the headwind associated with the increase in infrastructure costs, it will become more difficult. As I said, the depreciation will accelerate. We had about a 31% year-over-year growth in depreciation this quarter, and it will be higher as we go throughout the year. So, think about that kind of as a headwind that we have to manage against.
But we’re continuing to push across the organization, leveraging, as Sundar mentioned, the use of AI, of an AI-first Google, across several of our functions, to help us manage a larger scope of work using AI, AI agents, and AI tools. As Sundar mentioned, we did leverage it in preparation for this earnings call, and we’re leveraging it across several functions.
So, there are opportunities, but there are also great opportunities for investment. And we want to make sure that we make room to make these investments to drive long-term growth and ensure we have a very resilient, long-term growth profile for the company.
Sundar Pichai, CEO, Alphabet and Google: And, Mark, thanks. I think this is probably the first question I’ve got on an earnings call on Waymo, so thank you and I think it’s a sign of its progress.
Look, the thing that excites me is I think we’ve been laser-focused, and we’ll continue to be, on building the world’s best driver. And I think doing that well really gives you a variety of optionality and business models across geographies, et cetera.
It will also require a successful ecosystem of partners, and we can’t possibly do it all ourselves. And so I’m excited about the progress the teams have made through a variety of partnerships.
Obviously, a highlight of it is our partnership with Uber. We are very pleased with what we are already seeing in Austin in terms of rider satisfaction. We look forward to offering the first paid rides in Atlanta, via Uber, later this year.
But we are also building up a network of partners. For example, for maintaining fleets of vehicles and doing all the operations related to that, with the recently announced partnership with Moove in Phoenix and Miami. Obviously partnerships with OEMs. There are future optionality around personal ownership as well.
So, we are widely exploring, but at the same time, clearly staying focused and making progress, both in terms of safety, the driver experience, and progress on the business model and operationally scaling it up.
Mark Mahaney (Evercore): Thank you very much.
Operator: Thank you. Our next question comes from Ken Gawrelski from Wells Fargo. Your line is now open.
Ken Gawrelski (Wells Fargo): Thank you. Two, if I may please.
First, on AI-powered Search. You have a number of AI-powered search interfaces, including three most prominently: AI Overviews, AI Mode, and Gemini.
In the future, should we think of these as distinct experiences that will be long-lasting, or more experimental now and Google will eventually focus on one approach going forward?
And the second one is more on the financial side. You continue to experience very healthy gross-margin expansion. We see the TAC, sure. But, Anat, you also talked about the offsetting depreciation expense.
Could you talk about, beyond those two buckets, where you’re seeing the real savings on the COGS line in driving that gross-margin expansion? And maybe even how we should be thinking about that going forward? Thank you.
Sundar Pichai, CEO, Alphabet and Google: Ken, maybe on AI-powered search and how we see our consumer experiences: look, I do think Search and Gemini, obviously, will be two distinct efforts, right. I think there are obviously some areas of overlap, but they also expose very, very different use cases.
And so, for example, in Gemini, we see people iteratively coding and going much deeper on a coding workflow as an example. So, I think both will be around.
Within Search, we think of AI Overviews as scaling up and working for our entire user base, but AI Mode is the tip of the tree for us, pushing forward on an AI-forward experience.
There will be things we discover there that will make sense in the context of AI Overviews, so I think they will flow through to our user base. But you almost want to think of: “What are the most advanced one million people using Search for? The most advanced 10 million people? And then what do a billion and a half people use Search for?” We want to innovate, and I think this allows us to do that.
But the true north through all of this is user feedback, user satisfaction, user experience. And so, that will determine where this all works out in the future.
Anat Ashkenazi, SVP and CFO, Alphabet and Google: And to your question on gross margin, a couple of trends to highlight there. And I’ve mentioned this in the prepared remarks. You’ve seen improvement in TAC that’s really driven by the change in revenue mix, with continued Search growth and then Network revenue declines. Network revenue has a much higher TAC rate, so that mix is helping us from a gross-margin perspective. So, think about that as well.
Now, we do have depreciation for technical infrastructure; it hits primarily in two places in the income statement. One is in Other Cost of Revenues, and the rest is in R&D. So, it is that line item that impacts cost of sales.
Now, we’ve had some efficiencies there. And I did mention the moderation in our overall headcount growth and compensation costs. So, that helped us as well, more than offsetting the depreciation increases in Q1.
But as I’ve mentioned, this number will be higher in the coming quarters. Recall, we said approximately $75 billion in CapEx, which is up from just over $50 billion last year. So, there is expected to be quite a significant increase in depreciation.
Operator: Thank you. And our last question comes from Ron Josey from Citi. Your line is now open.
Ron Josey (Citi): Great. Thanks for taking the question.
Philipp, I wanted to touch a little bit more on your comments around direct response and YouTube. I think it’s been improving and been a driver over the past couple quarters. I’d love to hear more just about what’s driving that. Is that the Demand Gen and integration with PMax? Or are users perhaps more involved on direct response now that Shorts usage is rising? Would love your thoughts there. Thank you.
Philipp Schindler, SVP and CBO, Google: Yeah, I think there are a lot of different factors. Mostly, we continue to help our customers really use our AI-powered tools, you mentioned a few of them, to drive performance. That’s a very big one.
As I mentioned before, we’re also happy with the progress we’re seeing on Shorts and closing the monetization gap here to the overall business, which is actually really nice to see, especially in the U.S. So, we’re very happy with that, yeah.
Sundar Pichai, CEO, Alphabet and Google: And I’ll just chime in to say, YouTube just celebrated its 20th birthday and we now have more than 20 billion videos on YouTube, and we get 20 million videos uploaded every day.
So, I think it’s a tremendous platform, and thanks to all the creators and users who have supported us there over the years.
Ron Josey (Citi): Great, thank you.
Operator: Thank you. And that concludes our question-and-answer session for today.
I would like to turn the conference back over to Jim Friedland for any further remarks.
Jim Friedland, Senior Director, Investor Relations: Thanks, everyone, for joining us today. We look forward to speaking with you again on our second quarter 2025 call. Thank you, and have a good evening.
Operator: Thank you, everyone. This concludes today’s conference call. Thank you for participating. You may now disconnect.