
Adobe marks 30 years of Photoshop with new iPad and Mac features

On February 19, 1990, Adobe released Photoshop 1.0 for Mac System 6. Thirty years later, it’s impossible to overstate how transformative the image editing software has been to all of our lives. To mark today’s milestone, Adobe is looking to the future by adding new features to Photoshop on the iPad, and celebrating its legacy with updates for the Mac.

Since the app’s initial release last November, Adobe has worked quickly to add missing desktop features to Photoshop on the iPad and introduce new functionality in areas where the touch interface shines. The app has now surpassed 1 million downloads, and users have created 2.8 million cloud documents. The most significant addition today is the Object Selection tool, first released on the desktop three months ago.

The Object Selection tool uses machine learning to make selections quick and accurate, especially when images have multiple objects. Rough selections are automatically refined based on the content in your image. The same functionality and settings from macOS are present on the iPad version. A Refine Edge brush will ship in the future.

Type layer, character, and option properties have been added. Adobe says this includes common typographic controls like tracking, leading, scaling, and formatting of text. Kerning support is promised in a future release.

Cloud document upload and download speeds have been improved for all PSDs 10MB or larger. Depending on file size and network performance, Adobe claims up to 90% faster uploads and downloads are possible.

On the Mac, Photoshop now supports macOS Dark Mode for system dialogs. The Content-Aware Fill workspace has also been improved: multiple selections and fills can be made without leaving and re-entering the workspace each time. This streamlined workflow was a top request of Photoshop users.

Output quality and performance of the Lens Blur effect have been improved thanks to GPU processing. Users should see improved realism and sharpness, as well as more colorful bokeh. Adobe explained the algorithm powering Lens Blur in detail:

The results are created by an algorithm the team built by studying the first principles of physics and how light interacts with objects in the real world. It is carefully tuned to simulate a 3D environment to create the most realistic results possible, while also consuming the least amount of computer power so you don’t burn up your machine. Lots of research and iteration occurred to make the feature. Several PhDs were involved. And now you can synthetically adjust the depth of field by dynamically manipulating the blur of a 2D image after capture in milliseconds.

Today’s Photoshop updates should be rolling out now on the desktop and iPad if you’re a Creative Cloud subscriber. For even more information, read Adobe’s blog post. Adobe is encouraging fans to share their 30th anniversary Photoshop memories with the hashtag #PsILoveYou30.

Lead image by Kelly Castro




Author

Michael Steeber

Michael is a Creative Editor who covered Apple Retail and design on 9to5Mac. His stories highlighted the work of talented artists, designers, and customers through a unique lens of architecture, creativity, and community.

Contact Michael on Twitter to share Apple Retail, design, and history stories: @MichaelSteeber