SW Squadrons New Republic Hangar
Image: Motive Studios/EA

Electronic Arts has released revised PC system requirements for its upcoming multiplayer space-combat game, Star Wars: Squadrons. Revisions like this are not unusual, as games continue through testing and development ahead of their official release. The updated specs primarily concern the CPU and storage requirements; no changes have been made to the recommended GPUs so far. Requirements are listed for both VR and non-VR play, and it is worth noting that the Recommended (non-VR) / Minimum (VR) and Recommended (VR) tiers are largely the same, differing only in graphics card.

Recommended (non-VR) / Minimum (VR)

  • OS: Windows 10 64-bit
  • Processor (AMD): Ryzen 7 2700X
  • Processor (Intel): Intel i7-7700
  • Memory: 16GB
  • Graphics Card (AMD): Radeon RX 480 or Equivalent
  • Graphics Card (Nvidia): GeForce GTX 1060 or Equivalent
  • DirectX: 11.1
  • Multiplayer Online Connection Requirements: 512 Kbps or faster Internet connection
  • Hard Drive Space: 40GB

Recommended (VR)

  • OS: Windows 10 64-bit
  • Processor (AMD): Ryzen 7 2700X
  • Processor (Intel): Intel i7-7700
  • Memory: 16GB
  • Graphics Card (AMD): Radeon RX 570 or Equivalent
  • Graphics Card (Nvidia): GeForce GTX 1070 or Equivalent
  • DirectX: 11.1
  • Multiplayer Online Connection Requirements: 512 Kbps or faster Internet connection
  • Hard Drive Space: 40GB

Minimum (non-VR)

  • OS: Windows 10
  • Processor (AMD): Ryzen 3 1300X
  • Processor (Intel): Intel i5-6600K
  • Memory: 8GB
  • Graphics Card (AMD): Radeon HD 7850 or Equivalent
  • Graphics Card (Nvidia): GeForce GTX 660 or Equivalent
  • DirectX: 11.1
  • Multiplayer Online Connection Requirements: 512 Kbps or faster Internet connection
  • Hard Drive Space: 40GB

Star Wars: Squadrons is set to release on October 2nd. It is currently listed at $39.99, but you can get it for $5 off through Origin. While there will be no microtransactions, pre-orders on Origin or Steam will include bonus cosmetic content.

Peter Brosdahl


Join the Conversation


  1. Ryzen 2700X, 16GB… I just meet the specs, although I do plan an upgrade if the Ryzen 4000 desktop stuff actually comes out this year.
  2. Wow, seems to lean pretty heavily on the CPU recommendations. That, or the GPU specs are much lower than I would have thought.
  3. I thought the GPU specs were low myself. Maybe they’ll change them again at some point.
  4. Yeah, the GPU seems low compared to the CPU requirements. My 4690K doesn’t make the cut, but my GTX 1080 does.
  5. Makes me think the game is multithread aware and more computational… But… the 7700K is not a powerful CPU any more… I have a motherboard (Mai gaming g7, I think) with a 7700K and 32GB of RAM in it sitting on a shelf. Anyone want to buy it? Lol

    No, not really a powerful CPU today, but it’s also not really that much better than… pretty much any i7 series that came before it, either.

    Not to totally sidetrack the thread, but I agree on both fronts. It’s actually what got me off the Intel train. I built my first ‘i’ series PC with a 2600K gently OC’d to 4.2 GHz. That thing did the job all the way until 2018 or so for most games I played. I watched as each gen after it required a new socket, added maybe one or two features to the chip, and maybe, just maybe, squeezed another 100 or 200 MHz out of it. It was utterly ridiculous to watch that pattern all the way to the 8700K. BTW, that rig has been revived again for work-at-home duty for my other job, with my old 1080 Ti in it. Still kicks butt for what it is.

  6. I’ve still got an i7-920 running a SQL server at work. Not claiming it will run games or compete with a 7700 – there was a big leap to Sandy, and then a whole lot of small, unexciting incremental changes, as Peter says.

    When I went from a 2600K to the 7700K it was a big speed bump for my bus speeds and such. BUT… it wasn’t a technology bump, like when I got the 3900X. That one isn’t a lot ‘faster’ at day-to-day tasks, but it NEVER bogs down. ;)

    Yep, I replaced mine with the 3700X. At this rate, the 4930K will likely be replaced by a Ryzen 4000 or 5000 series. That one reminds me of some oversized Plymouth from the late ’60s: big and heavy, but once it gets going, watch out.

  7. If it’s heavily multi-threaded, I welcome that. Isn’t that what we’ve been pushing for? Use all these **** cores we pay for and pull some of the load off the GPU? Lately it’s a hell of a lot cheaper to upgrade the mobo/CPU than it is to upgrade the GPU.

    So, yeah, utilize all those cores and get us back to where a mid-range GPU is actually good!

    When I was researching this story, I came across this on Twitter. They’re making fun of Intel, but I’d say it’s somewhat true of multithreading in general.
