Defence Industry

Chief of Navy closes Sea Power Conference

Chief of Navy, Vice Admiral Tim Barrett, AO, CSC, RAN, spoke at the closing ceremony of the 2017 Sea Power Conference

D+I Newsroom

Last week the Royal Australian Navy hosted the biennial Sea Power Conference, with senior naval delegations from around the world descending on Sydney for three days of discussions and Navy-to-Navy engagements.

Run alongside the Pacific 2017 international maritime exposition, the Sea Power Conference is Navy’s premier gathering of naval chiefs, and this year explored the broad theme of ‘The Navy and the Nation’, focusing on maritime identity, the significance of maritime economics and the use of the oceans.

Chief of Navy, Vice Admiral Tim Barrett, AO, CSC, RAN, spoke at the closing ceremony of the 2017 Sea Power Conference on board the recently commissioned Air Warfare Destroyer HMAS Hobart.



Navy

Future Frigate capability described by Chief of Navy

SEA 5000 Phase 1 Announcement

D+I Newsroom

Photo: Department of Defence

Chief of Navy, Vice Admiral Tim Barrett, AO, CSC, RAN, describes the capability of the Future Frigate to members of the ship’s company of Air Warfare Destroyer NUSHIP Brisbane at Osborne in South Australia.

On 29 June 2018, BAE Systems Australia was announced as the successful tenderer to design the Global Combat Ship – Australia, the Hunter class frigates, to be built by ASC Shipbuilding at the Osborne Naval Shipyard in Adelaide, South Australia.

The SEA 5000 Phase 1 Future Frigate Program will deliver the Hunter class anti-submarine warfare frigates.

The Hunter class will enter service in the late 2020s, replacing the eight Anzac class frigates that have been in service since 1996.

The Hunter class will have the capability to conduct a variety of missions independently, or as part of a task group, with sufficient range and endurance to operate effectively throughout the region.

The frigates will also have the flexibility to support non-warfare roles such as humanitarian assistance and disaster relief.

Incorporating the leading-edge Australian-developed CEA Phased-Array Radar and the US Navy’s Aegis combat management system, with an Australian interface developed by Saab Australia, the Hunter class will be one of the most capable warships in the world.


Army

Army’s Hawkei demonstrates Operational Intelligence

The CASG Land 121 Phase 4 project team has put the Hawkei protected vehicle’s new C4I Integral Computing System (ICS) to the test.

Terry Turner


The Capability Acquisition and Sustainment Group’s (CASG) Land 121 Phase 4 project team has put the Hawkei protected vehicle’s new C4I Integral Computing System (ICS) to the test, during a demonstration of the deployable Protected Mobility Vehicle – Light (PMV-L) capability at the Monegeetta Proving Ground.

During the activity a Project Charter for the ICS was signed, guiding the interaction and collaboration between Defence and the nine contractors involved in delivery of the ICS: Thales Australia, Cablex, Elbit Land Systems Australia, Esterline, Harris Australia, Kongsberg, Raytheon, Rockwell Collins, and Thomas Global Systems.

Land 121 Phase 4 Project Director COL John McLean said that the ICS reflects the Australian Army’s requirement for a more integrated approach to C4I on vehicles that realises benefits in the areas of usability, space, weight and power.

“Using generic vehicle architecture (GVA) and a central computing concept to host various C4I systems and communications, the ICS will optimise and centralise the flow of information to the user, enabling rapid decision making and multitasking at levels not previously achievable on land based platforms,” COL McLean said.

“The new ICS will enable the vehicle operator to manage radios, sensors, the Battle Management System, and weapon systems – all through a common interface.

“The ICS was demonstrated to work successfully with Army’s Battle Management System and communications suite, as well as other features and systems of the deployable PMV-L capability.”

While the demonstration was a contractual requirement for Hawkei’s Bendigo-based manufacturer Thales, it also gave the invited Defence stakeholders an opportunity to see the deployable PMV-L’s various planned features, including:

  • Integral Computing System (ICS) Command Vehicle installation
  • Battle Management System (BMS) operating on Windows 10
  • Advanced Field Artillery Tactical Data System (AFATDS)
  • Digital Terminal Control Station (DTCS)
  • Interactive Electronic Technical Manuals (IETMs)
  • Force Protection Electronic Counter Measures (FPECM)
  • Rocket Propelled Grenade Cage
  • Remote Weapon Station
  • Manned Weapon Mount

Visitors were also able to experience the handling characteristics and performance of the Hawkei through an interactive patrol demonstration.

The initial baseline of the C4I ICS will be available on low-rate initial production vehicles from 2018.


Technology

United Nations urged to ban lethal autonomous weapons

The world’s top AI and robotics companies have urged the United Nations to ban lethal autonomous weapons, often called ‘killer robots’.

Terry Turner


The world’s top AI and robotics companies have urged the United Nations to ban lethal autonomous weapons, often referred to as ‘killer robots’.

The open letter by leaders of top robotics and AI companies was launched at the world’s biggest artificial intelligence conference, as the UN delays a meeting to discuss the robot arms race until later this year.

An open letter signed by 116 founders of robotics and artificial intelligence companies from 26 countries urges the United Nations to urgently address the challenge of lethal autonomous weapons (often called ‘killer robots’) and to ban their use internationally.

A key organiser of the letter, Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales in Sydney, released it at the opening of the International Joint Conference on Artificial Intelligence (IJCAI 2017) in Melbourne, the world’s pre-eminent gathering of top experts in artificial intelligence (AI) and robotics. Walsh is a member of the IJCAI 2017 conference committee.

The open letter is the first time that AI and robotics companies have taken a joint stance on the issue. Previously, only a single company, Canada’s Clearpath Robotics, had formally called for a ban on lethal autonomous weapons.

In December 2016, 123 member nations of the UN’s Review Conference of the Convention on Conventional Weapons unanimously agreed to begin formal discussions on autonomous weapons. Of these, 19 have already called for an outright ban.

“Lethal autonomous weapons threaten to become the third revolution in warfare,” the letter states. “Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend.

“These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close,” it states, concluding with an urgent plea for the UN “to find a way to protect us all from these dangers.”

Signatories of the 2017 letter include:

  • Elon Musk, founder of Tesla, SpaceX and OpenAI (USA)
  • Mustafa Suleyman, founder and Head of Applied AI at Google’s DeepMind (UK)
  • Esben Østergaard, founder & CTO of Universal Robotics (Denmark)
  • Jerome Monceaux, founder of Aldebaran Robotics, makers of Nao and Pepper robots (France)
  • Jürgen Schmidhuber, leading deep learning expert and founder of Nnaisense (Switzerland)
  • Yoshua Bengio, leading deep learning expert and founder of Element AI (Canada)

Their companies employ tens of thousands of researchers, roboticists and engineers, are worth billions of dollars and cover the globe from North to South, East to West: Australia, Canada, China, Czech Republic, Denmark, Estonia, Finland, France, Germany, Iceland, India, Ireland, Italy, Japan, Mexico, Netherlands, Norway, Poland, Russia, Singapore, South Africa, Spain, Switzerland, UK, United Arab Emirates and USA.

Walsh is one of the organisers of the 2017 letter, as well as an earlier letter released in 2015 at the IJCAI conference in Buenos Aires, which warned of the dangers of autonomous weapons. The 2015 letter was signed by thousands of researchers in AI and robotics working in universities and research labs around the world, and was endorsed by British physicist Stephen Hawking, Apple co-founder Steve Wozniak and cognitive scientist Noam Chomsky, among others.

“Nearly every technology can be used for good and bad, and artificial intelligence is no different,” said Walsh. “It can help tackle many of the pressing problems facing society today: inequality and poverty, the challenges posed by climate change and the ongoing global financial crisis. However, the same technology can also be used in autonomous weapons to industrialise war.

“We need to make decisions today choosing which of these futures we want. I strongly support the call by many humanitarian and other organisations for a UN ban on such weapons, similar to bans on chemical and other weapons,” he added.

“Two years ago at this same conference, we released an open letter signed by thousands of researchers working in AI and robotics calling for such a ban. This helped push this issue up the agenda at the United Nations and begin formal talks. I am hopeful that this new letter, adding the support of the AI and robotics industry, will add urgency to the discussions at the UN that should have started today.”

“The number of prominent companies and individuals who have signed this letter reinforces our warning that this is not a hypothetical scenario, but a very real, very pressing concern which needs immediate action,” said Ryan Gariepy, founder & CTO of Clearpath Robotics, who was the first to sign.

“We should not lose sight of the fact that, unlike other potential manifestations of AI which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability,” he added. “The development of lethal autonomous weapons systems is unwise, unethical and should be banned on an international scale.”

Yoshua Bengio, founder of Element AI and a leading ‘deep learning’ expert, said: “I signed the open letter because the use of AI in autonomous weapons hurts my sense of ethics, would be likely to lead to a very dangerous escalation, because it would hurt the further development of AI’s good applications, and because it is a matter that needs to be handled by the international community, similarly to what has been done in the past for some other morally wrong weapons (biological, chemical, nuclear).”

Stuart Russell, founder and Vice-President of Bayesian Logic, agreed: “Unless people want to see new weapons of mass destruction – in the form of vast swarms of lethal microdrones – spreading around the world, it’s imperative to step up and support the United Nations’ efforts to create a treaty banning lethal autonomous weapons. This is vital for national and international security.”

BACKGROUND

The International Joint Conference on Artificial Intelligence (IJCAI) is the world’s leading conference on artificial intelligence. It has been held every two years since 1969, and annually since 2015. It attracts around 2,000 of the best researchers working in AI from around the world. IJCAI 2017 is currently being held in Melbourne, Australia.

Two years ago, at IJCAI 2015, more than 1,000 AI researchers released an open letter calling for a ban on lethal autonomous weapons. Signatories to this letter have now grown to over 17,000.

As part of Melbourne’s Festival of Artificial Intelligence, there will be a public panel on Wednesday 23 August, 5.30 to 7.00pm, entitled ‘Killer robots: The end of war?’. The panel features Stuart Russell, Ugo Pagallo and Toby Walsh. This is part of AI Lounge, a conversation about artificial intelligence open to the public and media every night from 21 to 25 August 2017 (see http://tinyurl.com/ailounge).

Toby Walsh’s new book, It’s Alive!: Artificial Intelligence from the Logic Piano to Killer Robots, just published by Black Inc, covers the arguments for and against lethal autonomous weapons in detail.
