Chandrayan-2 Mission

The Technology & Economic Forum is a venue to discuss issues pertaining to Technological and Economic developments in India. We request members to kindly stay within the mandate of this forum and keep their exchanges of views, on a civilised level, however vehemently any disagreement may be felt. All feedback regarding forum usage may be sent to the moderators using the Feedback Form or by clicking the Report Post Icon in any objectionable post for proper action. Please note that the views expressed by the Members and Moderators on these discussion boards are that of the individuals only and do not reflect the official policy or view of the Bharat-Rakshak.com Website. Copyright Violation is strictly prohibited and may result in revocation of your posting rights - please read the FAQ for full details. Users must also abide by the Forum Guidelines at all times.
la.khan
BRFite
Posts: 468
Joined: 15 Aug 2016 05:02

Re: Chandrayan-2 Mission

Post by la.khan »

chetak wrote:twitter

This zoom-in of the #Chandrayaan2 lander doppler observations show that the start of the fine braking phase seems to have gone according to plan (at 20:18:20 UTC), and something unexpected happened 15 seconds into the fine braking phase.
Source: https://github.com/tammojan/satellite_a ... ding.ipynb
Is that Github link pointing to snippets of source code of software that runs/controls CY2? It was coded in Python? Wow, just wow! :eek:
srin
BRF Oldie
Posts: 2522
Joined: 11 Aug 2016 06:13

Re: Chandrayan-2 Mission

Post by srin »

Mort Walker wrote:Time to close this thread as images aren’t going to come anytime soon. Vikram is finished and better to move along to failure analysis. Get testing done right whatever the cost. Next big event will have three Vyomnauts go into space for a week and return. The space capsule and all subsystems must be fail proof or people will die.
There is much more to Chandrayaan 2 than just the Vikram lander. It has dual band SAR, a very high res camera and other scientific payloads.

My request is to keep the thread open.
Mort Walker
BRF Oldie
Posts: 10040
Joined: 31 May 2004 11:31
Location: The rings around Uranus.

Re: Chandrayan-2 Mission

Post by Mort Walker »

srin wrote:
Mort Walker wrote:Time to close this thread as images aren’t going to come anytime soon. Vikram is finished and better to move along to failure analysis. Get testing done right whatever the cost. Next big event will have three Vyomnauts go into space for a week and return. The space capsule and all subsystems must be fail proof or people will die.
There is much more to Chandrayaan 2 than just the Vikram lander. It has dual band SAR, a very high res camera and other scientific payloads.

My request is to keep the thread open.
Information will come out very slowly, and it is better to roll this thread into other ISRO discussions.
neerajb
BRFite
Posts: 853
Joined: 24 Jun 2008 14:18
Location: Delhi, India.

Re: Chandrayan-2 Mission

Post by neerajb »

la.khan wrote:
chetak wrote:twitter

Is that Github link pointing to snippets of source code of software that runs/controls CY2? It was coded in Python? Wow, just wow! :eek:
It is simply reading data from some raw files and generating the plots. Python is being used here for visualization and not to control CY2.
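For what it's worth, the core of that kind of analysis is only a few lines of NumPy: convert the received carrier's frequency offset into a line-of-sight velocity. Everything below (carrier frequency, sample values) is an illustrative assumption of mine, not taken from the actual notebook:

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s
F0 = 2.2e9          # assumed S-band downlink carrier, Hz (illustrative)

def doppler_to_velocity(observed_hz, carrier_hz=F0):
    """Convert an observed carrier frequency into line-of-sight velocity.

    Positive result = spacecraft receding from the ground station.
    """
    offset = observed_hz - carrier_hz
    return -offset * C / carrier_hz

# Synthetic observations: carrier drifting roughly 10 kHz below nominal
obs = np.array([F0 - 10_000.0, F0 - 10_500.0, F0 - 11_000.0])
los_velocity = doppler_to_velocity(obs)
print(los_velocity)   # roughly [1363, 1431, 1499] m/s, receding
```

The sign convention (negative offset = receding) is one common choice; the real notebook may use the opposite.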

If you are looking to Python for real-time high performance, you are in for trouble. Python is good for batch-type workloads, has extensive package support, and is easy and flexible to use.

Coming back to performance, C/C++ beats Python and Java hands down. If you can code in assembly, nothing is better: extremely fast and compact.

Cheers....
la.khan
BRFite
Posts: 468
Joined: 15 Aug 2016 05:02

Re: Chandrayan-2 Mission

Post by la.khan »

neerajb wrote: It is simply reading data from some raw files and generating the plots. Python is being used here for visualization and not to control CY2.
...
Coming back to performance, C/C++ beats Python and java hands down. If you can code in assembly then nothing better than that. Extremely fast and compact.
I was expecting everything to be written in either assembly language or C. I know neither :oops: Was excited to see Python as I learnt a little. Thanks for the gyan!
prasannasimha
Forum Moderator
Posts: 1214
Joined: 15 Aug 2016 00:22

Re: Chandrayan-2 Mission

Post by prasannasimha »

The main mission, which is the orbiter, is still going on, and data as usual will be released in two tranches. The first release is for ISRO and Indian researchers. After 6 months the data is released for any scientific use, after registration with ISRO. Both MOM and ASTROSAT data are being released in that format. There is a reason it is being done that way; let's not go into that further.
It is better to keep Chandrayaan 2 mission data here, collated in one place.
SSSalvi
BRFite
Posts: 785
Joined: 23 Jan 2007 19:35
Location: Hyderabad

Re: Chandrayan-2 Mission

Post by SSSalvi »

^
^
After thorough testing & confirmation of the logic in C etc., the final code is converted to manually written machine code embedded in space-qualified ROMs/PLAs.

^
Let the thread be live .. why lock?
ArjunPandit
BRF Oldie
Posts: 4056
Joined: 29 Mar 2017 06:37

Re: Chandrayan-2 Mission

Post by ArjunPandit »

neerajb wrote: Coming back to performance, C/C++ beats Python and java hands down. If you can code in assembly then nothing better than that. Extremely fast and compact.
Execution speed is quite often just one aspect; for rapid prototyping and debugging, C/C++/ASM are a pain. For coding and testing purposes I have seen Python being used more and more often, and you never know whether people might have used it for prototyping. I know DRDO/ISRO extensively use C++, so I am surprised to see Python.
UlanBatori
BRF Oldie
Posts: 14045
Joined: 11 Aug 2016 06:14

Re: Chandrayan-2 Mission

Post by UlanBatori »

Still too soon to give up but it looks more and more like there is much more wrong than simply falling on one side. May be time to explore getting the next one launched, direct to the Moon with just a V-Kram2Lite along with the Jaspreet. If it were up to me I would remove the Pragyan for now, to save weight, time and complexity. A couple of instrumented balls with cellphones inside can be rolled out if everything works, and will cover more ground in a few seconds, than the Pragyan would in several days. And send back nice videos in color straight to your WHATSApp. Put a D-Link or CISCO router on the V-Kram2Lite. Forget the test-tube wallahs.

Also pooch 2 the RAAkit-Agints:

AAR-Biter-1 went up using a complicated 3- or 4-stage AAR-bite boost. Per my madarssa fatwas, if u break one Man-bin-Hoh Transfer into infinite leettil ones, the delta-V required goes up by factor of 2. So, by madarssa inter-polation, 3 or 4 is >>> and closer to infinity. So! Delta-Vee must be higher than the minimum. I think they broke it into 4 to reduce risk ("Indecision is the key to Flexibility, i think." etc).
Also, no need to delay for 2 weeks etc. Time is a-wasting.

IOW, maybe this shot should be a simple 2-Hohman-Transfer from the GTO booster into lunar capture orbit, followed by any needed plane-change and very direct slowdown to the surface point. Any point where comm can be established with the AAR-Biter-1. Sounds so E-Z when I type it, but I think ppl with STK running on their abacus should be able to figure out the optimum down to a sign error, in no time at all.
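The vis-viva arithmetic behind a two-burn Hohmann transfer is simple enough to sanity-check on the abacus. A minimal sketch, assuming an idealised two-body, coplanar transfer between circular orbits (a 300 km parking orbit to lunar distance around Earth), which is not what a real GTO mission flies:

```python
import math

MU_EARTH = 3.986004418e14   # Earth's GM, m^3/s^2
R_LEO = 6_378e3 + 300e3     # 300 km circular parking orbit radius, m
R_MOON_ORBIT = 384_400e3    # mean Earth-Moon distance, m

def hohmann_dv(mu, r1, r2):
    """Total delta-V for a two-impulse Hohmann transfer from r1 to r2."""
    a = (r1 + r2) / 2                          # transfer-ellipse semi-major axis
    v1 = math.sqrt(mu / r1)                    # circular speed at r1
    v2 = math.sqrt(mu / r2)                    # circular speed at r2
    v_dep = math.sqrt(mu * (2 / r1 - 1 / a))   # transfer speed at departure burn
    v_arr = math.sqrt(mu * (2 / r2 - 1 / a))   # transfer speed at arrival burn
    return (v_dep - v1) + (v2 - v_arr)

dv = hohmann_dv(MU_EARTH, R_LEO, R_MOON_ORBIT)
print(f"{dv:.0f} m/s")   # about 3900 m/s for this idealised two-burn case
```

Splitting the departure burn into several perigee-raising burns, as Chandrayaan-2 did, does not double this figure; successive tangential burns at the same perigee add delta-V almost exactly like one big burn.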

I have heard that the Team Indus in Loonar X-Prize was all set to send a fairly significant payload to the Moon using a PSLV. Any hope to use the PSLV for this, perhaps?
Last edited by UlanBatori on 16 Sep 2019 21:29, edited 1 time in total.
Haridas
BRFite
Posts: 881
Joined: 26 Dec 2017 07:53

Re: Chandrayan-2 Mission

Post by Haridas »

SSSalvi wrote:After thorough testing & confirmation of logic in C etc, final coding is converted to manually written machine code embedded in Space qualified ROMs/PLAs.
Wow, hearing about PLAs after so many decades. I had almost forgotten their existence.

I built my redundant INS & flight control system at IIT Dilli, where everything was burnt onto ROMs.
Ravi Karumanchiri
BRFite
Posts: 723
Joined: 19 Oct 2009 06:40
Location: www.ravikarumanchiri.com
Contact:

Re: Chandrayan-2 Mission

Post by Ravi Karumanchiri »

^^^^^^^^^^^
Python supports Computer Vision. The entire mission relied on imagery scanned by the orbiter and even by the lander on its way down. 'Crunching' that in real-time is a feat for Python onleee.

With Python, robotics and other factory-automation solutions are more easily paired: when everything is driven by CV and carefully placed 'registration' stickers, the machines can line up autonomously and pass parts between them, without programmers instructing the robots on how to do it (other than to scan for stickers and align orientations accordingly).

Onboard logic cued by 3D imaging referenced to a target map largely generated with fresh imagery while in orbit. Nobody is querying a database on a server farm on earth, all the way from the moon. Python onleeee.
Ashokk
BRFite
Posts: 1121
Joined: 11 Aug 2016 06:14

Re: Chandrayan-2 Mission

Post by Ashokk »

Nasa’s lunarcraft LRO will take images of Vikram lander today
NEW DELHI: US space agency Nasa will try to take images of Vikram lander, lying motionless on Moon’s surface, with the help of its Lunar Reconnaissance Orbiter (LRO) on Tuesday. The US lunarcraft, which has been circling Moon since 2009, will fly over the lander’s site.
The images from LRO will help the Indian Space Research Organisation (Isro) know the exact status of Chandrayaan-2’s lander and aid the Indian agency’s efforts to establish contact with it. Vikram had ‘hard-landed’ near the Moon’s south pole on September 7 during the 15-minute final descent.
LRO was also close enough during Vikram’s landing attempt to gather data about the manoeuvre using its Lyman Alpha Mapping Project (LAMP) instrument, according to space.com. Reacting to the LRO being used to spot Vikram, the PMO tweeted, “Remembering the ISRO spirit.”
Besides the use of LRO, Nasa has been using its deep space network ground stations to send hello messages to the Vikram lander. “Yes, NASA/JPL is trying to contact Vikram through its deep space network (DSN) as contractually agreed with Isro,” a source in Nasa had confirmed to TOI.
Meanwhile, two foreign astronomers who have been keeping a constant watch on the orbiter claimed that its altitude has come down to around 90km from the 100km circular orbit. Astronomer Edgar Kaiser tweeted, “The left picture (showing a graphic) shows Chandrayaan_2’s 90 km orbit around the current full moon. Note how close to the surface the trajectory is in relation to moon’s diameter. In a week we will have the right picture. Obviously this orbit was chosen to keep the spacecraft in constant sunlight.”
Another astronomer Scott Tilley, who earned fame for finding a Nasa spy satellite that got lost in space in 2005, also tweeted, “As we gear up for observations tonight, it appears Chandrayaan2 may have made a slight orbital adjustment lower its nominal altitude to 92km.” TOI’s repeated attempts to contact the Isro chairman’s office to know about the orbiter’s altitude have failed.

LRO is equipped with a high-resolution camera with which it was able to take images of the Apollo landing sites with enough clarity to pinpoint the astronauts’ four-decade-old footprints. The same camera will be used in Tuesday’s attempt to capture images of Vikram. The LRO maps the Moon and looks for resources that could be valuable for future manned missions there.
Earlier, Isro had taken the help of Nasa’s deep space network to track the journey of the Chandrayaan-2 integrated module while it was on its way to the Moon from Earth’s elliptical orbit. Nasa’s JPL has DSN ground stations in three places: Goldstone, California (US), Madrid (Spain) and Canberra (Australia). The three stations are located 120 degrees apart on Earth to ensure that any satellite in deep space can communicate with at least one station at all times.
UlanBatori
BRF Oldie
Posts: 14045
Joined: 11 Aug 2016 06:14

Re: Chandrayan-2 Mission

Post by UlanBatori »

^^ (CV discussion)

CV is what I wonder about. Did it miss a tall sharp range/ crater rim that happened to be in the way to the nice flat landing site? The first reported fatal Tesla crash occurred when the CV on the $100K Tesla moving at maybe 30m/s on flat land, mistook a nice white (or blue or gray?) side of an 18-wheeler cutting across the road, for the broad wild White/Blue/Gray Yonder. Didn't even slow down, Driver was decapitated, IIRC. And that was in terrain where there was more light, far more color diversity, and much closer range, and cameras with not much weight restriction, and huge bandwidth, than the imaging thingy on the RAA-Biter-1 looking down from many kilometers above a dry dreary desert in very dim pre-dawn light, while zipping by at a couple of km/second (no idea what is orbital speed so close to the Moon, sorry, but I bet it is over a few hundred m/s?)
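On the orbital-speed question in that paragraph: vis-viva answers it directly. A quick two-body estimate, using the mean lunar radius (a sketch, not mission data):

```python
import math

MU_MOON = 4.9048695e12   # lunar GM, m^3/s^2
R_MOON = 1_737.4e3       # mean lunar radius, m

def circular_speed(mu, r):
    """Circular-orbit speed at radius r (vis-viva with a = r)."""
    return math.sqrt(mu / r)

v100 = circular_speed(MU_MOON, R_MOON + 100e3)
print(f"{v100:.0f} m/s")   # about 1630 m/s, i.e. well over a km/s
```

So the "couple of km/second" guess is about right: roughly 1.6 km/s in a 100 km circular lunar orbit.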

Think about it. If coupled with some error on the altitude-calculator, that becomes fatal. Best explanation for Sudden Loss Of Signal (SLOS).
ramana
Forum Moderator
Posts: 59799
Joined: 01 Jan 1970 05:30

Re: Chandrayan-2 Mission

Post by ramana »

Mort Walker wrote:
srin wrote:
There is much more to Chandrayaan 2 than just the Vikram lander. It has dual band SAR, a very high res camera and other scientific payloads.

My request is to keep the thread open.
It will be very slow for information to come out and better to roll this thread into other ISRO discussions.

At least till the FAC report comes in, we will keep this thread open.
neerajb
BRFite
Posts: 853
Joined: 24 Jun 2008 14:18
Location: Delhi, India.

Re: Chandrayan-2 Mission

Post by neerajb »

Too much importance is being given to Python, which I believe is just another programming language, and a relatively lousy one. All the ML/DL it is being credited with is actually the handiwork of frameworks like TensorFlow, which has a C++ core and exposes Python APIs so that less technical people can do training/inference with minimal background. Python is easy to use, popular and the in thing nowadays; otherwise I don't see a reason why the same can't be done in C++ or other languages much more efficiently and fast. CNNs, which are used extensively in image classification, are comparatively fast. I don't know what exactly ISRO is using, but I would any day choose native APIs instead of Python wrappers to get as much speed as possible. If processing power is limited and extreme speed is needed, I would rather write my own custom CNN in C++ (or assembly, if not too complex).

It's like slapping an F414 (GPU) onto an SH (Python) to try to match the performance of, say, a MiG-29K (C/C++).

Cheers....
Last edited by neerajb on 16 Sep 2019 23:56, edited 1 time in total.
Ravi Karumanchiri
BRFite
Posts: 723
Joined: 19 Oct 2009 06:40
Location: www.ravikarumanchiri.com
Contact:

Re: Chandrayan-2 Mission

Post by Ravi Karumanchiri »

Is there any way to calculate the surface effects of those four powerful 800 N engines on the way down? (Or of the single, center one, at the last moment?) Is it possible that they kicked up enough lunar dust to confound the imaging and insert errors into the calculations? Could a rising dust cloud confuse altitude calculations? Could a bad altimeter reading, or a loss of map correlation, occur because of kicked-up lunar dust? How far out would the engine wash produce enough force to kick up lunar dust? Could the imaging SAR and optics see through this? Could it have led to an odd landing, perhaps on top of a pointy rock that wasn't conducive to landing, hence the tipped-over orientation?

Also: what would happen if the "surface" to be landed on was not hard at all, but a deep pocket of fine, loose dust, which would be blasted away by the rocket motors so that the lander effectively dug in (perhaps unevenly, which might explain the leaning angle)...

I seem to remember another lander design by some other space agency: when the lander reached a certain very low altitude, a number of gas generators would fire and inflate a whole cluster of airbags, as if encasing the lander in a bunch of balloons that would cushion and absorb the initial impact. After touchdown there was much bouncing until the lander came to a stop. Controlled or sequential deflation of the bags then carefully set the lander down, with all of the kinetic energy from landing completely zeroed out, in terms of mission risk.


FWIW.....
What programming languages are used at ISRO?
Vayutuvan
BRF Oldie
Posts: 12083
Joined: 20 Jun 2011 04:36

Re: Chandrayan-2 Mission

Post by Vayutuvan »

Ravi Karumanchiri wrote:^^^^^^^^^^^
Python supports Computer Vision. The entire mission relied on imagery scanned by the orbiter and even by the lander on it's way down. 'Crunching' that in real-time is a feat for Python onleee.
Ravi garu, what do you mean by "support"? Any Turing Complete language "supports" CV. In fact, it can support the computation of any general recursive function. Python is a Turing complete language.

That is a very weak condition in that it doesn't say anything about the asymptotic time complexity.

Assuming the same algorithm is implemented once in Python and once in, say C (even if non-optimized), the latter would beat the former handily due to interpretation overheads which manifest as constants hidden in the O/Omega/Theta notation.
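That constant-factor argument can be demonstrated without leaving Python: the same O(n) reduction, once run through the interpreter and once delegated to NumPy's C internals, returns the identical answer at very different speeds. A toy benchmark, not a claim about any flight code:

```python
import numpy as np
import time

def sum_interpreted(xs):
    """O(n) reduction with every addition dispatched by the interpreter."""
    total = 0.0
    for x in xs:
        total += x
    return total

data = np.arange(1_000_000, dtype=np.float64)

t0 = time.perf_counter(); s1 = sum_interpreted(data); t1 = time.perf_counter()
t2 = time.perf_counter(); s2 = float(data.sum());      t3 = time.perf_counter()

# Identical algorithm, identical answer (exact here, since every partial sum
# is an integer below 2**53), wildly different constant factors.
assert s1 == s2
print(f"interpreted: {t1 - t0:.4f}s   C-backed: {t3 - t2:.5f}s")
```

Typically the C-backed reduction wins by two orders of magnitude or more on this size of input, with the exact ratio depending on the machine.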

Python merely does the job as glue code. It is marginally better than bash/sh/csh.

Most of the number crunching work is done in libraries written in C or assembly or hardware implementations in PLAs. FPLAs might be used if some amount of model-based arithmetic coding for compression is used.
Last edited by Vayutuvan on 17 Sep 2019 00:47, edited 1 time in total.
ArjunPandit
BRF Oldie
Posts: 4056
Joined: 29 Mar 2017 06:37

Re: Chandrayan-2 Mission

Post by ArjunPandit »

Ravi Karumanchiri wrote:^^^^^^^^^^^
Python supports Computer Vision. The entire mission relied on imagery scanned by the orbiter and even by the lander on it's way down. 'Crunching' that in real-time is a feat for Python onleee.

With Python, robotics and other factory automation solutions are more easily paired, when everything is driven by CV and carefully placed 'registration' stickers, the machines can line-up autonomously and pass parts between them, sans computer programmers instructing the robots on how to do it (other than scan for stickers and align orientations accordingly).

Onboard logic cued by 3D imaging referenced to a target map largely generated with fresh imagery while in orbit. Nobody is querying a database on a server farm on earth, all the way from the moon. Python onleeee.
Could be... Python is unparalleled when it comes to available libraries, but if it's real time I have my doubts. I have not seen many FS using Python for real-time pricing or other applications; C++/Java rule in that world. Hopefully an institutional organization like ISRO has a host of libraries for this purpose...
ArjunPandit
BRF Oldie
Posts: 4056
Joined: 29 Mar 2017 06:37

Re: Chandrayan-2 Mission

Post by ArjunPandit »

Ravi Karumanchiri wrote: FWIW.....
What programing languages are used at ISRO?
Quora doesn't open in the office... can anyone post it for the lesser mortals...
ArjunPandit
BRF Oldie
Posts: 4056
Joined: 29 Mar 2017 06:37

Re: Chandrayan-2 Mission

Post by ArjunPandit »

Coming to the computer vision part: is it the autonomous lander? Are you sure, sir? I would not bet that anything would be CV-driven in the landing stage... or am I missing something? I do remember the autonomous part of the lander, but would assume it would be sensor-driven.
I would rate it as a very risky proposition to land for the first time using computer vision instead of mission control or other sensors. It's not even that far from Earth that they need to rely on CV. CV has a huge element of predictiveness built in, and I doubt ISRO has the data to predict in those conditions. Even in broad daylight, Google's self-driving-car moonshot couldn't distinguish between a pebble and a bird at a distance (OK, it's been 5+ years), and here we are talking about far faster speeds and different light and atmospheric conditions (no air etc. on the Moon).
Ramnath
BRFite -Trainee
Posts: 5
Joined: 28 Feb 2019 09:41

Re: Chandrayan-2 Mission

Post by Ramnath »

I do not claim to be even a half-decent coder. But glancing through the GitHub code snippets, I don't see any indication that the code is what runs on CY-2. Seems more to be data analysis for the radio telescope data. Can someone point me to which section they think is CY-2 code?
Ravi Karumanchiri
BRFite
Posts: 723
Joined: 19 Oct 2009 06:40
Location: www.ravikarumanchiri.com
Contact:

Re: Chandrayan-2 Mission

Post by Ravi Karumanchiri »

FROM QUORA (linked above), UNEDITED.................................. (just sayin')


What programming languages are used at ISRO?
4 Answers
Prateek Sharma
Prateek Sharma, Scientist at Indian Space Research Organisation (2014-present)
Updated Aug 31, 2017 · Upvoted by Anuj Jagtap, works at Indian Space Research Organisation and Amit Upadhyay, former Scientist at Indian Space Research Organisation (2012-2018)
Originally Answered: what kind of programming languages are used in ISRO ?

It depends upon in which ISRO centre/area/group/division you are working on . ISRO is a very large organization and it has 16 centres all over India.

As far as my experience in ISRO is concerned, I am working in Atmospheric and Oceanic Sciences group in Space Applications Centre, Ahmedabad, ISRO. Here people are working mostly in

FORTRAN : FORTRAN is the most popular language in our group because of its high speed and efficiency. This language is made specifically for numerical computation. Almost all Numerical Weather Prediction models have been written in FORTRAN.
MATLAB : Matlab is being used extensively for graphics and plotting purposes. But it is commercial software, and people are encouraged to use free and open-source software in order to cut unnecessary expenses.
Python : Being a general-purpose and versatile language, Python is gaining more popularity than any other language in our community. It can be effectively utilized in post-processing of satellite data. I work mostly in Python.

However in other departments, people also use C, C++, Java, Javascript, Perl etc.

The best thing about working in ISRO is, you are not bound to learn any specific language. You are free to do work in any language in which are comfortable.

If you can produce the desired output and results, the language doesn't matter.

My advice :

Learn Python; it has the fastest growing community in the world. Other languages have limited scope, made for specific fields: FORTRAN for meteorology, C/C++ for software design, Java for web design etc. Python is like an all-rounder; it can easily fit in all fields. Its potential is limitless.

Whether you join ISRO or not, learning Python will help you in all your future endeavors as a programmer. Even Quora is written in Python.

PS
Tirtha Chakrabarti (তীর্থ চক্রবর্তী)
Tirtha Chakrabarti (তীর্থ চক্রবর্তী), studied Theoretical Physics
Answered Apr 4, 2017 · Upvoted by Siddani Srinivasa Rao, Scientist/Engineer-sc at Indian Space Research Organisation (2017-present) and Manika Nagpal, former SRFP fellow (Indian Academy of Sciences) at Indian Space Research Organisation (2016) · Author has 693 answers and 1.5m answer views

ISRO needs and develops software to run computer networks, systems and simulations. Apart from the various software/applications developed at centers like the Space Applications Centre, ISRO has a network of centers and ground stations managed by ISTRAC. ISRO uses both Linux- and Windows-based platforms and develops for both too. Java is the favored language, but C, C++, C#, Python, Perl etc. are also used; whatever gets the work done.

Software is also required for running various systems. In satellites, different programming languages are used at different levels depending on the use of the processor. In many places, basic languages like assembly and C++ along with FORTRAN are used for mechanical movements (these were used in the Mars orbiter, for example). For data handling, FPGAs are widely used in different places (programmed in VHDL). For the Attitude and Orbit Control System, Ada can be used (related to PASCAL and other languages). For nanosatellites, C, Lua and Python can be used.

ISRO has to use software for the supercomputer simulations and centers do it as per requirements.
Virat Puar
Virat Puar, Love to give instructions to machines
Answered Sep 7, 2015
Originally Answered: what kind of programming languages are used in ISRO ?
Well, ISRO and other space agencies like NASA use specialized programming languages for satellites. They mostly use Ada, C and C++. The Guide for the Use of the Ada Programming Language in High Integrity Systems (such as satellites) and the ESA Ada Coding Standard are available on ESA - Software engineering and standardisation - Coding Languages. The C language is used for payloads, for digital signal processing software and for small instruments.
Pankaj Jahagirdar
Pankaj Jahagirdar, Computer Engineering Student
Answered Jan 16, 2017 · Author has 217 answers and 339.9k answer views
Originally Answered: what kind of programming languages are used in ISRO ?

Refer to this blog post:

http://blog.hackerearth.com/2014...
Vayutuvan
BRF Oldie
Posts: 12083
Joined: 20 Jun 2011 04:36

Re: Chandrayan-2 Mission

Post by Vayutuvan »

neerajb wrote:... Tensorflow ...
Most of the low-level code happens to be Intel MKL (Math Kernel Library) if Intel processors are used. Other vendors have their own LAPACK/BLAS libraries. Most image processing algorithms are linear transformations using small dense matrices. They are embarrassingly parallel.
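One concrete instance of that "small dense matrix" structure is a colour-space reduction: a single small dot product per pixel, each pixel independent of the rest. The BT.601 luma weights below are real; the image is synthetic, and this is only an illustration of the structure, not of any flight software:

```python
import numpy as np

# BT.601 luma weights: one small dense dot product per pixel,
# computed independently for every pixel (embarrassingly parallel).
LUMA = np.array([0.299, 0.587, 0.114])

def to_grayscale(rgb):
    """rgb: (H, W, 3) image -> (H, W) luma image in one einsum call."""
    return np.einsum('hwc,c->hw', rgb, LUMA)

img = np.ones((4, 4, 3)) * [100.0, 50.0, 25.0]   # synthetic flat-colour image
gray = to_grayscale(img)
print(gray[0, 0])   # 100*0.299 + 50*0.587 + 25*0.114 = 62.1 (approx)
```

Because every output pixel depends only on its own input pixel, the whole map parallelises trivially across cores or SIMD lanes, which is exactly why BLAS-style libraries eat it up.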

Tensorflow is a resource hog, IMHO.

At the moment the interpreted language with good language design (orthogonality, abstraction, syntax, and semantics) is Julia. It also uses JIT compilation (via the LLVM compiler infrastructure). The only problem with LLVM at the moment is that its compiler optimization is not as good as ICC (the Intel C compiler), MSVC, or GCC. The best optimized code for Intel/AMD is produced by ICC; MSVC is not too far behind; GCC is dead last.

Added later: the Intel compiler is based on the KCC compiler developed by Kuck and Associates, Inc. (KAI). KAI was bought by Intel.

Kuck was a professor at Urbana; the LLVM team is from there as well.
Last edited by Vayutuvan on 17 Sep 2019 01:20, edited 1 time in total.
Vayutuvan
BRF Oldie
Posts: 12083
Joined: 20 Jun 2011 04:36

Re: Chandrayan-2 Mission

Post by Vayutuvan »

Ravi Karumanchiri wrote:... a number of gas generators would fire and inflate a whole cluster of air bags; as if encasing the lander in a bunch of balloons that would cushion and absorb the initial impact.
What is the material the balloons are made of? I bet some very fancy composite metamaterial is required. Remember, the Moon doesn't have any atmosphere, so there won't be any reaction force, other than the internal forces of the balloon material itself, to stop the balloons expanding until the membrane gives out.
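The force balance in question is just Laplace's law for a thin spherical membrane; in vacuum the membrane carries the whole inflation pressure. The bag size and pressure below are made-up illustrative figures, not from any actual airbag lander:

```python
def membrane_tension(delta_p_pa, radius_m):
    """Thin spherical shell: tension T (N/m) balancing pressure dP, T = dP*r/2.

    Force balance across a great circle: dP * pi*r**2 = T * 2*pi*r.
    """
    return delta_p_pa * radius_m / 2

# Illustrative airbag: 0.5 m radius, inflated to 10 kPa against vacuum
T = membrane_tension(10_000.0, 0.5)
print(f"{T:.0f} N/m")   # 2500 N/m, carried entirely by the membrane
```

That is a large but ordinary load for high-strength woven fabrics; the same balance holds with or without an external atmosphere, the ambient pressure merely reduces delta_p.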
UlanBatori
BRF Oldie
Posts: 14045
Joined: 11 Aug 2016 06:14

Re: Chandrayan-2 Mission

Post by UlanBatori »

other than the internal forces of the balloon material itself - to stop the ballons expanding till the membrane gives out.
Mars surface pressure is like that at 33 km altitude above Earth; not much pressure there either, maybe 0.01 of Earth sea level. Balloons fly there quite happily. In fact, in the 1960s, Echo 1 and 2 (IIRC) were mylar balloons placed in LEO.
UlanBatori
BRF Oldie
Posts: 14045
Joined: 11 Aug 2016 06:14

Re: Chandrayan-2 Mission

Post by UlanBatori »

ArjunPandit wrote:coming to the computer vision part, is it the the autonomous lander? are you sure ? I would not bet that anything would be CV driven in the landing stage...or am i missing something...i do remember the autonomous part of lander..but would assume it would be sensor driven
i would rate that as a very risky proposition to land an orbiter for the first time using Computer vision, instead of mission controller or other sensors ..its not even that far from earth so that they need to rely on CV...CV has a huge element of predictiveness built in. I doubt ISRO has that data to predict it in those conditions. Even in broad daylight google's moonshot competition for a self drive car coudlnt distinguish between a pebble and a bird at distance (ok its been 5+years)..and here we are talking about far faster speed and in different light,atmospheric (no air etc on moon).
At or approaching the hover stage, there is "pattern matching" to determine if they are at the right place and whether it is safe to put down. That surely appears to be CV, hain? No time for relaying image back and having a Blue Ribbon Commishun with Retired High Court Judges decide etc. The computer decides. Also someone here declared that the entire landing etc is TOTALLY automated, so it must depend on pre-loaded imagery if it is going to compare.

Plus, there was an upload of trajectory about 1 hour or so before landing, which is what I worry about. That was based on analysis of imagery, wasn't it?

People talk about the final landing stage, but the loss of comms occurred while there was still significant horizontal speed. My question is, what if the Height Above Ground Level was not really what they thought it was? Ever seen a cartoon which shows a flight crew commenting:
Hey! what's Ayesha the Mountain Goat doing, way up in this cloudbank?
Not funny I know, with all these Serious Anti-Pingreji Experts here racing each other to Report poor me, but the above is precisely what would explain the sudden loss of comms "way up in this cloudbank".
UlanBatori
BRF Oldie
Posts: 14045
Joined: 11 Aug 2016 06:14

Re: Chandrayan-2 Mission

Post by UlanBatori »

To extend that nightmare: with the sun so shallow, the shadows are VERY long. There are basically two colors in the images: bright (sunny) and dark (shade). How do you distinguish a cliff 500m high with an 85 degree slope on the sunny side and a gentle slope on the shadow side, from a ledge 50m high with a much shallower slope on the sunny side? You might "approve" an approach from either side with drastic consequences. I am sure they thought of such trivia but obviously SOMETHING went wrong...
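The ambiguity above can be put in numbers with a simple Lambertian shading model: apparent brightness depends on both albedo and local slope, so very different terrains can return the same pixel value. The albedo and slope figures here are illustrative assumptions, not lunar data:

```python
import math

def lambert_brightness(albedo, sun_elev_deg, slope_toward_sun_deg):
    """Lambertian brightness of a surface tilted toward a low sun."""
    return albedo * math.sin(math.radians(sun_elev_deg + slope_toward_sun_deg))

# Under the same 6 degree sun: dark material on a 10 degree sunward slope
# versus brighter material lying dead flat.
b_slope = lambert_brightness(0.25, 6.0, 10.0)
b_flat = lambert_brightness(0.66, 6.0, 0.0)
print(round(b_slope, 4), round(b_flat, 4))   # both ~0.069: same pixel value
```

From brightness alone the two patches are indistinguishable, which is why slope-from-shading near the terminator needs extra cues (stereo, shadow length, laser altimetry).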
ramana
Forum Moderator
Posts: 59799
Joined: 01 Jan 1970 05:30

Re: Chandrayan-2 Mission

Post by ramana »

Folks, discussion of computer languages is not germane to this thread.

Thanks, ramana
NRao
BRF Oldie
Posts: 19236
Joined: 01 Jan 1970 05:30
Location: Illini Nation

Re: Chandrayan-2 Mission

Post by NRao »

ArjunPandit wrote:coming to the computer vision part, is it the autonomous lander? are you sure sir? I would not bet that anything would be CV-driven in the landing stage...or am i missing something...i do remember the autonomous part of the lander..but would assume it would be sensor-driven
i would rate that as a very risky proposition to land a lander for the first time using computer vision, instead of mission control or other sensors..it's not even so far from earth that they need to rely on CV...CV has a huge element of predictiveness built in. I doubt ISRO has the data to predict it in those conditions. Even in broad daylight google's moonshot competition for a self-drive car couldn't distinguish between a pebble and a bird at distance (ok it's been 5+ years)..and here we are talking about far faster speeds, different lighting, and different atmospheric conditions (no air etc. on the moon).
The Orbiter/Lander had made multiple passes and scanned the surface. IIRC, ground control had selected a primary and a secondary landing area. The point being they had a good "picture" of the area that the Lander was expected to land at. So, the pixels were loaded into some form of memory - I would imagine.
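A toy sketch of the kind of pixel comparison being guessed at here - purely my speculation, in pure Python, not ISRO flight code: slide a small stored reference patch over a freshly captured image and take the window with the lowest sum of squared differences.

```python
# Hypothetical illustration of matching a pre-loaded reference patch
# against a new image by sum of squared differences (SSD).

def ssd(patch, image, row, col):
    """SSD between the patch and the image window anchored at (row, col)."""
    return sum(
        (patch[r][c] - image[row + r][col + c]) ** 2
        for r in range(len(patch))
        for c in range(len(patch[0]))
    )

def best_match(patch, image):
    """Return (row, col) of the image window most similar to the patch."""
    ph, pw = len(patch), len(patch[0])
    ih, iw = len(image), len(image[0])
    return min(
        ((r, c) for r in range(ih - ph + 1) for c in range(iw - pw + 1)),
        key=lambda rc: ssd(patch, image, rc[0], rc[1]),
    )

image = [           # toy 4x5 brightness grid
    [0, 0, 0, 0, 0],
    [0, 9, 8, 0, 0],
    [0, 7, 9, 0, 0],
    [0, 0, 0, 0, 0],
]
patch = [[9, 8], [7, 9]]   # stored reference feature
print(best_match(patch, image))  # -> (1, 1)
```

A real system would use something far more robust (normalized correlation, feature descriptors), but the "compare stored pixels to what the camera sees" idea is the same.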
ArjunPandit
BRF Oldie
Posts: 4056
Joined: 29 Mar 2017 06:37

Re: Chandrayan-2 Mission

Post by ArjunPandit »

nraoji, thanks for pointing this out...serious question: wouldn't the resolution and camera be different given the different velocities? Automated operations of this kind would require at least 100s of images for automated verification...the slightest perturbation of the camera can make the matrix look completely different and result in huge variations, which an algo/model trained on insufficient data will reject as a match and will then take alternative or corrective actions...i agree they might have other images..but there would be too many variables at play. Pardon my ignorance if i am asking very stupid questions

ubji, yes the planned trajectory was loaded...and thinking from first principles..that would have been the outcome of some forecast..under given inputs..the inputs could be deterministic or stochastic....don't have much idea if space landings make use of intermediate stats e.g. confidence intervals..but assuming that computer vision would be perfect for such cases sounds like a giant leap of faith...based on what i have seen it takes quite a large no. of images to train even a cat/dog type of feature..obviously things didn't go as expected, but even if they had, with the current set of info i would still say it is a very high risk strategy. Not that i dislike it, but if it were up to me I would not take that approach under any circumstances..
also,
basic pooch ...why would the sun be shallow? location of the landing zone, landing time, or atmospheric effects?
also, ironically at the time of landing i was watching 'angel has fallen'
ArjunPandit
BRF Oldie
Posts: 4056
Joined: 29 Mar 2017 06:37

Re: Chandrayan-2 Mission

Post by ArjunPandit »

la.khan wrote:
chetak wrote:twitter

ramana guru this is ot, please delete if need be
Is that Github link pointing to snippets of source code of software that runs/controls CY2? It was coded in Python? Wow, just wow! :eek:
this to me seems to be by some enthusiast, if not a silly govt agent who forgot to delete things..e.g., see the wee analysis of the failed iran launch..

https://github.com/tammojan/satellite_analysis

i suspect one of the following
1. he/she is analyzing images downloaded from the internet during the CY2 launch..
2. specifically regarding CY2, it may be a chinese attempt to peer inside what went wrong; there are multiple references to chinese users in the associated code and libraries...
3. to me it seems whosoever is doing this is no amateur..but no intel guy either, coz it would be really stupid to leave everything public..
UlanBatori
BRF Oldie
Posts: 14045
Joined: 11 Aug 2016 06:14

Re: Chandrayan-2 Mission

Post by UlanBatori »

ArjunPandit wrote: basic pooch ...why would the sun be shallow? location of landing zone, landing time or the atmospheric effect
also, ironically at the time of landing i was watching 'angel has fallen'
Sun is always pretty shallow at 70.9 deg. lat. on Earth or Moon. So shadows bound to be long. That is the whole basis for ice surviving hajaar^3 saal in craters, hain? But if you are verifying surface features 1 hour ahead of landing, then that was like 8:30AM on a winter din. Very shallow. Which is why I was hoping that by now the sun would have come up as high as it will come up and maybe woken up the solar panels enough to charge the battery and power up the transmitter. :(

Although I must say that the moon looks as brightly yellow at 70 deg. as at the equator when I look up on fullmoon raat. No idea why that is so.
Ravi Karumanchiri
BRFite
Posts: 723
Joined: 19 Oct 2009 06:40
Location: www.ravikarumanchiri.com
Contact:

Re: Chandrayan-2 Mission

Post by Ravi Karumanchiri »

I realize this animation below concerns Mars, and things would have to work differently for the moon; but 'still and all' this shows the usage of those balloon/bags I mentioned above....

Vayutuvan
BRF Oldie
Posts: 12083
Joined: 20 Jun 2011 04:36

Re: Chandrayan-2 Mission

Post by Vayutuvan »

ArjunPandit wrote:nraoji, ... which an algo/model trained on insufficient data will reject as match ...
"trained" - that is the operative keyword. I don't think NNs/other machine learning mechanisms are fast enough. My educated guess is that they would not be using any heuristics as weak as NN/ML.

Your second paragraph is right on the dot. NNs/ML are very approximate methods.
Vayutuvan
BRF Oldie
Posts: 12083
Joined: 20 Jun 2011 04:36

Re: Chandrayan-2 Mission

Post by Vayutuvan »

Ravi Karumanchiri wrote:I realize this animation below concerns Mars, and things would have to work differently for the moon; but 'still and all' this shows the usage of those balloon/bags I mentioned above....

[youtube...]kSbAUtyO7xo[/youtube]
Let us wait for a mech engr./materials scientist to weigh in on this. I don't think this will work for the moon.
vnms
BRFite
Posts: 196
Joined: 15 Aug 2016 01:56

Re: Chandrayan-2 Mission

Post by vnms »

UB Saar, I think things went off script well before the lander could even start taking pics for the final landing. The lander was programmed to hover at 500 m and then start taking pics to figure out the correct landing spot.

So, the scenario you describe would not have occurred this time.
rahulm
BRFite
Posts: 1257
Joined: 19 Jun 2000 11:31

Re: Chandrayan-2 Mission

Post by rahulm »

The Soviet Luna 9, in 1966, was the first probe ever to soft-land on the moon; it used landing bags.
NRao
BRF Oldie
Posts: 19236
Joined: 01 Jan 1970 05:30
Location: Illini Nation

Re: Chandrayan-2 Mission

Post by NRao »

ArjunPandit wrote:nraoji, thanks for pointing this out...serious question: wouldn't the resolution and camera be different given the different velocities? Automated operations of this kind would require at least 100s of images for automated verification...the slightest perturbation of the camera can make the matrix look completely different and result in huge variations, which an algo/model trained on insufficient data will reject as a match and will then take alternative or corrective actions...i agree they might have other images..but there would be too many variables at play. Pardon my ignorance if i am asking very stupid questions

....................
No idea. I can at best guess.

Having said that, the CY2 set was designed some years ago. Let us say 5 years ago. That is the tech I would look at, not what is available today. In fact, I would turn the clock back even further; space guys are very reluctant to use techs they are not familiar with, or are familiar with but consider unproven (in their own minds).

My best guess is that the orbiter took some hi-res pictures (whatever hi-res means) and selected 3-4 points around the location chosen for landing. Converted those to pixels (heck, even NNs are pixel-based - vectors); all one needs (guessing here too) are those 3-4 points for guidance. And "guidance" does not have to be real-time. It can be - and I would think this is a better approach - every X cycles/seconds, figure out where the lander is and adjust accordingly. After all, the lander has very limited resources (CPU/battery/etc.)




For the latest and greatest:

Entry, Descent, and Landing Technologies

From NASA's Mars 2020 mission.

There is a nice terrain-relative navigation gif in there
How Terrain-Relative Navigation Works

* Orbiters create a map of the landing site, including known hazards.
* The rover stores this map in its computer "brain."
* Descending on its parachute, the rover takes pictures of the fast approaching surface.
* To figure out where it's headed, the rover quickly compares the landmarks it "sees" in the images to its onboard map.
* If it's heading toward dangerous ground up to about 985 feet (300 meters) in diameter (about the size of two professional baseball fields side by side), the rover can change direction and divert itself toward safer ground.

Why Terrain-Relative Navigation is Important

Terrain-Relative Navigation is critical for Mars exploration. Some of the most interesting places to explore lie in tricky terrain. These places have special rocks and soils that might preserve signs of past microbial life on Mars!

Until now, many of these potential landing sites have been off-limits. The risks of landing in challenging terrain were much too great. For past Mars missions, 99% of the potential landing area (the landing ellipse) had to be free of hazardous slopes and rocks to help ensure a safe landing. Using terrain relative navigation, the Mars 2020 rover can land in more - and more interesting! - landing sites with far less risk.

How Terrain-Relative Navigation Improves Entry, Descent, & Landing
Terrain-Relative Navigation significantly improves estimates of the rover's position relative to the ground. Improvements in accuracy have a lot to do with when the estimates are made.

In prior missions, the spacecraft carrying the rover estimated its location relative to the ground before entering the Martian atmosphere, as well as during entry, based on an initial guess from radiometric data provided through the Deep Space Network. That technique had an estimation error prior to EDL of about 0.6 - 1.2 miles (about 1 - 2 kilometers), which grows to about 1.2 - 1.9 miles (2 - 3 kilometers) during entry.

Using Terrain-Relative Navigation, the Mars 2020 rover will estimate its location while descending through the Martian atmosphere on its parachute. That allows the rover to determine its position relative to the ground with an accuracy of about 200 feet (60 meters) or less.

It takes two things to reduce the risks of entry, descent, and landing: accurately knowing where the rover is headed and an ability to divert to a safer place when headed toward tricky terrain.
I wonder if Vikram had similar or the same logic.
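The divert step in the NASA description above can be sketched in a few lines - a toy model, with the hazard map, grid, and divert range all invented (the grid cells stand in for the ~300 m divert figure): check the predicted landing cell against a stored hazard map and pick the nearest safe cell within reach.

```python
# Toy terrain-relative divert decision - not NASA's actual EDL code.

HAZARD_MAP = [            # 1 = hazardous, 0 = safe (invented 5x5 map)
    [1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 0, 0],
]
MAX_DIVERT = 2  # cells the lander can shift during descent

def choose_site(est_row, est_col):
    """Return the predicted cell if safe, else the nearest safe cell in range."""
    candidates = [
        (abs(r - est_row) + abs(c - est_col), r, c)
        for r in range(len(HAZARD_MAP))
        for c in range(len(HAZARD_MAP[0]))
        if HAZARD_MAP[r][c] == 0
        and abs(r - est_row) + abs(c - est_col) <= MAX_DIVERT
    ]
    if not candidates:
        return None  # no safe site within divert range
    return min(candidates)[1:]

print(choose_site(0, 0))  # heading for a hazard -> diverts to a safe cell
print(choose_site(4, 4))  # already over safe ground -> stays put
```

The hard part in reality is the first input: knowing `est_row`/`est_col` accurately enough, which is exactly what the landmark matching provides.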
UlanBatori
BRF Oldie
Posts: 14045
Joined: 11 Aug 2016 06:14

Re: Chandrayan-2 Mission

Post by UlanBatori »

The Ayesha-In-Cloudbank scenario (AICS) is where you THINK the situation is Naarmal and show a nice trajectory that matches the ideal, but you are actually 0.5 km closer to the ground than you realize. While still moving pretty fast in the tangential component.
This is known, IIRC as
Controlled Flight Into Terrain.
Very sudden end with no warning.
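A back-of-envelope of the AICS, with every number invented for illustration: if the altitude estimate reads high, the ground arrives while the displayed trajectory still looks fine.

```python
# What-if arithmetic only - these are not telemetry values.
DESCENT_RATE_MPS = 50.0   # assumed steady downward speed
TRUE_ALT_M = 2000.0       # actual height above ground
ALT_BIAS_M = 500.0        # estimate reads this much higher than reality

impact_time_s = TRUE_ALT_M / DESCENT_RATE_MPS                    # ground arrives
believed_time_s = (TRUE_ALT_M + ALT_BIAS_M) / DESCENT_RATE_MPS   # lander expects it
print(f"ground arrives at t = {impact_time_s:.0f} s; "
      f"lander expects it at t = {believed_time_s:.0f} s")
```

Everything on the screen stays nominal right up to the earlier of the two times.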
But of course ISRO claims to have seen a more-or-less intact Vikram on the ground. So I wonder. What was the last displayed value of tangential speed? That was probably accurate. LRO portrait may be released tomorrow?
Vayutuvan
BRF Oldie
Posts: 12083
Joined: 20 Jun 2011 04:36

Re: Chandrayan-2 Mission

Post by Vayutuvan »

rahulm wrote:Russian Luna 9 in 1966 was the first ever probe to soft land on the moon, it used landing bags
thanks for the pointer. some folks dismissed these ideas as "flights of fancy". it all depends on the initial-boundary value problem and the material properties of whatever the balloon complex is made of.
Post Reply