Sunday, 5 February 2017

ISO 26262 - Dependent Failure Analysis (DFA)

Dependent failure analysis aims at identifying failures that may hamper the required independence or freedom from interference between given elements (hardware, software, firmware), which may ultimately lead to the violation of a safety requirement or a safety goal.

Hence, Dependent Failure Analysis consists of the following two parts.
  • Validate Freedom from Interference (FFI) between elements.
  • Validate independence between elements.



Difference between Common Cause Failures and Cascading Failures:

Cascading Failures:

  • Failure of an element of an item causes another element or elements of the same item to fail.
  • Cascading failures are dependent failures that are not common cause failures.

Common Cause Failures:

  • Failure of two or more elements of an item resulting from a single specific event or root cause.
  • Common cause failures are dependent failures that are not cascading failures.


Dependent failure types:

Dependent Failures can arise from systematic failures and random hardware failures.



How to identify dependent failures?



Dependent failures can be identified from safety analyses.

Deductive analyses:


  • Examination of cut sets or repeated identical events of an FTA can indicate potential for dependent failures.

Inductive analyses:

  • Similar parts or components with similar failure modes that appear several times in an FMEA can give additional information about the potential for dependent failures.


DFA Part 1: Freedom from Interference (FFI):

Analysis of interactions between software elements:

  • The FFI should be analyzed between software elements (determined during software architecture design).
  • Since FFI is based on the analysis of cascading failures only, the data exchange between software elements should be analyzed.
  • The ASIL levels of the software elements (source and destination) between which data exchange takes place should be analyzed.
  • If a data exchange originates from a lower-ASIL element and flows to a higher-ASIL element, such interactions should be marked for analysis.
  • These data exchanges should be analyzed for freedom from interference.
  • Where an interference is plausible, the mechanisms to prevent, detect and mitigate the relevant faults are assessed during the analysis.

Analyze the interaction between software elements for the following possible failures.

Timing and Execution:

  • Blocking of execution.
  • Deadlocks – several processes blocking each other while each waits for an event that only the others can trigger.
  • Livelocks – several processes keeping each other in an infinite loop.
  • Incorrect allocation of execution time.
  • Incorrect synchronization between software elements.
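The timing failures above are typically caught at runtime by a supervision mechanism. The Python sketch below (hypothetical names, not any particular OS or AUTOSAR API) illustrates the classic alive-counter idea: a monitored task increments a counter every cycle, and the supervisor flags the task as blocked, deadlocked or starved when the counter stops advancing.

```python
class AliveMonitor:
    """Supervisor for one monitored task (illustrative sketch only)."""
    def __init__(self, timeout_cycles: int = 3):
        self.timeout_cycles = timeout_cycles
        self.last_counter = None
        self.stale_cycles = 0

    def check(self, counter: int) -> bool:
        # A healthy task increments its alive counter every cycle;
        # a frozen counter means the task is blocked, deadlocked or starved.
        if counter == self.last_counter:
            self.stale_cycles += 1
        else:
            self.stale_cycles = 0
        self.last_counter = counter
        return self.stale_cycles < self.timeout_cycles  # False => trip

monitor = AliveMonitor(timeout_cycles=3)
healthy = [monitor.check(c) for c in [1, 2, 3, 4]]   # counter advances
stuck = [monitor.check(4) for _ in range(3)]         # counter frozen
```

In a real system the trip would trigger a transition to a safe state (e.g. via a watchdog reset) rather than just a boolean.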

Memory:

  • corruption of content.
  • read or write access to memory allocated to another software element.
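A minimal sketch of the memory-partitioning idea, assuming a flat address map with one address range per software element. On a real ECU an MPU/MMU enforces this in hardware; the software check here is purely illustrative.

```python
# Hypothetical memory map: each software element owns one address range.
PARTITIONS = {
    "SWE1": range(0x1000, 0x2000),
    "SWE2": range(0x2000, 0x3000),
}

def checked_write(element: str, address: int, memory: dict, value: int):
    # Refuse writes outside the element's own partition.
    if address not in PARTITIONS[element]:
        raise PermissionError(f"{element} may not write to {hex(address)}")
    memory[address] = value

mem = {}
checked_write("SWE1", 0x1800, mem, 42)       # allowed: SWE1's own partition
try:
    checked_write("SWE1", 0x2800, mem, 99)   # denied: SWE2's partition
    blocked = False
except PermissionError:
    blocked = True
```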

Exchange of Information:


  • repetition of information.
  • loss of information.
  • delay of information.
  • insertion of information.
  • masquerade or incorrect addressing of information.
  • incorrect sequence of information.
  • corruption of information.
  • asymmetric information sent from a sender to multiple receivers.
  • information from a sender received by only a subset of the receivers.
  • blocking access to a communication channel.
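Many of these communication faults are covered by end-to-end protection: a sequence counter plus a checksum attached to every message, so the receiver can detect repetition, loss, wrong sequence and corruption. The sketch below is a simplified illustration in that spirit (it is not the real AUTOSAR E2E frame format).

```python
import zlib

def protect(seq: int, payload: bytes) -> bytes:
    # Prepend an 8-bit sequence counter, append a CRC32 over counter+payload.
    frame = bytes([seq & 0xFF]) + payload
    return frame + zlib.crc32(frame).to_bytes(4, "big")

def check(expected_seq: int, frame: bytes) -> str:
    body, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    if zlib.crc32(body) != crc:
        return "corruption"
    if body[0] != expected_seq & 0xFF:
        return "repetition/loss/wrong sequence"
    return "ok"

msg = protect(7, b"voltage=230")
status_ok = check(7, msg)                        # fresh, intact message
status_stale = check(8, msg)                     # old counter detected
status_bad = check(7, msg[:1] + b"X" + msg[2:])  # flipped payload byte
```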

Table 1: Analysis of interaction between software elements for possible failures.


Table 2: Analysis of interaction between software elements for possible failures.

DFA Part 2: Independence


Identification of couples:


Architectural units can be identified to form couples, between which independence must then be proven.
The couples can be identified based upon the following factors.
  • Similar and dissimilar redundant elements
  • Different functions implemented with identical software or hardware elements
  • Functions and their respective safety mechanisms
  • Partitions of functions or software elements
  • Physical distance between hardware elements, with or without barrier
  • Common external resources

The above criteria are described in detail below.


Similar and dissimilar redundant elements




In this example, two redundant software functionalities are implemented using different algorithms to provide an output (SO1, which is safety-related).

SWF1: Multiplication using the actual multiplication operator
SWF2: Multiplication using repeated addition

So, to ensure the integrity of this output (SO1), both SWF1 and SWF2 need to be identified as a couple. They can form a couple since they are used to provide a single output, which may fail due to the following reasons.

SWF1 block failure (e.g. SWF1 not being executed completely due to an interrupt function call, memory corruption of the code flash area where SWF1 or any of its variables is stored, etc.). The same applies to SWF2.
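The diverse-redundancy idea in this example can be sketched as follows; `safe_output_so1` is a hypothetical name for the comparison logic that releases SO1 only when both diverse channels agree.

```python
def swf1_multiply(a: int, b: int) -> int:
    # SWF1: multiplication using the multiplication operator
    return a * b

def swf2_multiply(a: int, b: int) -> int:
    # SWF2: diverse implementation via repeated addition
    result = 0
    for _ in range(abs(b)):
        result += a
    return result if b >= 0 else -result

def safe_output_so1(a: int, b: int) -> int:
    # Release SO1 only if both diverse channels agree; otherwise a fault
    # in one channel (e.g. corrupted code flash) is assumed and flagged.
    r1, r2 = swf1_multiply(a, b), swf2_multiply(a, b)
    if r1 != r2:
        raise RuntimeError("channel disagreement - enter safe state")
    return r1
```

The DFA question for this couple is whether a single cause (shared memory region, shared scheduler slot, etc.) could corrupt both channels identically, which the comparison alone cannot detect.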

Different functions implemented with identical software or hardware elements



In this example, two different software functionalities (SWF1, SWF2) use the same firmware functionality (FWF1) to get the battery voltage and the power-source voltage for ECU2.

Both software functionalities are implemented using a common/identical firmware element, i.e. the ADC firmware. Hence these two software elements can be considered a couple.


In this example, the Power IC determines the voltage and current flowing through the line and neutral. The IC then sends the values over the SPI line to the main microcontroller for processing.

In this example, two different software functionalities (SWF2, SWF3) use the SPI driver firmware functionality (FWF1) to get the raw voltage and current values.

Both software functionalities (SWF2, SWF3) then use the square functionality (SWF1) to determine the RMS voltage and current.

Since SWF2 and SWF3 are different functions implemented through the same software element (FWF1), they can be considered a couple.

Functions and their respective safety mechanisms



In this example, the software functionality SWF1 controls the Relay Control ICs (1, 2) for live and neutral. The software functionality SWF2 detects the feedback from the voltage/current monitoring IC that monitors the current flow. Hence, once SWF1 commands the relay to stop the current flow, SWF2 acts as a safety mechanism for SWF1.

So, both SWF1 and SWF2 can be identified as a couple for analysis, to check whether there is any dependency between them.

Partitions of functions or software elements



In this example, SWF1 is a non-safety-related (QM) functionality located in block 1 of flash memory, and SWF2 is a safety-related functionality located in block 2 of flash memory. Both share data through a shared memory section accessible to both blocks. It must be ensured that a modification of the QM functionality does not affect the ASIL-level implementation. Such functionalities can be treated as a couple for analysis, and factors such as any impact while reading data from the common memory should be analyzed for this couple.

Physical distance between hardware elements, with or without barrier



In this example, 2 hardware elements are under consideration (HWEL1, HWEL2). The Main microcontroller (HWEL1) controls the relay on off through the Relay Control IC (HWEL2). The Relay Control IC and Relays are so designed that in case of below scenarios the Relays are off. 

1. Power failure to the Relay Control IC or the Main Microcontroller
2. Relay Control IC Failure
3. Main Microcontroller IC failure (or Main Microcontroller is not able to communicate with Relay Control IC)

So, in a hypothetical scenario where the main microcontroller gets fried by very high EMI/EMC radiation and the relays are no longer under the control of the main microcontroller, the relays should be in the off state so that the switches are open. Hence both HWEL1 and HWEL2 should be considered as a couple for analysis.

Common external resources



In this example, signals like ignition, VSS (Vehicle Speed Sensor) and RPM (revolutions per minute) are received from external sources by the EPS module. The software functionalities within the EPS module can use data from these external sources for computation; hence these can form couples.


Table 3: Couple identifications


Dependent failures should be identified for the identified couples.
These potential failures should then be assessed for the plausibility of their violating independence.


Table 4: Failure type identification


Table 5: Detailed Analysis

Thursday, 26 January 2017


LIDAR: A Primer for Primates



 Why am I wasting your time?


Why has LIDAR (Light Detection and Ranging) become so important lately? Why should LIDAR bother you? And why so serious?

“All the answers lie out there in the sunshine.
You just need a magnifying glass to find out what’s a LIDAR.”

In this age of driverless cars, LIDAR is the angel that guides them to their destination and saves them from falling off the edge of the horizon. LIDAR has become the eyes, while RADAR (Radio Detection and Ranging) has become the ears, of today's driverless cars.

So, let’s have a few bytes on LIDAR.


 A very brief history of time



Various echolocation methods have been used since the evolution of life on Earth. Have you ever wondered how a bat flies in the dark of night (without headlights, of course) without crashing into another? This is what is called echolocation. And since bats are serial generational illiterates, humans just plagiarized their technology and created patents. Now the hard part is that bats have to pay royalties to humans every time they take a stroll at night.




Echolocation is based upon the simple technique of bouncing sound or electromagnetic waves (like radio waves or light waves) off an object.

RADAR, SONAR and LIDAR are all based upon calculating distance by measuring the time between transmitting and receiving the wave.




 Disadvantages of RADAR:

Master Yoda says “RADAR disadvantages following are”.
  • RADARs are expensive to mobilize, while LIDAR units are fairly small. RADAR is not suitable for small-area surveys, as airborne RADAR is expensive to operate.
  • RADAR cannot measure the ground elevation below canopy cover, whereas LIDAR is capable of "seeing" between trees in forested areas.
  • RADAR has lower accuracy than LIDAR.
  • RADAR emits broad-range radiation, while LIDAR emits focused laser pulses.
  • Differential absorption LIDAR can also determine some of the chemical properties of the target, which is not possible with RADAR.

 LIDAR Species:


 LIDAR Family Tree



*IMU – Inertial Measurement Unit
*GPS – Global Positioning System


 What is inside LIDAR?


LIDAR, which stands for Light Detection and Ranging, is a remote sensing method that uses light in the form of a pulsed laser to measure ranges (variable distances) to the Earth.


The fundamental steps of LIDAR functionality are explained below.



Laser
  • Frequency:
50,000 ~ 200,000 pulses per second (Hz)
  • Wavelength:
Near Infrared (1040 ~ 1060 nm) – Terrestrial mapping
Blue Green (500 ~ 600 nm) – Bathymetry
Ultraviolet (250 nm) – Meteorology
Infrared (1500 ~ 2000 nm) – Meteorology
  • Wattage:
Low wattage (<1 W)

Scanner

  • A mirror spins or scans to project laser pulses onto the surface.
  • Scanning angles of up to 75°. The scanner measures the angle at which each pulse was fired.
  • Receives the reflected pulse.
High Precision Clock
  • Records the time the laser pulse leaves and returns to the scanner.
GPS (Global Positioning Systems)
  • Records the x, y, z location of the scanner.
  • Survey ground base station in flight area.
IMU (Inertial Measurement Unit):
  • Measures angular orientation of the scanner relative to the ground (Pitch, Roll, Yaw).





Principles of Distance Calculation


Pulse Range Distance:

  • The distance is calculated from the time between the pulse being sent and the pulse being received.
  • Pulse Range Distance (R): R = (C × t) / 2

Where
C: Speed of light (299,792,458 m/s)
t: Time interval between sending and receiving the pulse (ns)
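The pulse-range computation, R = (C × t) / 2, can be checked numerically; the division by two accounts for the pulse travelling out and back.

```python
C = 299_792_458  # speed of light in m/s

def pulse_range(t_ns: float) -> float:
    # R = (C * t) / 2; halved because t covers the round trip.
    return C * (t_ns * 1e-9) / 2

# A return arriving ~6671.28 ns after emission is roughly 1 km away.
r = pulse_range(6671.28)
```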



Point Cloud:
  • The x, y, z coordinates of each return are calculated using the location and orientation of the scanner (from the GPS, IMU), the angle of the scan mirror and the range distance of the object/surface.
  • The collection of returns is called a “Point Cloud”.
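As a toy illustration of how a single return is georeferenced, the sketch below assumes perfectly level flight (zero pitch/roll, fixed heading) so that only the scan angle and range matter; real systems apply the full GPS/IMU rotation chain per pulse. All names here are hypothetical.

```python
import math

def return_xyz(scanner_xyz, range_m, scan_angle_deg, heading_deg=90.0):
    # Simplified georeferencing: pulse fired scan_angle_deg off nadir,
    # displaced perpendicular to the flight heading.
    x0, y0, z0 = scanner_xyz
    a = math.radians(scan_angle_deg)
    h = math.radians(heading_deg)
    offset = range_m * math.sin(a)          # horizontal displacement
    x = x0 + offset * math.cos(h)
    y = y0 + offset * math.sin(h)
    z = z0 - range_m * math.cos(a)          # pulse travels downward
    return x, y, z

# A nadir shot (0 deg) from 1000 m altitude hits the ground directly below.
x, y, z = return_xyz((0.0, 0.0, 1000.0), 1000.0, 0.0)
```

Repeating this for millions of pulses yields the point cloud.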




Resolution:
  • The number of pulses per unit area.
  • Current systems are capable of up to 20 pulses per square meter.
  • Resolution is determined by aircraft speed, flying altitude, field of view (FOV) and rate of pulse emission.
  • Higher resolution and a narrow FOV are needed to penetrate dense vegetation.
  • Higher resolutions allow the surface and features on the surface to be better resolved, but at the cost of larger data sets and slower processing times.
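A back-of-the-envelope estimate ties these factors together. The sketch below is a deliberate simplification (uniform pulse distribution over a rectangular swath): swath width follows from altitude and full scan FOV, and density is pulses emitted per second divided by ground area swept per second.

```python
import math

def point_density(pulse_rate_hz, speed_m_s, altitude_m, fov_deg):
    # Ground swath width from altitude and the full scan FOV.
    swath_m = 2 * altitude_m * math.tan(math.radians(fov_deg / 2))
    # Density = pulses per second / ground area swept per second.
    return pulse_rate_hz / (speed_m_s * swath_m)   # pulses per square meter

d = point_density(pulse_rate_hz=100_000, speed_m_s=60.0,
                  altitude_m=1000.0, fov_deg=30.0)
```

With these example numbers the density comes out at roughly 3 pulses per square meter, which shows why flying lower and slower with a narrow FOV raises resolution.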

Accuracy:
  • Vertical accuracy typically 15 to 20 cm.
  • Horizontal accuracy 30 to 100 cm.
  • Accuracy improved by flying low and slow, with a narrow FOV.

Spot size:
  • Once the pulse leaves the laser source, the width of the beam increases with the distance between the source and the target.
  • At the target, multiple small objects may be engulfed within the same pulse. This gives rise to multiple returns.


Intensity:
  • Strength of returns varies with the composition of the surface object reflecting the return.
  • Reflective percentages are referred to as LIDAR intensity.
  • Intensity can be used to identify land cover types.
  • Intensity values need to be normalized among flights.

Reflectivity:
  • A white surface will have better reflectivity than a black surface. 
  • A smooth surface type will have better reflectivity than a rough surface.

Data Processing:
  • Data collection depends upon the system vendor.
  • Data has to be post-processed to calibrate multiple flight lines and to filter erroneous values and noise.
  • Returns are classified and separated by category: first returns, last (or bare-earth) returns, etc.
  • LIDAR data formats can be ASCII point files or the LAS format, which is maintained by the ASPRS (American Society for Photogrammetry and Remote Sensing).
  • The contents of the LAS format are explained below (comparing versions 1.0 and 1.1).


Wavelength:

  • 600–1000 nm lasers:
        Used for non-scientific purposes, but as they can be focused and are easily absorbed by the eye, the maximum power has to be limited to make them 'eye-safe'.
  • 1550 nm lasers:
        A common alternative, as they are not focused by the eye and are 'eye-safe' at much higher power levels. They can be used for longer-range, lower-accuracy purposes.
        This wavelength does not show up under night-vision goggles and is therefore well suited to military applications.
  • 905 nm lasers:
        Class 1 laser products that are 'eye-safe' at up to 1.3 W.

Interference between sensors:
  • The beams from multiple LIDARS in the same orientation would cause interference. 
  • This can be avoided by placing sensors in different orientations.
  • Multiple sensors can have synchronization between them before sending out the pulses to avoid clashing.
Returns per pulse:
  • Laser pulses emitted by a LIDAR are reflected back by objects like vegetation, buildings, roads, bridges, etc.
  • Each pulse may return to the LIDAR sensor as one or many returns.
  • The first return is the most significant; it may come from the tallest feature in the landscape, such as a tree canopy or a building top.
  • The last return is generally from the bare earth.

Advantages of LIDAR:


  • Day or night operation
  • Efficient acquisition of millions of elevation points per hour
  • Faster coordinate acquisition than traditional methods
  • All digital: no intermediate steps to generate digital XYZ
  • Rapid turnaround: Capable of “overnight” processing. Ability to cover large areas quickly.
  • Captures multiple returns per pulse with intensity information
  • Dense data
  • Accurate: Elevation +/- 10 cm (or better)
  • Airborne: Easy to mobilize and demobilize
  • Non-Intrusive method of survey (airborne) capable of accessing remote areas. Can collect data in steep terrain and shadows.
  • Quicker turnaround, less labour intensive, and lower costs than photogrammetric methods
  • Can produce DEM (Digital Elevation Models) and DSM (Digital Surface Models).



Disadvantages of LIDAR:

  • LIDARs cannot penetrate dense canopy.
  • LIDARs produce very large datasets, which are costly to store, process and interpret.
  • LIDAR cannot penetrate cloud cover.
  • Performance degrades in bad weather conditions (like rain, fog, snow, etc.).

Levels of Autonomy - The 5 Levels and the Role of LIDARs



Sensors in Automotive Vehicles:











Applications of LIDAR in Automotive:

Adaptive Cruise Control (ACC):


  • The LIDARs are used in Adaptive Cruise Control (ACC) for automobiles.
  • The LIDARs are mounted on cars to monitor the distance between the vehicle and any vehicle or obstacle ahead.
  • In the event the vehicle in front slows down or is too close, the ACC applies the brakes to slow the vehicle. When the road ahead is clear, the ACC allows the vehicle to accelerate to a speed pre-set by the driver.





Collision Avoidance System:


  • A collision avoidance system is an automobile safety system designed to reduce the severity of a collision.
  • It uses radar (all-weather) and sometimes laser (LIDAR) and camera (employing image recognition) to detect an imminent crash.
  • Once the detection is done, these systems either provide a warning to the driver when there is an imminent collision or take action autonomously without any driver input (by braking or steering or both).
  • Collision avoidance by braking is appropriate at low vehicle speeds (e.g. below 50 km/h), while collision avoidance by steering is appropriate at higher vehicle speeds.
  • Volvo introduced the first cyclist detection system. All Volvo automobiles now come standard with a LIDAR laser sensor that monitors the front of the roadway, and if a potential collision is detected, the safety belts will retract to reduce excess slack. 





Pedestrian Recognition:


  • The LIDAR data can be used to determine whether the pedestrians are on road or out of road.
  • The pedestrian recognition and tracking system is integrated with the autonomous vehicle platform which provides timely prediction of pedestrian motions.
  • The data from a camera sensor can be fused with LIDAR data, to enhance detection performance and avoid false alarms.





Automotive Speed Detection and Law Enforcement:


  • LIDAR has been used instead of RADAR for speed-limit enforcement since 2000.
  • Current devices are designed to automate the entire process of speed detection, vehicle identification, driver identification and evidentiary documentation.
  • LIDAR has a narrow beam and easily targets an individual vehicle, eliminating the need for visual estimation, and records an image of the license plate at the same instant as the speed violation.




LIDAR Manufacturing: Automotive TIER1 and Sensor Manufacturers





And the Award goes to!!!!!

  1. http://www.ti.com/general/docs/lit/getliterature.tsp?baseLiteratureNumber=snaa123
  2. http://velodynelidar.com/docs/datasheet/63-9194%20Rev-E_HDL-64E_S3_Spec%20Sheet_Web.pdf
  3. https://www.extremetech.com/extreme/176715-new-optical-chip-will-sharpen-aerial-mapping-and-autonomous-car-vision
  4. http://www.princetonlightwave.com/products/geigercruizer/
  5. http://spectrum.ieee.org/tech-talk/semiconductors/optoelectronics/mit-lidar-on-a-chip
  6. http://www.laserfocusworld.com/articles/2016/09/ford-and-baidu-lead-150m-investment-in-velodyne-lidar-for-self-driving-cars.html
  7. http://www.laserfocusworld.com/articles/print/volume-51/issue-05/features/photonics-applied-transportation-lidar-advances-drive-success-of-autonomous-vehicles.html
  8. http://denso-europe.com/denso-invests-in-semiconductor-laser-startup-trilumina/
  9. http://www.10tv.com/interactive-radar-columbus-ohio
  10. http://desktop.arcgis.com/en/arcmap/10.3/manage-data/las-dataset/types-of-lidar.htm
  11. https://www.e-education.psu.edu/geog481/l1_p4.html
  12. http://www.robotics.org/content-detail.cfm/Industrial-Robotics-Industry-Insights/Intelligent-Robots-A-Feast-for-the-Senses/content_id/5530
  13. http://www.liblas.org/development/format_elements.html
  14. https://www.nae.edu/Publications/Bridge/133842/134343.aspx
  15. http://www.valeo.us/valeo-in-north-america/company-regional-profile/business-groups.html
  16. http://docs.leddartech.com/doc/leddartech-publications/the-automotive-lidar-magazine/2016082601/#2
  17. https://projectmaxcpm.wordpress.com/page/2/
  18. http://www.yole.fr/AutonomousVehicles_Functions.aspx#.WIOlBFN97IU




Ok Bye!!!