Search results for: single source of truth
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8942

8882 The Way of Ultimate Realization Through the Buddha’s Realization

Authors: Sujan Barua

Abstract:

Buddhism is grounded in natural events that arise from the four elements of nature, and in the chain of action and reaction that runs through our cycles of life, producing ever more suffering for all beings. Birth, aging, sickness, lamentation, and death are conditions that depend upon the mind. Greed, hatred, and delusion are the fuels that cause beings to fall into the worldly realm again and again, through the six sense faculties and the five aggregates. These are all products of a deluded mind and of the ignorance that comes from not understanding the emancipating teaching. Buddhism rests upon the threefold training in morality, which is the basis for giving up birth, aging, sickness, lamentation, and death. Morality is the primordial path to the ultimate happiness called "Nirvana"; what obstructs it is one's own ignorance. The Buddha emphasizes that to understand samsara, one must profoundly understand one's own actions as they appear through the threefold ways, and must authentically examine one's own karma by reflecting on one's own mind. Worldly concerns are the cause of the suffering that makes beings fall into samsara; by avoiding them entirely, one can attain ultimate happiness. Attaining Nirvana is not like achieving the worldly happiness and comforts of our daily lives. In Nirvana there are no virtuous or non-virtuous deeds leading to rebirth, no grief, no distress, no greed, no hatred, no aging, no sickness, and no death; it is completely uprooted from the 31 states of worldly existence.
Nirvana is the stability of ultimate realization, whereas the worldly states are levels of grasping at impurities over a life span that make us fall into one realm or another according to our actions. Through profound observation, the Buddha found that the source of rebirth is ignorance. Ignorance drives physical, verbal, and mental action, which causes rebirth into the 31 types of realms in cyclic existence. It is believed that no one but the Enlightened One fully knows how many beings there are in this world; the Enlightened One knows such things when it is causally necessary for teaching someone or verifying a truth. For all beings this cycle is chronic because it is covered by ignorance. It is tremendously toxic, and the person who truly understands this turning from here to there is one who eagerly seeks the truth and the way to leave these massive toxicities and to discover the state of non-existence. Non-existence is known as "Śūnyatā", or emptiness. One can find the ultimate truth through the effort of realizing the truth of leaving suffering behind and escaping the cycle.

Keywords: ultimate realization, nirvana, giving up worldly concerns, profound understanding of the 31 types of cosmology, four noble truths

Procedia PDF Downloads 50
8881 Adapted Intersection over Union: A Generalized Metric for Evaluating Unsupervised Classification Models

Authors: Prajwal Prakash Vasisht, Sharath Rajamurthy, Nishanth Dara

Abstract:

In a supervised machine learning approach, metrics such as precision, accuracy, and coverage can be calculated using ground truth labels to help with model tuning, evaluation, and selection. In an unsupervised setting, however, where the data has no ground truth, there are few interpretable metrics that can guide us in the same way. Our approach creates a framework that adapts the Intersection over Union metric, usually used to evaluate supervised learning models, to the unsupervised domain: the resulting Adapted IoU solves the problem by factoring in subject matter expertise and intuition about the ideal output from the model. This metric essentially provides a scale that allows us to compare performance across numerous unsupervised models, or to tune hyper-parameters and compare different versions of the same model.
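As a rough illustration of the idea (the abstract does not give the exact formulation, so the matching rule and the set representation below are assumptions), one way to adapt IoU to the unsupervised setting is to score each model cluster against expert-defined "ideal" reference sets:

```python
def adapted_iou(model_clusters, reference_sets):
    """Mean IoU between model clusters and expert-defined reference sets.

    A hypothetical sketch: each model cluster (a set of item ids) is
    matched to the reference set with which its IoU is highest, and the
    per-cluster scores are averaged into a single comparable number.
    """
    scores = []
    for cluster in model_clusters:
        best = 0.0
        for ref in reference_sets:
            inter = len(cluster & ref)
            union = len(cluster | ref)
            if union:
                best = max(best, inter / union)
        scores.append(best)
    return sum(scores) / len(scores) if scores else 0.0
```

A score of 1.0 would mean every cluster exactly matches some expert-defined ideal set, giving a scale on which different models or hyper-parameter settings can be compared.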

Keywords: general metric, unsupervised learning, classification, intersection over union

Procedia PDF Downloads 25
8880 Development of an Automatic Computational Machine Learning Pipeline to Process Confocal Fluorescence Images for Virtual Cell Generation

Authors: Miguel Contreras, David Long, Will Bachman

Abstract:

Background: Microscopy plays a central role in cell and developmental biology. In particular, fluorescence microscopy can be used to visualize specific cellular components and subsequently quantify their morphology through the development of virtual-cell models for studying the effects of mechanical forces on cells. However, these imaging experiments face challenges that can make it difficult to quantify cell morphology: inconsistent results, time-consuming and potentially costly protocols, and a limit on the number of labels due to spectral overlap. To address these challenges, the objective of this project is to develop an automatic computational machine learning pipeline that predicts the morphology of cellular components for virtual-cell generation based on fluorescence cell membrane confocal z-stacks. Methods: Registered confocal z-stacks of the nuclei and cell membranes of endothelial cells, consisting of 20 images each, were obtained from fluorescence confocal microscopy and normalized through a software pipeline so that each image has a mean pixel intensity value of 0.5. An open-source machine learning algorithm, originally developed to predict fluorescence labels on unlabeled transmitted light microscopy cell images, was trained using this set of normalized z-stacks on a single-CPU machine. Through transfer learning, the algorithm used knowledge acquired from its previous training sessions to learn the new task. Once trained, the algorithm was used to predict the morphology of nuclei using normalized cell membrane fluorescence images as input. Predictions were compared to the ground truth fluorescence nuclei images. Results: After one week of training, using one cell membrane z-stack (20 images) and the corresponding nuclei label, results showed qualitatively good predictions on the training set. The algorithm was able to accurately predict nuclei locations as well as shapes when fed only fluorescence membrane images.
Similar training sessions with improved membrane image quality, including clear lining and shape of the membrane that clearly showed the boundaries of each cell, proportionally improved nuclei predictions, reducing errors relative to ground truth. Discussion: These results show the potential of pre-trained machine learning algorithms to predict cell morphology using relatively small amounts of data and training time, eliminating the need to use multiple labels in immunofluorescence experiments. With further training, the algorithm is expected to predict different labels (e.g., focal-adhesion sites, cytoskeleton), which can be added to the automatic machine learning pipeline for direct input into Principal Component Analysis (PCA) for the generation of virtual-cell mechanical models.
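The normalization step described in the methods, scaling each image in a z-stack to a mean pixel intensity of 0.5, can be sketched as follows (the function name and the per-image linear scaling are assumptions; the abstract does not specify the actual pipeline implementation):

```python
import numpy as np

def normalize_stack(zstack, target_mean=0.5):
    """Scale each image in a confocal z-stack so its mean pixel
    intensity equals target_mean (0.5, as in the abstract).

    zstack: non-negative array of shape (n_images, height, width).
    A minimal sketch under the assumption of simple linear scaling.
    """
    out = np.empty_like(zstack, dtype=float)
    for i, img in enumerate(zstack):
        m = img.mean()
        # Scale so the mean becomes target_mean; a blank image is set flat
        out[i] = img * (target_mean / m) if m > 0 else target_mean
    return out
```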

Keywords: cell morphology prediction, computational machine learning, fluorescence microscopy, virtual-cell models

Procedia PDF Downloads 181
8879 Medical Image Augmentation Using Spatial Transformations for Convolutional Neural Network

Authors: Trupti Chavan, Ramachandra Guda, Kameshwar Rao

Abstract:

The lack of data is a persistent problem in medical image analysis using convolutional neural networks (CNNs). This work uses various spatial transformation techniques to address the medical image augmentation issue for knee detection and localization using an enhanced single shot detector (SSD) network. Spatial transforms such as the negative transform, histogram equalization, power-law correction, sharpening, averaging, and Gaussian blurring help to generate more samples, serve as pre-processing methods, and highlight the features of interest. The experimentation is done on the OpenKnee dataset, a collection of knee images from openly available online sources. The CNN, called enhanced SSD, is utilized for the detection and localization of the knee joint in a given X-ray image. It is an enhanced version of the well-known SSD network, modified so as to reduce the number of prediction boxes on the output side. It consists of a classification network (VGGNet) and an auxiliary detection network. The performance is measured in mean average precision (mAP), and 99.96% mAP is achieved using the proposed enhanced SSD with spatial transformations. It is also seen that with spatial augmentation the localization boundary is comparatively more refined and closer to the ground truth, giving better detection and localization of knee joints.
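A few of the spatial transforms listed above can be sketched for an 8-bit grayscale image (a minimal illustration using standard textbook definitions; the paper's exact parameters are not given):

```python
import numpy as np

def negative(img):
    # Negative transform for an 8-bit image: s = 255 - r
    return 255 - img

def power_law(img, gamma):
    # Power-law (gamma) transform: s = 255 * (r / 255) ** gamma
    return np.rint(255 * (img / 255.0) ** gamma).astype(np.uint8)

def hist_equalize(img):
    # Histogram equalization via the cumulative distribution function (CDF)
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    span = cdf.max() - cdf.min()
    if span == 0:  # constant image: nothing to equalize
        return img.copy()
    cdf = (cdf - cdf.min()) / span
    return np.rint(255 * cdf[img]).astype(np.uint8)
```

Applying several such transforms to each X-ray multiplies the number of training samples while highlighting different aspects of the knee joint.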

Keywords: data augmentation, enhanced SSD, knee detection and localization, medical image analysis, OpenKnee, spatial transformations

Procedia PDF Downloads 134
8878 Calculation of Detection Efficiency of Horizontal Large Volume Source Using Exvol Code

Authors: M. Y. Kang, Euntaek Yoon, H. D. Choi

Abstract:

To calculate the full energy (FE) absorption peak efficiency for an arbitrary volume sample, we developed and verified the EXVol (Efficiency calculator for EXtended Voluminous source) code, which is based on the effective solid angle method. EXVol can describe the source as a non-uniform three-dimensional (x, y, z) source and decompose it into several sets of volume units. Users can divide the (x, y, z) coordinate system equally to calculate the detection efficiency at a specific position of a cylindrical volume source. By determining the detection efficiency for differential volume units, the total absolute radioactivity distribution and the correction factor of the detection efficiency can be obtained from a nondestructive measurement of the source. In order to check the performance of the EXVol code, a Si ingot 20 cm in diameter and 50 cm in height was used as a source. The detector was moved in the collimation geometry to calculate the detection efficiency at a specific position, and the results were compared with the experimental values. In this study, the performance of the EXVol code was extended to obtain the detection efficiency distribution at a specific position in a large volume source.
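The decomposition idea, dividing the coordinate system into equal volume units and averaging a point-source efficiency over them, can be sketched as follows (the grid layout and the user-supplied `point_eff` model are illustrative assumptions, not EXVol's actual effective solid angle algorithm):

```python
import numpy as np

def volume_efficiency(radius, height, point_eff, n=(20, 20, 40)):
    """Average full-energy-peak efficiency of a cylindrical volume
    source, estimated by decomposing it into small volume units.

    point_eff(x, y, z) is an efficiency model for a point source at
    (x, y, z); n gives the number of grid divisions along each axis.
    """
    nx, ny, nz = n
    xs = np.linspace(-radius, radius, nx)
    ys = np.linspace(-radius, radius, ny)
    zs = np.linspace(0, height, nz)
    total, count = 0.0, 0
    for x in xs:
        for y in ys:
            if x * x + y * y > radius * radius:
                continue  # skip grid points outside the cylinder cross-section
            for z in zs:
                total += point_eff(x, y, z)
                count += 1
    return total / count
```

Refining the grid trades computation time for accuracy; a nonuniform source would weight each unit by its local activity.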

Keywords: attenuation, EXVol, detection efficiency, volume source

Procedia PDF Downloads 166
8877 Improvement of Ground Truth Data for Eye Location on Infrared Driver Recordings

Authors: Sorin Valcan, Mihail Gaianu

Abstract:

Labeling is a very costly and time-consuming process which aims to generate datasets for training neural networks for several functionalities and projects. For driver monitoring system projects, the need for labeled images has a significant impact on the budget and the distribution of effort. This paper presents the modifications made to an algorithm used to generate ground truth data for 2D eye location on infrared images of drivers, in order to improve the quality of the data and the performance of the trained neural networks. The algorithm's restrictions are made stricter, which makes it more accurate but also less consistent. The resulting dataset becomes smaller and shall not be altered by any kind of manual label adjustment before being used in the neural network training process. These changes resulted in a much better performance of the trained neural networks.

Keywords: labeling automation, infrared camera, driver monitoring, eye detection, convolutional neural networks

Procedia PDF Downloads 90
8876 Ytterbium Advantages for Brachytherapy

Authors: S. V. Akulinichev, S. A. Chaushansky, V. I. Derzhiev

Abstract:

High dose rate (HDR) brachytherapy is a method of contact radiotherapy in which a single sealed source with an activity of about 10 Ci is temporarily inserted in the tumor area. The isotopes Ir-192 and (much less often) Co-60 are used as the active material for such sources. The other type of brachytherapy, low dose rate (LDR) brachytherapy, implies the insertion of many permanent sources (up to 200) of lower activity. Pulse dose rate (PDR) brachytherapy can be considered a modification of HDR brachytherapy in which the single source is repeatedly introduced into the tumor region in a pulsed regime over several hours. The PDR source activity is of the order of one Ci, and the isotope Ir-192 is currently used for these sources. PDR brachytherapy is well recommended for the treatment of several tumors since, according to oncologists, it combines the medical benefits of both the HDR and LDR types of brachytherapy. One of the main problems for the progress of PDR brachytherapy is the shielding of the treatment area, since a longer stay in a shielded canyon is not comfortable for patients. The use of Yb-169 as the active source material is a way to resolve the shielding problem for PDR, as well as for HDR, brachytherapy. The isotope Yb-169 has an average photon emission energy of 93 keV and a half-life of 32 days. Compared to iridium and cobalt, this isotope has a significantly lower emission energy and therefore requires much lighter shielding. Moreover, the absorption cross section of different materials has a strong Z-dependence in that photon energy range. For example, the dose distributions of iridium and ytterbium behave quite similarly in water or in the body, but a heavier material such as lead absorbs ytterbium radiation much more strongly than iridium or cobalt radiation. For example, a 2 mm lead layer is enough to reduce ytterbium radiation by a couple of orders of magnitude, but is not enough to protect from iridium radiation.
We have created an original facility to produce the starting stable isotope Yb-168 using the laser technology AVLIS. This facility makes it possible to raise the Yb-168 concentration up to 50% and consumes much less electrical power than the alternative electromagnetic enrichment facilities. We also developed, in cooperation with the Institute for High Pressure Physics of RAS, a new technology for manufacturing high-density ceramic cores of ytterbium oxide. The ceramic density reaches the limit of the theoretical values: 9.1 g/cm3 for the cubic phase of ytterbium oxide and 10 g/cm3 for the monoclinic phase. Source cores made from this ceramic have high mechanical strength and a glassy surface. The use of ceramics makes it possible to increase the source activity for fixed external source dimensions.
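The shielding comparison follows from narrow-beam exponential attenuation, I/I0 = exp(-μx). A minimal sketch, where the linear attenuation coefficients for lead are rough order-of-magnitude assumptions (several tens per cm near 93 keV versus a few per cm near Ir-192's mean emission energy), not values from the abstract:

```python
import math

def transmission(mu_per_cm, thickness_cm):
    # Narrow-beam attenuation: fraction of photons passing through a shield
    return math.exp(-mu_per_cm * thickness_cm)

# Illustrative comparison for a 2 mm (0.2 cm) lead layer; both
# coefficients below are assumed, order-of-magnitude figures.
yb_frac = transmission(60.0, 0.2)  # ~93 keV photons (Yb-169): strongly absorbed
ir_frac = transmission(3.0, 0.2)   # ~380 keV photons (Ir-192): mostly transmitted
```

With such figures, a 2 mm lead layer suppresses the ytterbium flux by several orders of magnitude while passing most of the iridium flux, consistent with the abstract's shielding argument.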

Keywords: brachytherapy, high and pulse dose rates, radionuclides for therapy, ytterbium sources

Procedia PDF Downloads 470
8875 Smartphones as a Tool of Mobile Journalism in Saudi Arabia

Authors: Ahmed Deen

Abstract:

The introduction of mobile devices equipped with internet access, a camera, and messaging services has become a major inspiration for the growing use of mobile devices in news reporting. Mobile journalism (MOJO) is a creation of modern technology, especially the use of mobile technology for video journalism. MOJO, thus, is the process by which information is collected and disseminated to society through the use of mobile technology, including tablets. This paper seeks to better understand the ethics of Saudi mobile journalists in news coverage. This study also aims to explore the relationship between minimizing harm and truth-seeking efforts among Saudi mobile journalists. Three main ethical principles were targeted in this study: seek truth and report it, minimize harm, and be accountable. Diffusion of innovations theory was applied to reach this study's goals. A non-probability sampling approach, snowball sampling, was used to reach 124 survey participants through an online survey via SurveyMonkey, distributed through social media platforms as a web link. The code of ethics of the Society of Professional Journalists was applied as a scale in this study. This study found that the relationship between minimizing harm and truth-seeking efforts is moderately significant among Saudi mobile journalists. It also found that the level of journalistic experience and the use of smartphones to cover news are weakly and negatively related to perceptions of mobile journalism among Saudi journalists, while Saudi journalists who had used their smartphones to cover the news for 1-3 years made up the majority of participants (55 participants, 51.4%).

Keywords: mobile journalism, Saudi journalism, smartphone, Saudi Arabia

Procedia PDF Downloads 148
8874 Analysis of Joint Source Channel LDPC Coding for Correlated Sources Transmission over Noisy Channels

Authors: Marwa Ben Abdessalem, Amin Zribi, Ammar Bouallègue

Abstract:

In this paper, a joint source-channel (JSC) coding scheme based on LDPC codes is investigated. We consider two concatenated LDPC codes: one compresses a correlated source, and the second protects it against channel degradations. The original information can be reconstructed at the receiver by a joint decoder, in which the source decoder and the channel decoder run in parallel, transferring extrinsic information to each other. We investigate the performance of the JSC LDPC code in terms of Bit-Error Rate (BER) in the case of transmission over an Additive White Gaussian Noise (AWGN) channel, for different source and channel rate parameters. We emphasize how JSC LDPC coding presents a performance tradeoff depending on the channel state and the source correlation. We show that JSC LDPC coding is an efficient solution for a relatively low Signal-to-Noise Ratio (SNR) channel, especially with highly correlated sources. Finally, a source-channel rate optimization has to be applied to guarantee the best JSC LDPC system performance for a given channel.

Keywords: AWGN channel, belief propagation, joint source channel coding, LDPC codes

Procedia PDF Downloads 339
8873 Topological Sensitivity Analysis for Reconstruction of the Inverse Source Problem from Boundary Measurement

Authors: Maatoug Hassine, Mourad Hrizi

Abstract:

In this paper, we consider a geometric inverse source problem for the heat equation with Dirichlet and Neumann boundary data. We will reconstruct the exact form of the unknown source term from additional boundary conditions. Our motivation is to detect the location, the size and the shape of source support. We present a one-shot algorithm based on the Kohn-Vogelius formulation and the topological gradient method. The geometric inverse source problem is formulated as a topology optimization one. A topological sensitivity analysis is derived from a source function. Then, we present a non-iterative numerical method for the geometric reconstruction of the source term with unknown support using a level curve of the topological gradient. Finally, we give several examples to show the viability of our presented method.

Keywords: geometric inverse source problem, heat equation, topological optimization, topological sensitivity, Kohn-Vogelius formulation

Procedia PDF Downloads 281
8872 Modelling and Simulation of Cascaded H-Bridge Multilevel Single Source Inverter Using PSIM

Authors: Gaddafi Sani Shehu, Tankut Yalcınoz, Abdullahi Bala Kunya

Abstract:

Multilevel inverters such as flying capacitor, diode-clamped, and cascaded H-bridge inverters are very popular, particularly in medium- and high-power applications. This paper focuses on a cascaded H-bridge module using a single direct current (DC) source in order to generate an 11-level output voltage. The novel approach reduces the number of switches and gate drivers in comparison with a conventional method. The anticipated topology produces more accurate results with an isolation transformer at high switching frequency. Different modulation techniques can be used for the multilevel inverter, but this work features the modulation technique known as selective harmonic elimination (SHE). This modulation approach reduces the number of carriers, reduces switching losses and Total Harmonic Distortion (THD), and thereby increases Power Quality (PQ). Based on the simulation results obtained, SHE is able to eliminate selected harmonics while controlling the fundamental output component. The performance evaluation of the proposed cascaded multilevel inverter is performed using the PSIM simulation package, and a THD of 0.94% is obtained.
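The harmonic content that SHE manipulates can be computed directly from the Fourier series of the staircase output voltage. A hedged sketch (the quarter-wave-symmetric staircase model and the example switching angles below are textbook assumptions, not the paper's PSIM setup or its optimized angles):

```python
import math

def harmonic(n, angles):
    # n-th odd harmonic amplitude of a quarter-wave-symmetric staircase
    # waveform with one switching angle per H-bridge level:
    # h_n = (4 / (n * pi)) * sum(cos(n * theta_i))
    return (4 / (n * math.pi)) * sum(math.cos(n * t) for t in angles)

def thd(angles, max_harmonic=49):
    # THD of the staircase output voltage up to a chosen harmonic order
    h1 = harmonic(1, angles)
    hsum = sum(harmonic(n, angles) ** 2 for n in range(3, max_harmonic + 1, 2))
    return math.sqrt(hsum) / abs(h1)

# A single angle at 0 gives a square wave; five staggered angles give an
# 11-level staircase with far lower harmonic distortion.
square_thd = thd([0.0])
stair_thd = thd([math.asin((2 * i - 1) / 10) for i in range(1, 6)])
```

SHE proper would instead solve for the five angles that zero out chosen low-order harmonics (e.g. the 5th, 7th, 11th, 13th) while fixing the fundamental.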

Keywords: cascaded H-bridge multilevel inverter, power quality, selective harmonic elimination

Procedia PDF Downloads 397
8871 Exploration of a Blockchain Assisted Framework for Through Baggage Interlining: Blocklining

Authors: Mary Rose Everan, Michael McCann, Gary Cullen

Abstract:

International travel journeys, by their nature, incorporate elements provided by multiple service providers such as airlines, rail carriers, airports, and ground handlers. Data needs to be stored by and exchanged between these parties in the process of managing the journey. The fragmented nature of this shared management of mutual clients is a limiting factor in the development of a seamless, hassle-free, end-to-end travel experience. Traditional interlining agreements attempt to facilitate many separate aspects of co-operation between service providers, typically between airlines and, to some extent, intermodal travel operators, including schedules, fares, ticketing, through check-in, and baggage handling. These arrangements rely on pre-agreement. The development of Virtual Interlining - that is, interlining facilitated by a third party (often but not always an airport) without formal pre-agreement by the airlines or rail carriers - demonstrates an underlying demand for a better quality end-to-end travel experience. Blockchain solutions are being explored in a number of industries and offer, at first sight, an immutable, single source of truth for this data, avoiding data conflicts and misinterpretation. Combined with Smart Contracts, they seemingly offer a more robust and dynamic platform for multi-stakeholder ventures, and even perhaps the ability to join and leave consortia dynamically. Applying blockchain to the intermodal interlining space – termed Blocklining in this paper - is complex and multi-faceted because of the many aspects of cooperation outlined above. To explore its potential, this paper concentrates on one particular dimension, that of through baggage interlining.
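The "immutable, single source of truth" property rests on hash-linking each record to its predecessor. A toy sketch of such a baggage-event log (purely illustrative; real blocklining would involve a distributed ledger, consensus among stakeholders, and smart contracts, not a local list):

```python
import hashlib
import json

def add_block(chain, event):
    """Append a baggage-handling event to a hash-linked log."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    chain.append({
        "event": event,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })
    return chain

def verify(chain):
    # Any tampering with an earlier event breaks every later hash link
    prev = "0" * 64
    for block in chain:
        payload = json.dumps(
            {"event": block["event"], "prev": prev}, sort_keys=True
        )
        if block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev = block["hash"]
    return True
```

Because every party can recompute the chain, airlines, airports, and ground handlers would share one tamper-evident record of a bag's journey rather than reconciling separate databases.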

Keywords: aviation, baggage, blocklining, intermodal, interlining

Procedia PDF Downloads 130
8870 A Robust Optimization Model for the Single-Depot Capacitated Location-Routing Problem

Authors: Abdolsalam Ghaderi

Abstract:

In this paper, the single-depot capacitated location-routing problem under uncertainty is presented. The problem aims to find the optimal location of a single depot and the routing of vehicles to serve the customers when the parameters may change under different circumstances. This problem has many applications, especially in the area of supply chain management and distribution systems. To get closer to real-world situations, the travel time of vehicles, the fixed cost of vehicle usage, and customer demand are considered as sources of uncertainty. A combined approach including robust optimization and stochastic programming is presented to deal with the uncertainty in the problem at hand. For this purpose, a mixed integer programming model is developed and a heuristic algorithm based on Variable Neighborhood Search (VNS) is presented to solve the model. Finally, the computational results are presented and future research directions are discussed.
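The VNS heuristic mentioned above systematically switches between progressively larger neighborhoods, restarting from the smallest whenever an improvement is found. A generic, textbook-style skeleton (not the paper's specific neighborhoods or moves for the location-routing problem):

```python
import random

def vns(initial, neighborhoods, cost, max_iters=100, seed=0):
    """Generic Variable Neighborhood Search skeleton.

    neighborhoods: list of functions, each mapping (solution, rng) to a
    random neighbor in a progressively larger neighborhood.
    cost: objective function to minimize.
    """
    rng = random.Random(seed)
    best = initial
    for _ in range(max_iters):
        k = 0
        while k < len(neighborhoods):
            candidate = neighborhoods[k](best, rng)
            if cost(candidate) < cost(best):
                best = candidate  # improvement: restart from first neighborhood
                k = 0
            else:
                k += 1  # no improvement: try the next, larger neighborhood
    return best
```

For the location-routing problem, the neighborhoods would be moves such as relocating a customer between routes, swapping customers, or moving the depot.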

Keywords: location-routing problem, robust optimization, stochastic programming, variable neighborhood search

Procedia PDF Downloads 250
8869 DNA as an Instrument in Constructing Narratives and Justice in Criminal Investigations: A Socio-Epistemological Exploration

Authors: Aadita Chaudhury

Abstract:

Since at least the early 2000s, DNA profiling has achieved a preeminent status in forensic investigations of criminal acts. While the criminal justice system has a long history of using forensic evidence and testing it through established technoscientific means, the primacy of DNA in establishing 'truth' or reconstructing a series of events is unparalleled in the history of forensic science. This paper seeks to elucidate the ways in which DNA profiling has become the most authoritative instrument of 'truth' in criminal investigations, and how it is used in the legal process to ascertain culpability, create the notion of infallible evidence, and advance the search for justice. It is argued that DNA profiling has created a paradigm shift in how the legal system and the general public understand crime and culpability, but not without limitations. There are indications that even trace amounts of DNA evidence can point to causal links in a criminal investigation; however, there still remains much room for confusion and doubt within the narrative of a crime, even given empirical evidence. Many of the shortcomings of DNA-based forensic investigations are explored and evaluated with regard to claims about the authority of biological evidence and the implications for the public understanding of the elusive concepts of truth and justice in the present era. Public misinformation about forensic analysis processes can produce doubt or faith in the judgements rooted in them, depending on other variables presented at trial. The positivist understanding of forensic science shared by the majority of the population does not take into consideration that DNA evidence is far from definitive and can be used to support competing theories of culpability, to create doubt, and to deflect blame.

Keywords: DNA profiling, epistemology of forensic science, philosophy of forensic science, sociology of scientific knowledge

Procedia PDF Downloads 182
8868 Performance Analysis of Absorption Power Cycle under Different Source Temperatures

Authors: Kyoung Hoon Kim

Abstract:

The absorption power generation cycle based on the ammonia-water mixture has attracted much attention for efficient recovery of low-grade energy sources. In this paper, a thermodynamic performance analysis is carried out for a Kalina cycle using an ammonia-water mixture as the working fluid for efficient conversion of a low-temperature heat source in the form of sensible energy. The effects of the source temperature on the system performance are extensively investigated using thermodynamic models. The results show that the source temperature, as well as the ammonia mass fraction, greatly affects the thermodynamic performance of the cycle.

Keywords: ammonia-water mixture, Kalina cycle, low-grade heat source, source temperature

Procedia PDF Downloads 438
8867 Anxiety Caused by the Single Mode of Instruction in Multilingual Classrooms: The Case of African Language Learners

Authors: Stanle Madonsela

Abstract:

For learning to take place effectively, learners have to use language. Language becomes a critical tool by which to communicate, to express feelings, desires and thoughts, and most of all to learn. However, each individual’s capacity to use language is unique. In multilingual countries, classrooms usually comprise learners from different language backgrounds, and therefore the language used for teaching and learning requires rethinking. Interaction in the classroom, if done in a language that is understood by the learners, could maximise the outcomes of learning. This paper explores the extent to which the use of a single code becomes a source of anxiety to learners in multilingual classrooms in South African schools. It contends that a multilingual approach in the learning process should be explored in order to promote learner autonomy in the learning process.

Keywords: anxiety, classroom, foreign language teaching, multilingual

Procedia PDF Downloads 504
8866 Requirement Engineering Within Open Source Software Development: A Case Study

Authors: Kars Beek, Remco Groeneveld, Sjaak Brinkkemper

Abstract:

Although there is much literature available on requirement documentation in traditional software development, few studies have been conducted on this topic in open source software development. While open source software development is becoming more important, its development processes are often not as structured as corporate software development processes. Papers show that communities creating open source software often lack structure and documentation. However, most studies on this topic are ten or more years old. Therefore, this research was conducted to determine whether the lack of structure and documentation in requirement engineering is still the situation in these communities today. Three open source products were chosen as subjects for this research. The data for this research was gathered through interviews, observations, and analyses of feature proposals and issue tracking tools. In this paper, we present a comparison and an analysis of the different methods used for requirements documentation, to understand the current practices of requirements documentation in open source software development.

Keywords: case study, open source software, open source software development, requirement elicitation, requirement engineering

Procedia PDF Downloads 76
8865 Flexible and Color Tunable Inorganic Light Emitting Diode Array for High Resolution Optogenetic Devices

Authors: Keundong Lee, Dongha Yoo, Youngbin Tchoe, Gyu-Chul Yi

Abstract:

A light emitting diode (LED) array is an ideal optical stimulation tool for optogenetics, which controls the inhibition and excitation of specific neurons with light-sensitive ion channels or pumps. Although a fiber-optic cable with an external light source, either a laser or an LED mechanically connected to the end of the cable, has widely been used for illuminating neural tissue, a new approach using micro LEDs (µLEDs) has recently been demonstrated. The LEDs can be placed directly either on the cortical surface or within the deep brain using a penetrating depth probe. Accordingly, this method would not require a permanent opening in the skull if the LEDs are integrated with a miniature electrical power source and wireless communication. In addition, the generation of multiple colors from a single µLED cell would make it possible to excite and/or inhibit neurons in localized regions. Here, we demonstrate flexible and color-tunable µLEDs for optogenetic device applications. The flexible and color-tunable LEDs were fabricated using multifaceted gallium nitride (GaN) nanorod arrays, with InxGa1−xN/GaN single quantum well structures (SQWs) anisotropically formed on the nanorod tips and sidewalls. To obtain various electroluminescence (EL) colors, current injection paths were controlled through a continuous p-GaN layer depending on the applied bias voltage. The electric current was injected through regions of different thickness and composition, thus changing the color of the light the LED emits from red to blue. We believe that flexible and color-tunable µLEDs will enable us to control the activities of neurons by emitting various colors from a single µLED cell.

Keywords: light emitting diode, optogenetics, graphene, flexible optoelectronics

Procedia PDF Downloads 198
8864 UNIX Source Code Leak: Evaluation and Feasible Solutions

Authors: Gu Dongxing, Li Yuxuan, Nong Tengxiao, Burra Venkata Durga Kumar

Abstract:

Since computers are widely used in business, more and more companies choose to store important information on computers to improve productivity. However, this information can be compromised in many cases, such as when it is stored locally on the company's computers or when it is transferred between servers and clients. Of these information leaks, source code leaks are probably the most costly. Because the source code often represents the core technology of a company, especially for Internet companies, source code leakage may cause the company's core products to lose market competitiveness and may even lead to the company's bankruptcy. In recent years, large companies such as Microsoft and AMD have experienced source code leakage events and suffered huge losses. This underscores the importance and necessity of preventing source code leakage. This paper aims to find ways to prevent source code leakage from the perspective of the operating system and, based on the fact that most companies use Linux or Linux-like systems to realize the interconnection between server and client, discusses how to reduce the possibility of source code leakage during data transmission.

Keywords: data transmission, Linux, source code, operating system

Procedia PDF Downloads 237
8863 The Analysis of Deceptive and Truthful Speech: A Computational Linguistic Based Method

Authors: Seham El Kareh, Miramar Etman

Abstract:

Recently, detecting liars and extracting the features that distinguish them from truth-tellers have been the focus of a wide range of disciplines. To the authors' best knowledge, most of the work has been done on facial expressions and body gestures, while only a few works have addressed the language used by liars and truth-tellers. This paper sheds light on four axes. The first axis concerns building an audio corpus of deceptive and truthful speech from Egyptian Arabic speakers. The second axis focuses on examining human perception of lies and establishing the need for computational linguistic methods to extract the features that characterize truthful and deceptive speech. The third axis is concerned with building a linguistic analysis program that can extract from the corpus the inter- and intra-linguistic cues of deceptive and truthful speech. The program built here is based on selected categories from the Linguistic Inquiry and Word Count (LIWC) program. Our results demonstrated that, when lying, Egyptian Arabic speakers preferred first-person pronouns and the present tense over the past tense, and their lies lacked second-person pronouns; when telling the truth, they preferred verbs related to motion and nouns related to time. The results also showed that a larger dataset is needed to establish the significance of words related to emotions and numbers.
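The category-based counting at the heart of such an analysis can be illustrated with a minimal sketch; the word lists below are short illustrative stand-ins, not the actual LIWC dictionaries, and the function names are ours:

```python
import re

# Illustrative stand-in word lists; the real LIWC program uses
# much larger, validated dictionaries.
CATEGORIES = {
    "first_person": {"i", "me", "my", "mine", "we", "us", "our"},
    "second_person": {"you", "your", "yours"},
    "motion_verbs": {"go", "went", "run", "walk", "move", "arrive"},
}

def category_counts(text):
    """Count how many tokens from each category appear in the text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return {name: sum(t in words for t in tokens)
            for name, words in CATEGORIES.items()}
```

Per-category counts are then typically normalized by the total token count before comparing deceptive and truthful recordings.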

Keywords: Egyptian Arabic corpus, computational analysis, deceptive features, forensic linguistics, human perception, truthful features

Procedia PDF Downloads 187
8862 Replacing MOSFETs with Single Electron Transistors (SET) to Reduce Power Consumption of an Inverter Circuit

Authors: Ahmed Shariful Alam, Abu Hena M. Mustafa Kamal, M. Abdul Rahman, M. Nasmus Sakib Khan Shabbir, Atiqul Islam

Abstract:

According to the rules of quantum mechanics, there is a non-vanishing probability for an electron to tunnel through a thin insulating barrier or a thin capacitor, which is not possible according to the laws of classical physics. Tunneling of an electron through a thin insulating barrier or tunnel junction is a random event, and the magnitude of the current flowing due to electron tunneling is very low. As the current flowing through a Single Electron Transistor (SET) is the result of electron tunneling through the tunnel junctions at its source and drain, the supply voltage requirement is also very low. As a result, the power consumption of a Single Electron Transistor is ultra-low in comparison to that of a MOSFET. In this paper, PSPICE simulations have been performed for an inverter built with both SETs and MOSFETs. A supply voltage of 35 mV was used for the SET-based inverter circuit, while the CMOS inverter used a supply voltage of 3.5 V.
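The power advantage implied by the two supply voltages can be checked with a back-of-the-envelope calculation: for a comparable load and switching frequency, dynamic power scales with the square of the supply voltage, so the 35 mV SET inverter should dissipate on the order of four orders of magnitude less power than the 3.5 V CMOS inverter. This is a rough scaling argument, not a substitute for the PSPICE results:

```python
# Rough scaling: dynamic switching power ~ C * V^2 * f,
# so for equal load capacitance C and frequency f the power
# ratio depends only on the supply voltages squared.
V_SET = 0.035   # supply voltage of the SET-based inverter, volts
V_CMOS = 3.5    # supply voltage of the CMOS inverter, volts

power_ratio = (V_CMOS / V_SET) ** 2  # ~1e4, i.e. four orders of magnitude
```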

Keywords: ITRS, enhancement type MOSFET, island, DC analysis, transient analysis, power consumption, background charge co-tunneling

Procedia PDF Downloads 505
8861 Study on Acoustic Source Detection Performance Improvement of Microphone Array Installed on Drones Using Blind Source Separation

Authors: Youngsun Moon, Yeong-Ju Go, Jong-Soo Choi

Abstract:

Most drones that currently carry out surveillance/reconnaissance missions are equipped with optical equipment, but a microphone array is also needed to estimate the location of an acoustic source; this can provide additional information in the absence of optical equipment. The purpose of this study is to estimate the Direction of Arrival (DOA) of an acoustic source based on Time Difference of Arrival (TDOA) estimation on the drone. The problem is that the target acoustic source cannot be measured cleanly because of the drone's own noise. To overcome this, the drone noise and the target acoustic source are separated using Blind Source Separation (BSS) based on Independent Component Analysis (ICA). ICA can be performed assuming that the drone noise and the target acoustic source are independent and that each signal is non-Gaussian. To maximize the non-Gaussianity of each signal, we use negentropy and kurtosis from probability theory. As a result, we can improve TDOA and DOA estimation of the target source in a noisy environment. We simulated the performance of the DOA algorithm with the BSS algorithm applied and validated the simulation through an experiment in an anechoic wind tunnel.
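The non-Gaussianity criterion used above can be sketched with excess kurtosis, which is near zero for a Gaussian signal and nonzero for most non-Gaussian ones. A minimal NumPy illustration (not the paper's implementation) comparing three signal distributions:

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3; ~0 for Gaussian data,
    positive for super-Gaussian and negative for sub-Gaussian signals."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

rng = np.random.default_rng(0)
gaussian = rng.normal(size=100_000)   # excess kurtosis ~ 0
laplace = rng.laplace(size=100_000)   # super-Gaussian, ~ +3
uniform = rng.uniform(size=100_000)   # sub-Gaussian, ~ -1.2
```

ICA exploits exactly this: a mixture of independent sources is closer to Gaussian than the sources themselves, so maximizing |kurtosis| (or negentropy) of the unmixed outputs recovers the individual sources.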

Keywords: aeroacoustics, acoustic source detection, time difference of arrival, direction of arrival, blind source separation, independent component analysis, drone

Procedia PDF Downloads 143
8860 Single-Cell Visualization with Minimum Volume Embedding

Authors: Zhenqiu Liu

Abstract:

Visualizing the heterogeneity within cell populations in single-cell RNA-seq data is crucial for studying the functional diversity of cells. However, because of the high level of noise, outliers, and dropouts, it is very challenging to measure cell-to-cell similarity (distance) and to visualize and cluster the data in a low dimension. Minimum volume embedding (MVE) projects the data into a lower-dimensional space and is a promising tool for data visualization. However, it is computationally inefficient to solve a semi-definite program (SDP) when the sample size is large, so MVE is not applicable to single-cell RNA-seq data with thousands of samples. In this paper, we develop an efficient algorithm based on an accelerated proximal gradient method and visualize single-cell RNA-seq data efficiently. We demonstrate that the proposed approach separates known subpopulations in single-cell data sets more accurately than other existing dimension reduction methods.
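The accelerated proximal gradient scheme referred to above can be illustrated on a small l1-regularized least-squares problem. This is a generic FISTA sketch under standard assumptions, not the authors' MVE solver:

```python
import numpy as np

def fista(A, b, lam, n_iter=200):
    """Accelerated proximal gradient (FISTA) for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        z = y - grad / L                   # gradient step at the momentum point
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold prox
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)              # momentum extrapolation
        x, t = x_new, t_new
    return x
```

The momentum extrapolation is what upgrades the O(1/k) rate of plain proximal gradient to O(1/k²); the same template applies when the proximal operator projects onto the feasible set of the relaxed embedding problem instead of soft-thresholding.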

Keywords: single-cell RNA-seq, minimum volume embedding, visualization, accelerated proximal gradient method

Procedia PDF Downloads 208
8859 Rapid Monitoring of Earthquake Damages Using Optical and SAR Data

Authors: Saeid Gharechelou, Ryutaro Tateishi

Abstract:

Earthquakes are inevitable catastrophic natural disasters. Damage to buildings and man-made structures, where most human activities occur, is the major cause of earthquake casualties. A comparison of optical and SAR data is presented for the Kathmandu valley, which was severely shaken by the 2015 Nepal earthquake. Although many existing studies have estimated damage from optical data or suggested the combined use of optical and SAR data for improved accuracy, cloud-free optical images are not assured when urgently needed. Therefore, this research focuses on developing a SAR-based technique targeting rapid and accurate geospatial reporting. Considering the limited time available in a post-disaster situation, the method offers quick computation based exclusively on two pairs of pre-seismic and co-seismic single look complex (SLC) images. The InSAR coherence of pre-seismic, co-seismic, and post-seismic pairs was used to detect changes in the damaged area. In addition, ground truth data collected in the field were applied to the optical data in a random forest classification for damaged-area detection and were also used to assess the accuracy of the supervised classification approach. A higher accuracy was obtained from the optical data than from the integrated optical-SAR data; however, since the availability of cloud-free images immediately after an earthquake event is not assured, further research on improving SAR-based damage detection is suggested. It is expected that quick reporting of the post-disaster damage situation, quantified by rapid earthquake assessment, will assist in channeling rescue and emergency operations and in informing the public about the scale of damage.
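The InSAR coherence used for change detection is the normalized complex cross-correlation of two co-registered SLC image patches; a minimal sketch (the function and variable names are ours):

```python
import numpy as np

def insar_coherence(s1, s2):
    """Coherence magnitude of two co-registered complex SAR patches:
    |sum(s1 * conj(s2))| / sqrt(sum|s1|^2 * sum|s2|^2), in [0, 1]."""
    num = np.abs(np.sum(s1 * np.conj(s2)))
    den = np.sqrt(np.sum(np.abs(s1) ** 2) * np.sum(np.abs(s2) ** 2))
    return num / den
```

Coherence close to 1 indicates stable scatterers between acquisitions; a drop in co-seismic coherence relative to the pre-seismic pair flags candidate damaged areas.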

Keywords: Sentinel-1A data, Landsat-8, earthquake damage, InSAR, rapid damage monitoring, 2015-Nepal earthquake

Procedia PDF Downloads 147
8858 Numerical Modeling the Cavitating Flow in Injection Nozzle Holes

Authors: Ridha Zgolli, Hatem Kanfoudi

Abstract:

Cavitating flows inside a diesel injection nozzle hole were simulated using a mixture model. A 2D numerical model is proposed in this paper to simulate steady cavitating flows. The Reynolds-averaged Navier-Stokes equations are solved for the liquid-vapor mixture, which is treated as a single fluid with a variable density expressed as a function of the vapor volume fraction. The closure of this variable is provided by a transport equation with a source term (TEM). The processes of evaporation and condensation are governed by changes in pressure within the flow. The source term is implemented in the CFD code ANSYS CFX. The influence of the numerical and physical parameters is presented in detail. The numerical simulations are in good agreement with the experimental data for steady flow.
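The variable mixture density above is commonly written as a volume-fraction-weighted average of the vapor and liquid densities. A small sketch with illustrative diesel-like values; the property values are assumptions for illustration, not taken from the paper:

```python
# Mixture density as a function of vapor volume fraction alpha:
#   rho_mix = alpha * rho_vapor + (1 - alpha) * rho_liquid
RHO_LIQUID = 830.0   # kg/m^3, illustrative diesel fuel density
RHO_VAPOR = 0.1      # kg/m^3, illustrative fuel vapor density

def mixture_density(alpha):
    """Density of the liquid-vapor mixture for vapor volume fraction alpha in [0, 1]."""
    return alpha * RHO_VAPOR + (1.0 - alpha) * RHO_LIQUID
```

As alpha sweeps from 0 (pure liquid) to 1 (pure vapor), the mixture density drops by nearly four orders of magnitude, which is why the transport equation for alpha drives the density field in the RANS solution.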

Keywords: cavitation, injection nozzle, numerical simulation, k–ω

Procedia PDF Downloads 376
8857 A Research Analysis on the Source Technology and Convergence Types

Authors: Kwounghee Choi

Abstract:

Technological convergence between various sectors is expected to have a very large impact on future industry and the economy. This study takes an empirical approach to the classification of specific technologies. For technological convergence classification, it is necessary to set the target technology to be analyzed. This study selected target technologies from the national research and development plan. First, we identified a source technology for analysis. Depending on the weight of the source technology, NT-based, BT-based, IT-based, ET-based, and CS-based convergence types were classified. This study aims to empirically demonstrate the concept of convergence technology and convergence types. Using the source technology to classify convergence types will be useful for making practical strategies for convergence technology.

Keywords: technology convergence, source technology, convergence type, R&D strategy, technology classification

Procedia PDF Downloads 459
8856 Number of Parameters of Anantharam's Model with Single-Input Single-Output Case

Authors: Kazuyoshi Mori

Abstract:

In this paper, we consider the parametrization of Anantharam’s model within the framework of the factorization approach. In the parametrization, we investigate the number of parameters required by Anantharam’s model, restricting attention to single-input single-output systems. By this investigation, we find three cases: (1) there exist plants which require only one parameter, (2) there exist plants which require two parameters, and (3) the number of parameters is at most three.

Keywords: linear systems, parametrization, coprime factorization, number of parameters

Procedia PDF Downloads 193
8855 Simulation-Based Unmanned Surface Vehicle Design Using PX4 and Robot Operating System With Kubernetes and Cloud-Native Tooling

Authors: Norbert Szulc, Jakub Wilk, Franciszek Górski

Abstract:

This paper presents an approach for simulating and testing PX4-based robotic systems using a local Kubernetes cluster. The approach leverages modern cloud-native tools and runs on single-board computers. Additionally, this solution enables the creation of datasets for computer vision and the end-to-end evaluation of control system algorithms. The paper compares this approach to the commonly used Docker-based method. The approach was used to develop a simulation environment for an unmanned surface vehicle (USV) for RoboBoat 2023 by running a containerized configuration of the PX4 open-source autopilot connected to ROS and the Gazebo simulation environment.

Keywords: cloud computing, Kubernetes, single board computers, simulation, ROS

Procedia PDF Downloads 56
8854 Electrodermal Activity Measurement Using Constant Current AC Source

Authors: Cristian Chacha, David Asiain, Jesús Ponce de León, José Ramón Beltrán

Abstract:

This work explores and characterizes the behavior of the AFE AD5941 in impedance measurement using an embedded algorithm with a constant current AC source. The main aim of this research is to improve the accuracy of impedance measurements for application in EDA-focused wearable devices. Through a comprehensive study and characterization, it has been observed that employing a measurement sequence with a constant current source produces results with increased dispersion but higher accuracy. As a result, this approach leads to a more accurate system for impedance measurement.
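With a constant current AC excitation, the complex impedance follows directly from the measured voltage phasor, Z = V / I. A minimal sketch; the current amplitude below is an assumed example value, not the AD5941 configuration:

```python
import cmath

I_SOURCE = 10e-6  # amperes, assumed constant AC excitation current amplitude

def impedance(v_measured):
    """Complex impedance from the measured voltage phasor at the
    known constant excitation current: Z = V / I."""
    return v_measured / I_SOURCE

# Example: a 1.0 mV voltage phasor leading the current by 30 degrees
z = impedance(cmath.rect(1e-3, cmath.pi / 6))
```

Because the current is held constant, any error in the voltage measurement maps linearly onto the impedance estimate, which is why the constant-current sequence trades some dispersion for accuracy.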

Keywords: EDA, constant current AC source, wearable, precision, accuracy, impedance

Procedia PDF Downloads 80
8853 Cognitive Methods for Detecting Deception During the Criminal Investigation Process

Authors: Laid Fekih

Abstract:

Background: It is difficult to detect lying, deception, and misrepresentation just by observing verbal or non-verbal expression during the criminal investigation process, despite the common belief that it is possible to tell whether a person is lying or telling the truth simply from the way they act or behave. The process of detecting lies and deception during criminal investigation needs more study and research to overcome the difficulties facing investigators. Method: The present study aimed to assess the effectiveness of cognitive methods and techniques in detecting deception during criminal investigation. It adopted a quasi-experimental method and covered a sample of 20 defendants distributed randomly into two homogeneous groups: an experimental group of 10 defendants subjected to criminal investigation with cognitive deception-detection techniques applied, and a second group of 10 defendants subjected to the direct investigation method. The tool used was a guided interview based on models of investigative questions following the cognitive deception-detection approach, which consists of three of Vrij's techniques (imposing cognitive load, encouraging the interviewee to provide more information, and asking unexpected questions), alongside the direct investigation method. Results: The results revealed a significant difference between the two groups in terms of lie-detection accuracy in favour of the defendants subjected to investigation with the cognitive techniques; the cognitive deception-detection approach produced superior total accuracy rates both with human observers and through an analysis of objective criteria. The cognitive approach achieved superior accuracy (truth detection: 71%; deception detection: 70%) compared to the direct investigation method (truth detection: 52%; deception detection: 49%).
Conclusion: The study recommends the cognitive approach: practitioners who use cognitive deception-detection techniques will correctly classify more individuals than those who use the direct investigation method.

Keywords: the cognitive lie detection approach, deception, criminal investigation, mental health

Procedia PDF Downloads 48