Search results for: applications of big data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29091

28101 The Digital Unconscious: Exploring AI Potential to Decode the Human Subconscious

Authors: Khader I. Alkhouri

Abstract:

This paper explores the emerging intersection of artificial intelligence (AI) and subconscious research, examining how AI technologies may revolutionize our understanding of the human mind. We review key AI techniques being applied to decode subconscious processes, discuss potential applications and breakthroughs, and consider the ethical implications and societal impacts of this rapidly advancing field. By leveraging AI's powerful pattern recognition and data analysis capabilities, researchers aim to gain unprecedented insights into implicit memory, unconscious bias, and automatic behaviors. While promising, this research also raises important questions about cognitive privacy and the responsible development of these technologies.

Keywords: artificial intelligence, machine learning, neuroethics, psychological research, subconscious

Procedia PDF Downloads 10
28100 Improved Hash Value Based Stream Cipher Using Delayed Feedback with Carry Shift Register

Authors: K. K. Soundra Pandian, Bhupendra Gupta

Abstract:

In the modern era, application data are massive and complex and need to be secured against adversary attacks. In this context, a non-recursive key-based integrated spritz stream cipher with a circulant hash function using a delayed feedback with carry shift register (d-FCSR) is proposed in this paper. The novelty of the proposed stream cipher algorithm lies in generating an improved keystream using the d-FCSR. The proposed algorithm is coded in Verilog HDL to produce a dynamic binary keystream and implemented on the commercially available FPGA device Virtex-5 xc5vlx110t-2ff1136. The implementation of the stream cipher using d-FCSR on the FPGA device operates at a maximum frequency of 60.62 MHz, achieves a data throughput of 492 Mbps, and improves on existing techniques in terms of efficiency (throughput/area). This paper also briefly presents the cryptanalysis of the proposed circulant hash value based spritz stream cipher using d-FCSR against adversary attacks on a hardware platform for hardware-based cryptography applications.
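
As a point of reference, the following is a minimal Python sketch of a plain (non-delayed) Fibonacci feedback-with-carry shift register keystream generator; it is not the authors' d-FCSR or their spritz/circulant-hash construction, and the tap values, seed, and carry below are arbitrary illustrative choices.

```python
# Generic Fibonacci FCSR: the feedback sum keeps an integer carry (memory),
# unlike an LFSR. Taps and seed are illustrative only.
def fcsr_keystream(taps, state, carry, nbits):
    """taps/state: lists of 0/1 bits (same length), carry: small integer memory."""
    out = []
    for _ in range(nbits):
        sigma = carry + sum(t * s for t, s in zip(taps, state))
        bit = sigma & 1              # new cell value and output bit
        carry = sigma >> 1           # integer carry kept for the next step
        out.append(bit)
        state = [bit] + state[:-1]   # shift the register
    return out

print(fcsr_keystream(taps=[1, 0, 1, 1], state=[1, 0, 0, 1], carry=0, nbits=16))
```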

Keywords: cryptography, circulant function, field programmable gated array, hash value, spritz stream cipher

Procedia PDF Downloads 234
28099 Automatic and Highly Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, hence imprecise, and moreover too slow to be computed efficiently. These models might therefore not be applicable to the numerical optimization of real systems, since such techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general, since it generates models for every system detached from the scientific background. Additionally, it can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables, not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and the explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. Herewith, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g., time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has been tested successfully in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent. Moreover, the automatic identification of correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise, real-time-adaptive, data-based models for different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model and optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
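
To illustrate the kind of product-term regression described here, the minimal Python sketch below fits a second-order expansion with a cross term to synthetic sensor data by least squares; the data, the chosen expansion order, and the coefficients are hypothetical and do not represent the authors' algorithm or the WORHP solver.

```python
# Data-driven model with product (cross) terms, in the spirit of a truncated
# series expansion fitted to measured data.
import numpy as np

rng = np.random.default_rng(0)
x1, x2 = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
y = 2.0 + 0.5 * x1 - 1.5 * x2 + 3.0 * x1 * x2 + rng.normal(0, 0.01, 200)

# Design matrix of a second-order expansion: 1, x1, x2, x1*x2, x1^2, x2^2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coeffs, 3))  # the x1*x2 coefficient reveals the product correlation
```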

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 384
28098 Mobile App Architecture in 2023: Build Your Own Mobile App

Authors: Mounir Filali

Abstract:

Companies use many innovative ways to reach their customers and stay ahead of the competition. Along with the growing demand for innovative business solutions comes a demand for new technology. The most noticeable area of demand for business innovation is the mobile application industry. Recently, companies have recognized the growing need to integrate proprietary mobile applications into their suite of services; they have realized that developing mobile apps gives them a competitive edge. As a result, many have begun to rapidly develop mobile apps to stay ahead of the competition. Mobile application development helps companies meet the needs of their customers. Mobile apps also help businesses take advantage of every potential opportunity to generate leads that convert into sales. App download statistics show that, with the recent rise in demand for business-related mobile apps, there has been a similar rise in the range of mobile app solutions on offer. Today, companies can use the traditional route of a software development team to build their own mobile applications. However, there are also many platform-ready "low-code and no-code" mobile app builders to choose from. These development options offer more streamlined business processes and help companies be more responsive to their customers without having to be coding experts. Companies must have a basic understanding of mobile app architecture to attract and maintain the interest of mobile app users. Mobile application architecture refers to the building blocks, structural systems, and design elements that make up a mobile application. It also includes the technologies, processes, and components used during application development. All elements of the mobile application architecture together form the underlying foundation of an application; developing a good mobile app architecture requires proper planning and strategic design. The technology framework or platform on the back end and the user-facing side of a mobile application are both part of the application's mobile architecture. In application development, software programmers loosely refer to this set of mobile architecture systems and processes as the "technology stack".

Keywords: mobile applications, development, architecture, technology

Procedia PDF Downloads 76
28097 Estimating Leaf Area and Biomass of Wheat Using UAS Multispectral Remote Sensing

Authors: Jackson Parker Galvan, Wenxuan Guo

Abstract:

Unmanned aerial vehicle (UAV) technology is being increasingly adopted in high-throughput plant phenotyping for applications in plant breeding and precision agriculture. Winter wheat is an important cover crop for reducing soil erosion and protecting the environment in the Southern High Plains. Efficiently quantifying plant leaf area and biomass provides critical information for producers to practice site-specific management of crop inputs, such as water and fertilizers. The objective of this study was to estimate wheat biomass and leaf area index using UAV images. The study was conducted in an irrigated field in Garza County, Texas. High-resolution images were acquired on three dates (February 18, March 25, and May 15) using a multispectral sensor onboard a Matrice 600 UAV. On each date of image acquisition, 10 random plant samples were collected and measured for biomass and leaf area. Images were stitched using Pix4D, and ArcGIS was used to overlay the sampling locations and derive data for them.
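
As a simplified illustration of the intended analysis step, the sketch below derives NDVI at a few hypothetical sampling points and fits a linear calibration to equally hypothetical biomass measurements; it is not the study's Pix4D/ArcGIS workflow or its actual field data.

```python
# Relate a vegetation index from multispectral bands to field-sampled biomass.
import numpy as np

red = np.array([0.12, 0.10, 0.15, 0.09, 0.11])       # reflectance at sample points
nir = np.array([0.45, 0.52, 0.38, 0.58, 0.49])
biomass = np.array([210., 260., 170., 300., 240.])   # g/m^2, hypothetical samples

ndvi = (nir - red) / (nir + red)
slope, intercept = np.polyfit(ndvi, biomass, 1)      # simple linear calibration
print(f"biomass ≈ {slope:.1f} * NDVI + {intercept:.1f}")
```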

Keywords: precision agriculture, UAV plant phenotyping, biomass, leaf area index, winter wheat, southern high plains

Procedia PDF Downloads 82
28096 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500

Authors: Mustafa Elfituri, Jonathan Cook

Abstract:

Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology to model many types of relations between independent objects. They are being actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties such as irregularity and poor locality that make their performance characteristics different from those of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations; it has highly irregular data access and is driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared-memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect its performance in order to identify possible changes that would improve it. The results are discussed in relation to which factors contribute to performance degradation.
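
For context, the Graph500 kernel is essentially a breadth-first search that builds a parent tree; the serial Python sketch below illustrates the irregular, data-dependent memory accesses discussed above and is not the OpenMP, MPI, or hybrid code analyzed in the paper.

```python
# Level-synchronous BFS producing a parent array, as in the Graph500 search kernel.
from collections import deque

def bfs_parents(adj, root):
    """adj: dict mapping vertex -> list of neighbours."""
    parent = {root: root}
    frontier = deque([root])
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:              # neighbour lists give poor locality
            if v not in parent:
                parent[v] = u
                frontier.append(v)
    return parent

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(bfs_parents(adj, root=0))
```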

Keywords: graph computation, graph500 benchmark, parallel architectures, parallel programming, workload characterization

Procedia PDF Downloads 127
28095 Cross Project Software Fault Prediction at Design Phase

Authors: Pradeep Singh, Shrish Verma

Abstract:

Software fault prediction models are created using the source code, metrics processed from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such companies, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data are available. We analyze seven data sets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the cross-project results are comparable to learning from within-company data.
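
A minimal sketch of the cross-project setup, assuming small hypothetical design-metric tables in place of the NASA Metrics Data Program sets: train Naïve Bayes on one project's modules and test on another's.

```python
# Cross-project fault prediction: fit on project A, evaluate on project B.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# columns: e.g. fan-in, fan-out, cyclomatic complexity (design-level metrics)
X_train = np.array([[2, 3, 4], [1, 1, 2], [5, 6, 9], [0, 2, 1], [4, 5, 7]])
y_train = np.array([1, 0, 1, 0, 1])          # 1 = faulty module (project A)

X_test = np.array([[3, 2, 5], [1, 0, 1]])    # project B modules
y_test = np.array([1, 0])

model = GaussianNB().fit(X_train, y_train)
print("cross-project accuracy:", accuracy_score(y_test, model.predict(X_test)))
```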

Keywords: software metrics, fault prediction, cross project, within project

Procedia PDF Downloads 324
28094 Rare-Earth Ions Doped Lithium Niobate Crystals: Luminescence and Raman Spectroscopy

Authors: Ninel Kokanyan, Edvard Kokanyan, Anush Movsesyan, Marc D. Fontana

Abstract:

Lithium niobate (LN) is one of the most widely used ferroelectrics, with a wide range of applications such as phase conjugation, holographic storage, frequency doubling, and SAW sensors. Furthermore, the possibility of doping with rare-earth ions leads to new laser applications. Ho and Tm dopants are of particular interest due to the laser emission obtained at around 2 µm. Raman spectroscopy is a powerful spectroscopic technique providing a wealth of information about the physicochemical and optical properties of a given material. Polarized Raman measurements were carried out on Ho- and Tm-doped LN crystals with excitation wavelengths of 532 nm and 785 nm. In the obtained anti-Stokes Raman spectra, we detect the modes expected according to the Raman selection rules. In contrast, the Stokes Raman spectra are significantly different from what is expected from the selection rules: additional forbidden lines are detected. These lines have quite high intensity and are well defined. Moreover, the intensity of these additional lines increases with the Ho or Tm concentration in the crystal. The additional lines are attributed to emission lines reflecting the photoluminescence spectra of these crystals. This means that, within a very good resolution, we were able to detect in the same Stokes spectrum both the transitions between electronic states and the vibrational states. The analysis of these data is reported as a function of Ho and Tm content for different polarizations and wavelengths of the incident laser beam. The results also highlight additional information about the π and σ polarizations of the crystals under study.

Keywords: lithium niobate, Raman spectroscopy, luminescence, rare-earth ions doped lithium niobate

Procedia PDF Downloads 203
28093 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features

Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova

Abstract:

Emotion recognition is a challenging problem that is still open from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used to build an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and the validity and expressiveness of different emotions are discussed. A comparison is made between classifiers built from facial data only, from voice data only, and from the combination of both. The need for a better combination of the information from facial expressions and voice data is argued.
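
A minimal sketch of the comparison described, with randomly generated placeholder features and balanced synthetic labels instead of the paper's video recordings: separate SVMs for voice-only and face-only features versus their concatenation.

```python
# Compare SVM classifiers on voice features, face features, and both combined.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
voice = rng.normal(size=(120, 20))         # 20 acoustic features per sample
face = rng.normal(size=(120, 30))          # 30 facial features per sample
labels = np.repeat(np.arange(6), 20)       # six emotion classes, balanced

for name, X in [("voice", voice), ("face", face),
                ("combined", np.hstack([voice, face]))]:
    score = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean()
    print(f"{name:9s} mean CV accuracy: {score:.2f}")
```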

Keywords: emotion recognition, facial recognition, signal processing, machine learning

Procedia PDF Downloads 300
28092 Cryptosystems in Asymmetric Cryptography for Securing Data on Cloud at Various Critical Levels

Authors: Sartaj Singh, Amar Singh, Ashok Sharma, Sandeep Kaur

Abstract:

With upcoming threats in the digital world, we need to work continuously on security in all its aspects, from hardware to software as well as data modelling. The rise in social media activity and the hunger for data of various entities lead to cybercrime and more attacks on the privacy and security of individuals. Cryptography has always been employed to prevent access to important data, using many different processes. Symmetric-key and asymmetric-key cryptography have been used to keep data secret both at rest and in transmission. Various cryptosystems have evolved over time to make data more secure. In this research article, we study various cryptosystems in asymmetric cryptography and their applications and usefulness, with particular emphasis on elliptic curve cryptography and the algebraic mathematics involved.
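
To illustrate the algebraic machinery behind elliptic curve cryptography, the sketch below implements point addition and double-and-add scalar multiplication over a deliberately tiny, insecure toy curve; the curve parameters and base point are illustrative assumptions only.

```python
# Toy elliptic curve arithmetic over a small prime field (not a secure curve).
p, a, b = 97, 2, 3                      # curve: y^2 = x^3 + 2x + 3 over F_97

def add(P, Q):
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                     # point at infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def scalar_mul(k, P):                   # double-and-add
    R = None
    while k:
        if k & 1: R = add(R, P)
        P, k = add(P, P), k >> 1
    return R

G = (3, 6)                              # 6^2 = 36 = 27 + 6 + 3 mod 97, so G is on the curve
print(scalar_mul(5, G))                 # "public key" for private scalar 5
```

Computing scalar_mul(k, G) is fast, while recovering k from the result (the elliptic curve discrete logarithm) is hard at realistic key sizes, which is why ECC offers compact keys.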

Keywords: cryptography, symmetric key cryptography, asymmetric key cryptography

Procedia PDF Downloads 101
28091 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process

Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum

Abstract:

Due to the increasing demand for quality assurance and reliability in additive manufacturing, the development of an advanced in-situ monitoring system is required to monitor process anomalies as input for further process control. Optical-based monitoring systems, such as CMOS cameras and NIR cameras, have proved to be effective ways to monitor geometrical distortion and abnormal thermal distribution. Therefore, many studies and applications focus on the availability of optical-based monitoring systems for detecting various types of defects. However, the capability of the monitoring setup is not quantified. In this study, a quantification model to evaluate the capability of monitoring setups for the LPBF machine, based on acquired monitoring data of a designed test artifact, is presented, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated based on its hardware properties, the location of its integration, and the light conditions. The data-processing methodology used to quantify the capability for each aspect is discussed. The minimal detectable size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model quantifies the monitoring system's performance, makes it possible to evaluate monitoring systems with the same concept but different setups for the LPBF process, and provides directions for improving the setups.
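
One small piece of such a quantification can be sketched as follows: estimating the minimal detectable feature size from an optical setup's field of view, pixel count, and a minimum-pixel detection criterion. The numbers and the three-pixel rule below are hypothetical assumptions, not the paper's model.

```python
# Minimal detectable size from ground sampling distance and a pixel criterion.
def min_detectable_size(fov_mm, pixels, min_pixels=3):
    """Ground sampling distance times the pixels needed for reliable detection."""
    gsd = fov_mm / pixels              # mm per pixel on the build plate
    return gsd * min_pixels

print(f"{min_detectable_size(fov_mm=100.0, pixels=2048):.3f} mm")
```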

Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact

Procedia PDF Downloads 182
28090 Theoretical Investigation of Structural and Electronic Properties of AlBi

Authors: S. Louhibi-Fasla, H. Achour, B. Amrani

Abstract:

The purpose of this work is to provide some additional information to the existing data on the physical properties of AlBi using the state-of-the-art first-principles full-potential linearized augmented plane wave (FP-LAPW) method. In addition to the structural properties, the electronic properties have also been investigated. The volume, the bulk modulus, the variation of the thermal expansion α, and the Debye temperature are successfully obtained over the whole pressure range from 0 to 30 GPa and temperature range from 0 to 1200 K. These quantities are the basis of solid-state science and industrial applications, and their study is important for extending our knowledge of the specific behaviour of the material under the severe constraints of high-pressure and high-temperature environments.

Keywords: AlBi, FP-LAPW, structural properties, electronic properties

Procedia PDF Downloads 366
28089 Data Recording for Remote Monitoring of Autonomous Vehicles

Authors: Rong-Terng Juang

Abstract:

Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars might not arrive in the near future. To speed the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks for remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space; upon arrival, new controller area network (CAN) data are mapped onto a two-dimensional time-data space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole set of data is encoded using existing algorithms such as Huffman, arithmetic, and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house built autonomous vehicle. The testing results show that the amount of data can be reduced to as little as 1/7 of the raw data.
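
A minimal sketch of two of the ideas in sequence, differential (delta) sampling followed by Huffman coding, applied to a hypothetical CAN signal; it is not the paper's full pipeline or its codebook/arithmetic variants.

```python
# Delta-encode a signal, then estimate the Huffman-coded size of the deltas.
import heapq
from collections import Counter

signal = [100, 100, 101, 101, 102, 102, 102, 101, 100, 100]
deltas = [signal[0]] + [b - a for a, b in zip(signal, signal[1:])]  # first value kept absolute

def huffman_lengths(symbols):
    """Return a code length per symbol (enough to estimate compressed size)."""
    heap = [[w, [s, 0]] for s, w in Counter(symbols).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:] + hi[1:]:
            pair[1] += 1                      # one level deeper in the code tree
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return {s: max(l, 1) for s, l in heap[0][1:]}

lengths = huffman_lengths(deltas)
bits = sum(lengths[d] for d in deltas)
print(f"deltas {deltas} -> about {bits} bits vs {len(signal) * 8} bits raw")
```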

Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), Lidar

Procedia PDF Downloads 145
28088 A Deep Learning Approach for Optimum Shape Design

Authors: Cahit Perkgöz

Abstract:

Artificial intelligence has brought new approaches to solving problems in almost every research field in recent years. One of these topics is shape design and optimization, which has possible applications in many fields, such as nanotechnology and electronics. A properly constructed cost function can eliminate the need for the labeled data required in deep learning and create the desired shapes. In this work, the network parameters are optimized differentially, which differs from traditional approaches. The methods are tested on physics-related structures, and successful results are obtained. This work is supported by the Eskişehir Technical University scientific research project (Project No: 20ADP090).
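
A minimal sketch of the label-free idea under stated assumptions: a tiny numpy network outputs a radial shape, and its weights are tuned only through a hand-built cost function (a target enclosed area plus a smoothness penalty) using crude finite-difference gradient descent. The objective and the network are illustrative, not the paper's physics-related structures or training scheme.

```python
# Optimize network weights against a cost function instead of labeled data.
import numpy as np

rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
params = [rng.normal(0, 0.1, (8, 2)), rng.normal(0, 0.1, (1, 8))]   # tiny MLP

def radii(params):
    W1, W2 = params
    h = np.tanh(W1 @ np.vstack([np.cos(theta), np.sin(theta)]))      # 8 x 64
    return 1.0 + 0.5 * np.tanh(W2 @ h).ravel()                       # positive radii

def cost(params, target_area=2.0, lam=0.05):
    r = radii(params)
    area = 0.5 * np.sum(r**2) * (2 * np.pi / len(theta))             # polar area
    roughness = np.sum(np.abs(np.diff(np.append(r, r[0]))))          # smoothness penalty
    return (area - target_area) ** 2 + lam * roughness

def step(params, lr=0.05, eps=1e-4):
    """One finite-difference gradient descent step on all weights."""
    new = []
    base = cost(params)
    for W in params:
        G = np.zeros_like(W)
        for idx in np.ndindex(W.shape):
            Wp = W.copy(); Wp[idx] += eps
            G[idx] = (cost([Wp if X is W else X for X in params]) - base) / eps
        new.append(W - lr * G)
    return new

for _ in range(200):
    params = step(params)
print("final cost:", round(cost(params), 4))
```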

Keywords: deep learning, shape design, optimization, artificial intelligence

Procedia PDF Downloads 133
28087 An Architecture Based on Capsule Networks for the Identification of Handwritten Signature Forgery

Authors: Luisa Mesquita Oliveira Ribeiro, Alexei Manso Correa Machado

Abstract:

The handwritten signature is a unique means of recognizing an individual, used to authenticate documents and to support investigations in the criminal, legal, and banking areas, among other applications. Signature verification relies on large amounts of biometric data, which are simple and easy to acquire, among other advantages. Given this scenario, signature forgery is a recurring worldwide problem, and fast and precise techniques are needed to prevent crimes of this nature from occurring. This article presents a study on the efficiency of the capsule network in analyzing and recognizing signatures. The chosen architecture achieved an accuracy of 98.11% and 80.15% for the CEDAR and GPDS databases, respectively.
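
For readers unfamiliar with capsule networks, the sketch below shows two of their characteristic building blocks, the squash non-linearity and the margin loss, applied to random placeholder vectors; it is not the authors' CEDAR/GPDS architecture.

```python
# Capsule-network building blocks: squash activation and margin loss.
import numpy as np

def squash(s, axis=-1, eps=1e-9):
    """Shrinks short vectors toward 0 and long ones toward unit length."""
    norm2 = np.sum(s * s, axis=axis, keepdims=True)
    return (norm2 / (1.0 + norm2)) * s / np.sqrt(norm2 + eps)

def margin_loss(lengths, target, m_pos=0.9, m_neg=0.1, lam=0.5):
    """lengths: capsule output norms per class, target: one-hot labels."""
    pos = target * np.maximum(0.0, m_pos - lengths) ** 2
    neg = lam * (1 - target) * np.maximum(0.0, lengths - m_neg) ** 2
    return np.sum(pos + neg)

caps_out = squash(np.random.default_rng(0).normal(size=(2, 8)))  # 2 class capsules
lengths = np.linalg.norm(caps_out, axis=-1)
print(margin_loss(lengths, target=np.array([1.0, 0.0])))
```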

Keywords: biometrics, deep learning, handwriting, signature forgery

Procedia PDF Downloads 62
28086 The Exploration of the Physical Properties of the Combinations of Selenium-Based Ternary Chalcogenides AScSe₂ (A=K, Cs) for Photovoltaic Applications

Authors: Ayesha Asma, Aqsa Arooj

Abstract:

In this era of science and technology, there is an essential need to investigate unique and appropriate materials for optoelectronic applications. Here, we investigate, for the first time, the structural, optoelectronic, mechanical, vibrational, and thermodynamic properties of the hexagonal-structure selenium-based ternary chalcogenides AScSe₂ (A = K, Cs) using the Perdew-Burke-Ernzerhof generalized gradient approximation (PBE-GGA). The lattice angles for these materials are found to be α = β = 90° and γ = 120°. KScSe₂ is optimized with lattice parameters a = b = 4.3 Å and c = 7.81 Å, whereas CsScSe₂ relaxes at a = b = 4.43 Å and c = 8.51 Å. The HSE06 functional, however, overestimates the lattice parameters, giving a = b = 4.92 Å, c = 7.10 Å for KScSe₂ and a = b = 5.15 Å, c = 7.09 Å for CsScSe₂. The energy band gaps of these materials, calculated via the PBE-GGA and HSE06 functionals, confirm their semiconducting nature. According to Born's criteria, these materials are mechanically stable. Moreover, the temperature dependence of the thermodynamic potentials and the specific heat at constant volume is also determined using the harmonic approximation. The negative values of the free energy ensure their thermodynamic stability. The vibrational modes are examined by plotting the phonon dispersion and the vibrational density of states (VDOS), and infrared (IR) and Raman spectroscopy are used to characterize the vibrational modes. The various optical parameters are examined at a smearing value of 0.5 eV. These parameters reveal that the materials are good absorbers of incident light in the ultraviolet (UV) region and may be utilized in photovoltaic applications.

Keywords: structural, optimized, vibrational, ultraviolet

Procedia PDF Downloads 17
28085 Performance Evaluation and Comparison between the Empirical Mode Decomposition, Wavelet Analysis, and Singular Spectrum Analysis Applied to the Time Series Analysis in Atmospheric Science

Authors: Olivier Delage, Hassan Bencherif, Alain Bourdier

Abstract:

Signal decomposition approaches represent an important step in time series analysis, providing useful knowledge and insight into the data and the characteristics of the underlying dynamics while also facilitating tasks such as noise removal and feature extraction. As most observational time series are nonlinear and nonstationary, resulting from the interaction of several physical processes at different time scales, experimental time series exhibit fluctuations at all time scales and require the development of specific signal decomposition techniques. The most commonly used techniques are data driven, making it possible to obtain well-behaved signal components without any prior assumptions about the input data. Among the most popular time series decomposition techniques, and the most cited in the literature, are the empirical mode decomposition and its variants, the empirical wavelet transform, and singular spectrum analysis. With the increasing popularity and utility of these methods in wide-ranging applications, it is imperative to gain a good understanding of and insight into how these algorithms operate. In this work, we describe all of the techniques mentioned above as well as their ability to denoise signals, capture trends, identify components corresponding to the physical processes involved in the evolution of the observed system, and deduce the dimensionality of the underlying dynamics. Results obtained with all of these methods on experimental total ozone column and rainfall time series will be discussed and compared.
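
As a small illustration of one of the three methods compared, the sketch below applies singular spectrum analysis to a synthetic trend-plus-oscillation series; the window length and component grouping are illustrative choices, not those used on the ozone or rainfall data.

```python
# Basic singular spectrum analysis: embed, decompose by SVD, diagonal-average.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(300)
x = 0.01 * t + np.sin(2 * np.pi * t / 25) + 0.3 * rng.normal(size=t.size)

L = 60                                                 # embedding window length
K = x.size - L + 1
X = np.column_stack([x[i:i + L] for i in range(K)])    # trajectory matrix (L x K)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

def reconstruct(components):
    """Diagonal-average the selected rank-1 components back into a series."""
    Xr = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in components)
    series, counts = np.zeros(x.size), np.zeros(x.size)
    for i in range(L):
        for j in range(K):
            series[i + j] += Xr[i, j]
            counts[i + j] += 1
    return series / counts

trend = reconstruct([0])                  # leading component ~ slow trend
oscillation = reconstruct([1, 2])         # paired components ~ periodic part
print(trend[:5], oscillation[:5])
```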

Keywords: denoising, empirical mode decomposition, singular spectrum analysis, time series, underlying dynamics, wavelet analysis

Procedia PDF Downloads 89
28084 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016

Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi

Abstract:

This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of the personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects’ rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, producing sometimes ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects’ rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is paid to the scope of ‘consent’ as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.

Keywords: big health data, data subject rights, GDPR, pandemic

Procedia PDF Downloads 113
28083 Exploring Unexplored Horizons: Innovative Applications of Applied Fluid Mechanics in Sustainable Energy

Authors: Elvira S. Castillo, Surupa Shaw

Abstract:

This paper delves into the uncharted territories of innovative applications of applied fluid mechanics in sustainable energy. By exploring the intersection of fluid mechanics principles with renewable energy technologies, the study uncovers untapped potential and novel solutions. Through theoretical analyses, the research investigates how fluid dynamics can be strategically leveraged to enhance the efficiency and sustainability of renewable energy systems. The findings contribute to expanding the discourse on sustainable energy by presenting innovative perspectives and practical insights. This paper serves as a guide for future research endeavors and offers valuable insights for implementing advanced methodologies and technologies to address global energy challenges.

Keywords: fluid mechanics, sustainable energy, sustainable practices, renewable energy

Procedia PDF Downloads 30
28082 Investigating the Viability of Ultra-Low Parameter Count Networks for Real-Time Football Detection

Authors: Tim Farrelly

Abstract:

In recent years, AI-powered object detection systems have opened the doors to innovative new applications and products, especially those operating in the real world or ‘on the edge’ – namely, in sport. This paper investigates the viability of an ultra-low parameter convolutional neural network specially designed for the detection of footballs on edge devices. The main contribution of this paper is the exploration of integrating new design features (depth-wise separable convolutional blocks and squeeze-and-excitation modules) into an ultra-low parameter network and demonstrating the subsequent improvements in performance. The results show that tracking the ball from Full HD images with negligibly high accuracy is possible in real time.
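
A minimal PyTorch sketch of the two design features named above, a depth-wise separable convolution followed by a squeeze-and-excitation (SE) module, with illustrative channel sizes; it is not the paper's actual detector.

```python
# Depth-wise separable convolution + squeeze-and-excitation block.
import torch
import torch.nn as nn

class DepthwiseSeparableSE(nn.Module):
    def __init__(self, in_ch, out_ch, reduction=4):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)
        self.se = nn.Sequential(                       # squeeze-and-excitation
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_ch, out_ch // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch // reduction, out_ch, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = self.act(self.bn(self.pointwise(self.depthwise(x))))
        return x * self.se(x)                          # channel-wise re-weighting

block = DepthwiseSeparableSE(16, 32)
print(sum(p.numel() for p in block.parameters()))      # parameter count stays small
print(block(torch.randn(1, 16, 64, 64)).shape)
```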

Keywords: deep learning, object detection, machine vision applications, sport, network design

Procedia PDF Downloads 126
28081 The Normal-Generalized Hyperbolic Secant Distribution: Properties and Applications

Authors: Hazem M. Al-Mofleh

Abstract:

In this paper, a new four-parameter univariate continuous distribution called the Normal-Generalized Hyperbolic Secant distribution (NGHS) is defined and studied. Some general and structural distributional properties are investigated and discussed, including central and non-central n-th moments and incomplete moments, quantile and generating functions, the hazard function, Rényi and Shannon entropies, shapes (skewed right, skewed left, and symmetric), modality regions (unimodal and bimodal), and maximum likelihood (MLE) estimators for the parameters. Finally, two real data sets are used to demonstrate empirically its flexibility and the strength of the new distribution.

Keywords: bimodality, estimation, hazard function, moments, Shannon’s entropy

Procedia PDF Downloads 323
28080 Concealed Objects Detection in Visible, Infrared and Terahertz Ranges

Authors: M. Kowalski, M. Kastek, M. Szustakowski

Abstract:

Multispectral screening systems are becoming more popular because of their very interesting properties and applications. One of the most significant applications of multispectral screening systems is the prevention of terrorist attacks. There are many kinds of threats and many methods of detection. Visual detection of objects hidden under a person's clothing is one of the most challenging problems in threat detection. There are various solutions to the problem; however, the most effective ones utilize multispectral surveillance imagers. The development of imaging devices and the exploration of new spectral bands are a chance to introduce new equipment for assuring public safety. We investigate the possibility of long-lasting detection of potentially dangerous objects covered with various types of clothing. In the article, we present the results of comparative studies of passive imaging in three spectral ranges – visible, infrared, and terahertz.

Keywords: terahertz, infrared, object detection, screening camera, image processing

Procedia PDF Downloads 339
28079 Challenges of Blockchain Applications in the Supply Chain Industry: A Regulatory Perspective

Authors: Pardis Moslemzadeh Tehrani

Abstract:

Due to the emergence of blockchain technology and the benefits of cryptocurrencies, intelligent or smart contracts are gaining traction. Artificial intelligence (AI) is transforming our lives, and it is being embraced by a wide range of sectors. Smart contracts, which are at the heart of blockchains, incorporate AI characteristics. Such contracts are referred to as "smart" contracts because of the underlying technology that allows contracting parties to agree on terms expressed in computer code that defines machine-readable instructions for computers to follow under specific situations. The transmission happens automatically if the conditions are met. Initially utilised for financial transactions, blockchain applications have since expanded to include the financial, insurance, and medical sectors, as well as supply networks. Raw material acquisition by suppliers, design and fabrication by manufacturers, delivery of final products to consumers, and even post-sales logistics assistance are all part of supply chains. Many issues are linked with managing supply chains from the planning and coordination stages onward, and, due to their complexity, these can be implemented in a smart contract on a blockchain. Manufacturing delays and limited third-party supplies of product components have raised concerns about the integrity and accountability of supply chains for food and pharmaceutical items. Other concerns include regulatory compliance in multiple jurisdictions and transportation circumstances (for instance, many products must be kept in temperature-controlled environments to ensure their effectiveness). Products are handled by several providers before reaching customers in modern economic systems. Information is sent between suppliers, shippers, distributors, and retailers at every stage of the production and distribution process, and information travels more effectively when individuals are eliminated from the equation. The usage of blockchain technology could be a viable solution to these coordination issues. In blockchains, smart contracts allow for the rapid transmission of production data, logistical data, inventory levels, and sales data. This research investigates the legal and technical advantages and disadvantages of AI-blockchain technology in the supply chain business. It aims to uncover the applicable legal problems and barriers to the application of AI-blockchain technology to supply chains, particularly in the food industry. It also discusses the essential legal and technological issues and impediments to supply chain implementation for stakeholders, as well as methods for overcoming them before releasing the technology to clients. Because there has been little research done on this topic, it is difficult for industrial stakeholders to grasp how blockchain technology could be used in their respective operations. As a result, the focus of this research will be on building advanced and complex contractual terms in supply chain smart contracts on blockchains to cover all unforeseen supply chain challenges.
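
As a loose illustration of "terms expressed in computer code", the sketch below releases payment only when hypothetical delivery conditions (a cold-chain temperature limit) are satisfied; it is plain Python for illustration, not an actual blockchain contract language or any system discussed in the paper.

```python
# Condition-triggered settlement logic in the spirit of a supply chain smart contract.
def settle_shipment(temperature_log_c, delivered, max_temp_c=8.0):
    """Release payment only if the goods arrived and the cold chain was respected."""
    if delivered and all(t <= max_temp_c for t in temperature_log_c):
        return "release payment to supplier"
    return "hold payment / raise dispute"

print(settle_shipment([4.2, 5.1, 6.0], delivered=True))   # -> release payment
print(settle_shipment([4.2, 9.7, 6.0], delivered=True))   # -> hold payment
```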

Keywords: blockchain, supply chain, IoT, smart contract

Procedia PDF Downloads 104
28078 Global Digital Peer-to-Peer (P2P) Lending Platform Empowering Rural India: Determinants of Funding

Authors: Ankur Mehra, M. V. Shivaani

Abstract:

With increasing digitization, the world is coming closer, not only in terms of information flows but also in terms of capital flows. Micro-finance institutions (MFIs) have leveraged this digital world through innovative digital social peer-to-peer (P2P) lending platforms, such as Kiva. These digital P2P platforms bring together micro-borrowers and lenders from across the world. The main objective of this study is to understand the funding preferences of social investors, primarily from developed countries (such as the US, UK, and Australia), lending money to borrowers from rural India at zero interest rates through Kiva. A further objective of this study is to increase awareness of such platforms among the various MFIs engaged in providing micro-loans to those in need. The sample comprises India-based micro-loan applications posted by various MFIs on the Kiva lending platform over the period September 2012 to March 2016. Out of 7,359 loans, 256 failed to get funded by social investors. On average, a micro-loan with 30 days to expiry gets fully funded in 7,593 minutes, or 5.27 days. 62% of the loans raised on Kiva are related to livelihood, 32.5% are for funding basic necessities, and the remaining 5.5% are for funding education. 47% of the loan applications have more than one borrower, while currency exchange risk falls on the social lenders for 45% of the loans. Controlling for the loan amount and loan tenure, the analyses suggest that loan applications with more than one borrower have a lower chance of getting funded than applications made by a sole borrower; such group applications also take more time to get funded. Further, a loan application by a solo woman not only has a higher chance of getting funded but also gets funded faster. The results also suggest that loan applications supported by an MFI with a religious affiliation not only have a lower chance of getting funded but also take longer to get funded than loan applications posted by secular MFIs. The results do not support cross-border currency risk as a factor explaining loan funding. Finally, the analyses suggest that loans raised for the purpose of earning a livelihood or for education have a higher chance of getting funded, and get funded faster, than loans applied for purposes related to basic necessities such as clothing, housing, food, health, and personal use. The results are robust to controls for an ‘MFI dummy’ and a ‘year dummy’. The key implication of this study is that global social investors tend to develop an emotional connection with single woman borrowers, who consequently get funded faster. Hence, MFIs should look for alternative ways of funding loans whose purpose is to meet basic needs, while more loans related to livelihood and education should be raised via digital platforms.

Keywords: P2P lending, social investing, fintech, financial inclusion

Procedia PDF Downloads 123
28077 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data

Authors: Sašo Pečnik, Borut Žalik

Abstract:

This paper presents a real-time visualization and filtering technique for classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR data sets. We explain the data structure and data management used, which enable real-time presentation of layered LiDAR data. Real-time visualization is achieved with LOD optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps, is executed entirely on the GPU, and is implemented using programmable shaders.
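
A CPU-side sketch of the two operations described, filtering by classification layer and distance-based level-of-detail thinning, on randomly generated placeholder points; in the paper these steps run on the GPU in programmable shaders.

```python
# Keep only visible classification layers, then thin distant points for LOD.
import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform(0, 100, size=(1000, 3))             # x, y, z
classes = rng.choice([2, 3, 4, 5, 6], size=1000)         # ASPRS-style class codes
visible_layers = {2, 6}                                   # e.g. ground + buildings
observer = np.array([0.0, 0.0, 10.0])

pts = points[np.isin(classes, list(visible_layers))]      # layer filtering
dist = np.linalg.norm(pts - observer, axis=1)

# simple distance-based LOD: keep every nearby point, every 4th point far away
keep = (dist < 50.0) | (np.arange(pts.shape[0]) % 4 == 0)
print(f"{int(keep.sum())} of {points.shape[0]} points sent to the renderer")
```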

Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization

Procedia PDF Downloads 285
28076 A Discovery on the Symmetrical Pattern of Mirror Primes in P²: Applications in the Formal Proof of the Goldbach Conjecture

Authors: Yingxu Wang

Abstract:

The base-6 structure and properties of mirror primes are discovered in this work towards a proof of the Goldbach conjecture. This paper reveals a fundamental pattern in pairs of mirror primes adjacent to any even number nₑ > 2, with symmetrical distances on both sides, determined by a methodology of Mirror Prime Decomposition (MPD). MPD leads to a formal proof of the Goldbach conjecture, which holds because any pivot even number nₑ > 2 is the sum of at least one adjacent pair of primes divided by 2. This work has not only revealed the analytic pattern of base-6 primes but also proven the infinite validation of the Goldbach conjecture.
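
The claimed pattern can at least be exercised numerically: the sketch below searches, for an even pivot nₑ, for a symmetric pair nₑ − d and nₑ + d that are both prime, so that nₑ = (p + q)/2. It only illustrates the statement for small cases and is not the paper's Mirror Prime Decomposition or a proof.

```python
# Find a symmetric ("mirror") prime pair around an even pivot n_e.
def is_prime(n):
    if n < 2: return False
    if n % 2 == 0: return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0: return False
        f += 2
    return True

def mirror_pair(n_e):
    """Return (p, q) = (n_e - d, n_e + d), both prime, for the smallest d."""
    for d in range(1, n_e - 1):
        if is_prime(n_e - d) and is_prime(n_e + d):
            return n_e - d, n_e + d
    return None

for n in (8, 30, 100):
    print(n, mirror_pair(n), "-> mean is", n)
```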

Keywords: number theory, primes, mirror primes, double recursive patterns, Goldbach conjecture, formal proof, mirror-prime decomposition, applications

Procedia PDF Downloads 36
28075 A Multi-Role Oriented Collaboration Platform for Distributed Disaster Reduction in China

Authors: Linyao Qiu, Zhiqiang Du

Abstract:

With the rapid development of urbanization, economic development, and steady population growth in China, the widespread devastation, economic damage, and loss of human lives caused by numerous forms of natural disasters are becoming increasingly serious every year. Disaster management requires the available and effective cooperation of different roles and organizations in the whole process, including mitigation, preparedness, response, and recovery. Due to the imbalance of regional development in China, the disaster management capabilities of national and provincial disaster reduction centers are uneven. When an undeveloped area suffers a disaster, the local reduction department can neither obtain first-hand information, such as high-resolution remote sensing images from satellites and aircraft, on its own, nor is a sharing mechanism provided for the department to access data resources deployed elsewhere. Most existing disaster management systems operate in a typical passive data-centric mode and serve a single department, so resources cannot be fully shared. This impediment blocks local departments and groups from quick emergency response and decision-making. In this paper, we introduce a collaborative platform for distributed disaster reduction. To address the imbalance in the sharing of data sources and technology in the process of disaster reduction, we propose a multi-role oriented collaboration business mechanism, capable of scheduling and allocating multiple resources for optimum utilization, to link various roles in different places for collaborative reduction business. The platform fully considers the differences in equipment conditions in different provinces and provides several service modes to satisfy the technology needs of disaster reduction. An integrated collaboration system based on a focusing-service mechanism is designed and implemented for resource scheduling, functional integration, data processing, task management, collaborative mapping, and visualization. Actual applications illustrate that the platform can well support data sharing and business collaboration between national and provincial departments. It can significantly improve the capability of disaster reduction in China.

Keywords: business collaboration, data sharing, distributed disaster reduction, focusing service

Procedia PDF Downloads 279
28074 Evaluation of Commercial Herbicides for Weed Control and Yield under Direct Dry Seeded Rice Cultivation System in Pakistan

Authors: Sanaullah Jalil, Abid Majeed, Syed Haider Abbas

Abstract:

The direct dry seeded rice cultivation system is an emerging production technology in Pakistan. Weeds are a major constraint to the success of direct dry seeded rice (DDSR). Studies were carried out over two years, in 2015 and 2016, to evaluate the performance of applications of pre-emergence herbicides (Top Max @ 2.25 lit/ha, Click @ 1.5 lit/ha, and Pendimethaline @ 1.25 lit/ha) and post-emergence herbicides (Clover @ 200 g/ha, Pyranex Gold @ 250 g/ha, Basagran @ 2.50 lit/ha, Sunstar Gold @ 50 g/ha, and Wardan @ 1.25 lit/ha) in the rice research field area of the National Agriculture Research Center (NARC), Islamabad. The experiments were laid out in a Randomized Complete Block Design (RCBD) with three replications. All evaluated herbicides reduced weed density and biomass by a significant amount. The net plot size was 2.5 x 5 m with 10 rows. Basmati-385 was used as the test rice variety. The data indicated that Top Max and Click provided the best weed control efficiency but suppressed the germination of the rice seed, which caused the lowest grain yields (680.6 kg/ha and 314.5 kg/ha, respectively). The weedy check plot contributed a paddy yield of 524.7 kg/ha with the highest weed density. Pyranex Gold provided better weed control efficiency and contributed a significantly higher paddy yield (5116.6 kg/ha) than all other herbicide applications, followed by Clover, which gave a paddy yield of 4241.7 kg/ha. The results of our study suggest that the pre-emergence herbicides provided the best weed control but are not fit for the direct dry seeded rice (DDSR) cultivation system; therefore, the post-emergence herbicides Pyranex Gold and Clover can be suggested for weed control and higher yield.

Keywords: pyranex gold, clover, direct dry seeded rice (DDSR), yield

Procedia PDF Downloads 241
28073 Fabrication and Characterization of PPy/rGO|PPy/ZnO Composite with Varying ZnO Concentration as Anode for Fuel Cell Applications

Authors: Bryan D. Llenarizas, Maria Carla F. Manzano

Abstract:

The rapid growth of electricity demand has led to the pursuit of alternative energy sources with high power output that are not harmful to the environment. The fuel cell is a device that generates electricity via chemical reactions between the fuel and an oxidant. Fuel cells have been known for decades, but achieving high power output and durability remains one of the challenges of this energy source. This study investigates the potential of a layer-by-layer composite for fuel cell applications. A two-electrode electrochemical cell was used for galvanostatic electrochemical deposition to fabricate a polypyrrole/rGO|polypyrrole/ZnO layer-by-layer composite material for fuel cell applications. In the synthesis, the first layer comprised 0.1 M pyrrole monomer and 1 mg of rGO, while the second layer contained 0.1 M pyrrole monomer and ZnO concentrations ranging from 0.08 M up to 0.12 M. A constant current density of 8 mA/cm² was applied for 1 hour in fabricating each layer. Scanning electron microscopy (SEM) of the fabricated layer-by-layer material shows a globular surface with white spots. These white spots are the ZnO particles, as confirmed by energy-dispersive X-ray spectroscopy, indicating successful deposition of the second layer onto the first layer. The observed surface morphology was consistent for each ZnO concentration variation. AC measurements were conducted to obtain the AC resistance of the fabricated film. The results show a decrease in AC resistance as the concentration of ZnO increases.

Keywords: anode, composite material, electropolymerization, fuel cell, galvanostatic, polypyrrole

Procedia PDF Downloads 54
28072 Estimating Destinations of Bus Passengers Using Smart Card Data

Authors: Hasik Lee, Seung-Young Kho

Abstract:

Nowadays, automatic fare collection (AFC) systems are widely used in many countries. However, the smart card data of many cities do not contain the alighting information necessary to build origin-destination (OD) matrices. Therefore, in order to utilize such smart card data, the destinations of passengers must be estimated. In this paper, kernel density estimation was used to forecast the probabilities of the alighting stations of bus passengers and was applied to smart card data from Seoul, Korea, which contain both boarding and alighting information. The method was also validated with actual data; in some cases, the stochastic method was more accurate than the deterministic method. Therefore, it is sufficiently accurate to be used to build OD matrices.
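
A minimal sketch of the estimation idea, assuming a hypothetical one-dimensional history of a passenger's past alighting positions along a route: fit a kernel density over the history and score the candidate stops. This is not the Seoul data set or the paper's validated model.

```python
# Kernel density estimate over past alighting positions to rank candidate stops.
import numpy as np
from scipy.stats import gaussian_kde

history_km = np.array([3.2, 3.4, 3.3, 7.8, 3.1, 3.5, 8.0, 3.3])  # past alighting points
stops_km = np.array([0.5, 1.6, 3.3, 5.0, 7.9, 9.4])              # candidate stops

kde = gaussian_kde(history_km)
probs = kde(stops_km)
probs /= probs.sum()                                             # normalise over stops
print("estimated alighting stop:", stops_km[np.argmax(probs)], "km")
```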

Keywords: destination estimation, Kernel density estimation, smart card data, validation

Procedia PDF Downloads 334