Search results for: Cloud computing
571 Simulating the Hot Hand Phenomenon in Basketball with Bayesian Hidden Markov Models
Authors: Gabriel Calvo, Carmen Armero, Luigi Spezia
Abstract:
A basketball player is said to have a hot hand if his/her performance is better than expected during different periods of time. One way to deal with this phenomenon is to make use of latent variables, which can indicate whether the player is ‘on fire’ or not. This work aims to model the hot hand phenomenon through a Bayesian hidden Markov model (HMM) with two states (cold and hot) and two different probabilities of success depending on the corresponding hidden state. This task is illustrated through a comprehensive simulation study. The simulated data sets emulate the field goal attempts in an NBA season from players with different profiles. This model can be a powerful tool to assess the ‘streakiness’ of each player, and it provides information about the general performance of the players during the match. Finally, the Bayesian HMM allows computing the posterior probability of any type of streak.
Keywords: Bernoulli trials, field goals, latent variables, posterior distribution
Procedia PDF Downloads 190
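A minimal Python sketch of the simulation design described above: Bernoulli field-goal outcomes are drawn from a two-state (cold/hot) hidden Markov chain. The transition matrix and state-dependent success probabilities are assumed values for illustration, not estimates from the study.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Assumed (illustrative) parameters: transition matrix between the
# cold (0) and hot (1) states, and state-dependent success probabilities.
P = np.array([[0.95, 0.05],   # P(cold -> cold), P(cold -> hot)
              [0.20, 0.80]])  # P(hot -> cold),  P(hot -> hot)
p_success = np.array([0.40, 0.55])  # field-goal success prob. per state

def simulate_season(n_attempts=1200):
    """Simulate one season of field-goal attempts from the two-state HMM."""
    states = np.empty(n_attempts, dtype=int)
    states[0] = 0
    for t in range(1, n_attempts):
        states[t] = rng.choice(2, p=P[states[t - 1]])
    outcomes = rng.random(n_attempts) < p_success[states]
    return states, outcomes.astype(int)

states, makes = simulate_season()
print("overall FG%:", makes.mean())
print("FG% while hot:", makes[states == 1].mean())
```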
570 Computing Maximum Uniquely Restricted Matchings in Restricted Interval Graphs
Authors: Swapnil Gupta, C. Pandu Rangan
Abstract:
A uniquely restricted matching is defined to be a matching M whose matched vertices induce a subgraph that has only one perfect matching. In this paper, we make progress on the open question of the status of this problem on interval graphs (graphs obtained as the intersection graph of intervals on a line). We give an algorithm to compute maximum cardinality uniquely restricted matchings on certain sub-classes of interval graphs. We consider two sub-classes of interval graphs, the former contained in the latter, and give O(|E|^2) time algorithms for both of them. Note that both sub-classes are incomparable to proper interval graphs (graphs obtained as the intersection graph of intervals in which no interval completely contains another interval), on which the problem can be solved in polynomial time.
Keywords: uniquely restricted matching, interval graph, matching, induced matching, witness counting
Procedia PDF Downloads 389
569 Solar and Wind Energy Potential Study of Sindh Province, Pakistan for Power Generation
Authors: M. Akhlaque Ahmed, Sidra A. Shaikh, Maliha A. Siddiqui, Adeel Tahir
Abstract:
Global and diffuse solar radiation on a horizontal surface over southern Sindh, namely Karachi, Hyderabad, and Nawabshah, were estimated using sunshine hour data of the area to assess the feasibility of solar energy utilization in Sindh province for power generation. The results show a drastic variation in the diffuse and direct components of solar radiation between summer and winter for southern Sindh, with each contributing about 50% for Karachi and Hyderabad. In the Nawabshah area, the contribution of diffuse solar radiation is low in the monsoon months, July and August. The Kt value of Nawabshah indicates a clear sky almost throughout the year; the percentage of diffuse radiation does not exceed 20%, and the appearance of cloud is rare even in the monsoon months. The estimated values indicate that Nawabshah has high solar potential, whereas Karachi and Hyderabad have low solar potential. During the monsoon months, the southern part of Sindh can utilize a hybrid system with wind power. Near Karachi and Hyderabad, the wind speed ranges between 6.2 and 6.9 m/sec, and a wind corridor exists near Karachi, Hyderabad, Gharo, Keti Bander and Shah Bander. The shortfall of solar energy can thus be compensated by wind, because in the monsoon months of July and August the wind speeds are higher in the southern region of Sindh.
Keywords: hybrid power system, power generation, solar and wind energy potential, southern Sindh
Procedia PDF Downloads 235
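The clearness index Kt referred to above is commonly estimated from sunshine-hour data with an Angström-Prescott regression; the sketch below is a minimal Python illustration assuming typical textbook coefficients (a = 0.25, b = 0.50) and hypothetical monthly values, not the study's own data.

```python
import numpy as np

def clearness_index(n_sunshine, N_daylength, a=0.25, b=0.50):
    """Angstrom-Prescott estimate of Kt = H/H0 from sunshine-hour data.
    a and b are typical regression coefficients, not values from the study."""
    return a + b * (np.asarray(n_sunshine) / np.asarray(N_daylength))

# Hypothetical monthly means: measured sunshine hours and day length (hours).
n = [9.1, 8.4, 6.0, 5.2]      # e.g., May-Aug (assumed values)
N = [13.0, 13.4, 13.3, 12.9]
Kt = clearness_index(n, N)
print("estimated Kt:", Kt.round(2))  # Kt near/above 0.5 suggests clear skies
```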
568 Analyzing the Practicality of Drawing Inferences in Automation of Commonsense Reasoning
Authors: Chandan Hegde, K. Ashwini
Abstract:
Commonsense reasoning is the simulation of the human ability to make decisions in the situations we encounter every day. Several decades have passed since the introduction of this subfield of artificial intelligence, yet it has barely made significant progress. Modern computing aids have also remained impotent in this regard due to the absence of a strong methodology for the development of commonsense reasoning. Among the several reasons accountable for the lack of progress, drawing inferences out of a commonsense knowledge base stands out. This review paper emphasizes a detailed analysis of the representation of reasoning uncertainties and the feasible prospects of programming aids for drawing inferences. The difficulties in deducing and systematizing commonsense reasoning, and the substantial progress made in reasoning that influences the study, are also discussed. Additionally, the paper discusses the possible impacts of an effective inference technique in commonsense reasoning.
Keywords: artificial intelligence, commonsense reasoning, knowledge base, uncertainty in reasoning
Procedia PDF Downloads 187
567 Modeling and Simulation of the Tripod Gait of a Hexapod Robot
Authors: El Hansali Hasnaa, Bennani Mohammed
Abstract:
Hexapod legged robots’ missions, particularly in irregular and dangerous areas, require high stability and high precision. In this paper, we consider a rectangular-body legged robot with six legs distributed symmetrically along two sides, each leg having three degrees of freedom for greater mobility. The aim of this work is to plan the tripod gait trajectory, based on computing the kinematic model to determine the joint variables in the lifting and propelling phases. For this, appropriate coordinate frames are attached to the body and legs in order to obtain a clear representation and efficient generation of the system equations. A simulation on the MATLAB software platform is developed to confirm the kinematic model and the various trajectories of the tripod gait adopted by the hexapod robot in its locomotion.
Keywords: hexapod legged robot, inverse kinematic model, simulation in MATLAB, tripod gait
Procedia PDF Downloads 277
566 Monte Carlo Simulation of Magnetic Properties in Bit Patterned Media
Authors: O. D. Arbeláez-Echeverri, E. Restrepo-Parra, J. C. Riano-Rojas
Abstract:
A two-dimensional geometric model of bit patterned media is proposed. The model is based on the crystal structure of the materials commonly used to produce the nano islands in bit patterned materials and the possible defects that may arise from the interaction between the nano islands and the matrix material. The dynamic magnetic properties of the material are then computed using time-aware integration methods for the multi-spin Hamiltonian. The Hamiltonian takes into account both the spatial and topological disorder of the sample as well as the high perpendicular anisotropy that is pursued when building bit patterned media. The main findings of the research were the possibility of replicating the results of previous experiments on similar materials and the ability to compute the switching field distribution given the geometry of the material and the parameters required by the model.
Keywords: nanostructures, Monte Carlo, pattern media, magnetic properties
Procedia PDF Downloads 503
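A simplified stand-in for such spin-Hamiltonian sampling: plain Metropolis Monte Carlo on classical Heisenberg spins with nearest-neighbour exchange and a uniaxial (perpendicular) anisotropy term. The lattice, parameter values, and update scheme are assumptions; the paper's time-aware integrators are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

J, K, kB_T = 1.0, 0.5, 0.8   # exchange, perpendicular anisotropy, temperature (assumed units)
L = 16                       # island lattice size (illustrative)

# Classical Heisenberg spins on a square lattice: shape (L, L, 3), unit norm.
spins = rng.normal(size=(L, L, 3))
spins /= np.linalg.norm(spins, axis=-1, keepdims=True)

def local_energy(s, i, j):
    """Exchange with the 4 neighbours plus uniaxial anisotropy -K*sz^2."""
    nb = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
    return -J * np.dot(s[i, j], nb) - K * s[i, j, 2] ** 2

def metropolis_sweep(s):
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        old = s[i, j].copy()
        e_old = local_energy(s, i, j)
        trial = old + 0.3 * rng.normal(size=3)   # small random rotation
        s[i, j] = trial / np.linalg.norm(trial)
        dE = local_energy(s, i, j) - e_old
        if dE > 0 and rng.random() >= np.exp(-dE / kB_T):
            s[i, j] = old                        # reject the move

for sweep in range(200):
    metropolis_sweep(spins)
print("mean perpendicular magnetization:", spins[..., 2].mean())
```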
565 Computing Some Topological Descriptors of Single-Walled Carbon Nanotubes
Authors: Amir Bahrami
Abstract:
In the fields of chemical graph theory, molecular topology, and mathematical chemistry, a topological index or descriptor index, also known as a connectivity index, is a type of molecular descriptor that is calculated based on the molecular graph of a chemical compound. Topological indices are numerical parameters of a graph which characterize its topology and are usually graph invariants. Topological indices are used, for example, in the development of quantitative structure-activity relationships (QSARs), in which the biological activity or other properties of molecules are correlated with their chemical structure. In this paper, some descriptor indices of single-walled carbon nanotubes are determined.
Keywords: chemical graph theory, molecular topology, molecular descriptor, single-walled carbon nanotubes
Procedia PDF Downloads 338
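As a concrete example of a connectivity-type index, the sketch below computes the classical Randić index of a honeycomb patch standing in for a nanotube fragment; the graph choice is illustrative only, not a structure from the paper.

```python
import math
import networkx as nx

def randic_index(G):
    """Randic connectivity index: sum over edges of 1/sqrt(deg(u)*deg(v))."""
    return sum(1.0 / math.sqrt(G.degree(u) * G.degree(v)) for u, v in G.edges())

# Toy molecular graph standing in for a nanotube fragment (illustrative only):
# a honeycomb patch, the same local structure as a graphene/SWCNT wall.
G = nx.hexagonal_lattice_graph(2, 3)
print("Randic index:", round(randic_index(G), 3))
```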
564 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm
Authors: Ping Bo, Meng Yunshan
Abstract:
Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, a disadvantage of SST data is the high percentage of missing data, mainly caused by cloud coverage. The Data Interpolating Empirical Orthogonal Function (DINEOF) algorithm is an EOF-based technique for reconstructing missing data that has been widely used in the oceanographic field. Reconstructing SST images within a long time series using DINEOF can cause large discontinuities, and one solution to this problem is to filter the temporal covariance matrix to reduce the spurious variability. Building on previous research, an algorithm is presented in this paper to improve the temporal correlations in the EOF expansion. As in previous work, a filter, such as a Laplacian filter, is applied to the temporal covariance matrix, but the presented algorithm additionally accounts for the temporal relationship between the two consecutive images used in the filter: for example, two images in the same season are more likely to be correlated than two images in different seasons, so the latter pair is weighted less in the filter. The presented approach is tested on the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning from 1989 to 2006. The results obtained from the presented algorithm are compared to those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking the temporal relationship into account.
Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter
Procedia PDF Downloads 324
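A minimal sketch of the weighting idea, assuming a Gaussian seasonal-proximity weight and a single Laplacian-like smoothing pass over the temporal covariance matrix; the weight form, parameters, and stand-in data are assumptions, not the paper's implementation.

```python
import numpy as np

def seasonal_weight(month_i, month_j, sigma=1.5):
    """Weight neighbouring time steps more strongly when they fall in the
    same season: a Gaussian of the cyclic month distance (assumed form)."""
    d = abs(month_i - month_j)
    d = min(d, 12 - d)
    return np.exp(-(d ** 2) / (2 * sigma ** 2))

def filter_temporal_covariance(C, months, alpha=0.1):
    """One pass of a Laplacian-like temporal filter on covariance C,
    down-weighting neighbours that belong to different seasons."""
    n = C.shape[0]
    Cf = C.copy()
    for t in range(1, n - 1):
        w_prev = seasonal_weight(months[t], months[t - 1])
        w_next = seasonal_weight(months[t], months[t + 1])
        Cf[t] = C[t] + alpha * (w_prev * (C[t - 1] - C[t]) +
                                w_next * (C[t + 1] - C[t]))
    return 0.5 * (Cf + Cf.T)   # keep the matrix symmetric

# Hypothetical stand-in: 216 monthly SST images (1989-2006).
months = np.arange(216) % 12 + 1
C = np.cov(np.random.default_rng(0).normal(size=(216, 50)))
C_filtered = filter_temporal_covariance(C, months)
```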
563 3D Modelling and Numerical Analysis of Human Inner Ear by Means of Finite Elements Method
Authors: C. Castro-Egler, A. Durán-Escalante, A. García-González
Abstract:
This paper presents a method to generate a finite element model of the human auditory inner ear system. The geometric model has been realized using 2D images from a virtual model of temporal bones. A point cloud was extracted manually from those images to construct a whole mesh with hexahedral elements. The main difference from predecessor models is the spiral shape of the cochlea with its three scalae completely defined: scala tympani, scala media and scala vestibuli, which are separated by the basilar membrane and Reissner's membrane. To validate this model, numerical simulations have been carried out with two models: an isolated inner ear and a whole model of the human auditory system. Ideal displacement conditions are applied over the oval window in the isolated inner ear model. The whole model is made up of the outer auditory channel, the tympanum, the ossicular chain, and the inner ear. The boundary condition for the whole model is 1 Pa over the auditory channel entrance. The numerical simulations by FEM have been done using a harmonic analysis over a frequency range of 100-10,000 Hz with an interval of 100 Hz. The following results have been obtained: the basilar membrane displacement; the scala media pressure along the cochlea length; and the transfer function of the middle ear normalized by the pressure at the tympanic membrane. The basilar membrane displacements and the pressure in the scala media make it possible to validate the frequency response of the basilar membrane.
Keywords: finite elements method, human auditory system model, numerical analysis, 3D modelling cochlea
Procedia PDF Downloads 362
562 Effect of Ionized Plasma Medium on the Radiation of a Rectangular Microstrip Antenna on Ferrite Substrate
Authors: Ayman Al Sawalha
Abstract:
This paper presents theoretical investigations on the radiation of a rectangular microstrip antenna printed on a magnetized ferrite substrate, Ni0.62Co0.02Fe1.948O4, in the presence of an ionized plasma medium. The theoretical study of the rectangular microstrip antenna in free space is carried out by applying the transmission line model combined with potential function techniques, while hydrodynamic theory is used for its analysis in the plasma medium. By taking the biased and unbiased ferrite cases, far-field radiation patterns in free space and in the plasma medium are obtained, which in turn are used to compute the radiated power, directivity, quality factor and bandwidth of the antenna. It is found that the presence of the plasma medium affects the performance of the rectangular microstrip antenna structure significantly.
Keywords: ferrite, microstrip antenna, plasma, radiation
Procedia PDF Downloads 323
561 Global and Diffuse Solar Radiation Studies over Seven Cities of Sindh, Pakistan for Power Generation
Authors: M. A. Ahmed, Sidra A. Shaik
Abstract:
Global and diffuse solar radiation on a horizontal surface over seven cities of Sindh, namely Karachi, Hyderabad, Chore, Padidan, Nawabshah, Rohri and Jacobabad, were estimated using sunshine hour data of the area to assess the feasibility of solar energy utilization in Sindh province. The results show a variation of the direct and diffuse components of solar radiation between the summer and winter months in southern Sindh (50% direct and 50% diffuse for Karachi and Hyderabad), and a large variation between the direct and diffuse components in the northern region (80% direct and 20% diffuse for Rohri and Jacobabad). In southern Sindh, the contribution of diffuse solar radiation is higher during the monsoon months (July and August); the sky remains clear from September to June. In northern Sindh (Rohri and Jacobabad), the contribution of diffuse solar radiation is low even in the monsoon months, i.e., July and August. The Kt value for northern Sindh indicates a clear sky: in the northern part of Sindh, the percentage of diffuse radiation does not exceed 20%, and the appearance of cloud is rare. From the point of view of power generation, the estimated values indicate that the northern part of Sindh has high solar potential, while the southern part has low solar potential.
Keywords: global and diffuse solar radiation, solar potential, Province of Sindh, solar radiation studies for power generation
Procedia PDF Downloads 317
560 Fast and Non-Invasive Patient-Specific Optimization of Left Ventricle Assist Device Implantation
Authors: Huidan Yu, Anurag Deb, Rou Chen, I-Wen Wang
Abstract:
The use of left ventricle assist devices (LVADs) has been a proven and effective therapy for patients with severe end-stage heart failure. Due to the limited availability of suitable donor hearts, LVADs will probably become the alternative solution for patients with heart failure in the near future. While the LVAD is being continuously improved toward enhanced performance, increased device durability, and reduced size, a better understanding of implantation management becomes critical in order to achieve a better long-term blood supply and fewer post-surgical complications such as thrombus generation. Important issues related to LVAD implantation include the location of the outflow grafting (OG), the angle of the OG, the combination of LVAD and native heart pumping, and uniform or pulsatile flow at the OG. We have hypothesized that the optimal implantation of an LVAD is patient-specific. To test this hypothesis, we employ a novel in-house computational modeling technique, named InVascular, to conduct a systematic evaluation of cardiac output at the aortic arch together with other pertinent hemodynamic quantities for each patient under various implantation scenarios, aiming at an optimal implantation strategy. InVascular is a powerful computational modeling technique that integrates unified mesoscale modeling for both image segmentation and fluid dynamics with cutting-edge GPU parallel computing. It first segments the aortic artery from the patient's CT image, then seamlessly feeds the extracted morphology, together with the velocity wave from an echo ultrasound image of the same patient, into the computational model to quantify 4-D (time + space) velocity and pressure fields. Using one NVIDIA Tesla K40 GPU card, InVascular completes a computation from CT image to 4-D hemodynamics within 30 minutes; thus, it has great potential for massive numerical simulation and analysis. The systematic evaluation for one patient includes three OG anastomoses (ascending aorta, descending thoracic aorta, and subclavian artery), three combinations of LVAD and native heart pumping (1:1, 1:2, and 1:3), three angles of OG anastomosis (inclined upward, perpendicular, and inclined downward), and two LVAD inflow conditions (uniform and pulsatile). The optimal LVAD implantation is suggested through a comprehensive analysis of the cardiac output and related hemodynamics from the simulations over the fifty-four scenarios. To confirm the hypothesis, 5 random patient cases will be evaluated.
Keywords: graphic processing unit (GPU) parallel computing, left ventricle assist device (LVAD), lumped-parameter model, patient-specific computational hemodynamics
Procedia PDF Downloads 133
559 Diagnosis of Diabetes Using Computer Methods: Soft Computing Methods for Diabetes Detection Using Iris
Authors: Piyush Samant, Ravinder Agarwal
Abstract:
Complementary and Alternative Medicine (CAM) techniques are quite popular and effective for chronic diseases. Iridology is a more than 150-year-old CAM technique which analyzes the patterns, tissue weakness, color, shape, structure, etc. of the iris for disease diagnosis. The objective of this paper is to validate the use of iridology for the diagnosis of diabetes. The suggested model was applied to a systemic disease with ocular effects. Data from 200 subjects, 100 diabetic and 100 non-diabetic, were evaluated. The complete procedure was kept very simple and free from the involvement of any iridologist. From the normalized iris, the region of interest was cropped. All 63 features were extracted using statistical measures, texture analysis, and the two-dimensional discrete wavelet transformation. A comparison of the accuracies of six different classifiers is presented. The results show 89.66% accuracy with the random forest classifier.
Keywords: complementary and alternative medicine, classification, iridology, iris, feature extraction, disease prediction
Procedia PDF Downloads 407
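A minimal sketch of this classification pipeline in Python (assuming the PyWavelets and scikit-learn packages), with hypothetical cropped iris regions and a reduced feature set — statistics of the region and of its 2-D Haar wavelet sub-bands — standing in for the paper's 63 features.

```python
import numpy as np
import pywt
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def iris_roi_features(roi):
    """Reduced stand-in for the paper's 63 features: statistics of the
    cropped iris region and of its 2-D Haar wavelet sub-bands."""
    cA, (cH, cV, cD) = pywt.dwt2(roi, 'haar')
    feats = []
    for band in (roi, cA, cH, cV, cD):
        feats += [band.mean(), band.std(), np.abs(band).max()]
    return np.array(feats)

# Hypothetical data: 200 cropped ROIs (100 diabetic, 100 non-diabetic).
rng = np.random.default_rng(0)
rois = rng.normal(size=(200, 64, 64))
X = np.stack([iris_roi_features(r) for r in rois])
y = np.array([1] * 100 + [0] * 100)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```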
558 Fat-Tail Test of Regulatory DNA Sequences
Authors: Jian-Jun Shu
Abstract:
The statistical properties of cis-regulatory modules (CRMs) are explored by estimating the similar-word set occurrence distribution. It is observed that CRMs tend to have a fat-tail distribution of similar-word set occurrence. Thus, a fat-tail test with two fatness coefficients is proposed to distinguish CRMs from non-CRMs, especially from exons. For the first fatness coefficient, the separation accuracy between CRMs and exons is increased as compared with the existing content-based CRM prediction method, the fluffy-tail test. For the second fatness coefficient, the computing time is reduced as compared with the fluffy-tail test, making it very suitable for long sequences and large-database analysis in the post-genome era. Moreover, these indexes may be used to predict CRMs which have not yet been observed experimentally, which can serve as a valuable filtering step for experiments.
Keywords: statistical approach, transcription factor binding sites, cis-regulatory modules, DNA sequences
Procedia PDF Downloads 290
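To make the occurrence-distribution idea concrete, the sketch below counts k-word occurrences in a sequence and reports a crude tail-fatness proxy; the threshold-based measure and the toy sequences are illustrative assumptions, not the paper's fatness coefficients.

```python
import random
from collections import Counter

def kmer_occurrence_counts(seq, k=5):
    """Count occurrences of every k-length word in a DNA sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return sorted(counts.values(), reverse=True)

def tail_fraction(counts, threshold=3):
    """Crude tail-fatness proxy (an assumed measure, not the paper's
    coefficients): fraction of distinct words occurring > threshold times."""
    return sum(c > threshold for c in counts) / len(counts)

random.seed(0)
crm_like = "ACGTAACGT" * 50 + "TTTTT" * 20                       # toy repetitive sequence
exon_like = ''.join(random.choice("ACGT") for _ in range(500))   # toy random sequence
print("CRM-like tail fraction:", round(tail_fraction(kmer_occurrence_counts(crm_like)), 3))
print("exon-like tail fraction:", round(tail_fraction(kmer_occurrence_counts(exon_like)), 3))
```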
557 Variability of Climatic Elements in Nigeria Over Recent 100 Years
Authors: T. Salami, O. S. Idowu, N. J. Bello
Abstract:
Climatic variability is an essential issue when dealing with climate change. The variability of climate parameters helps to determine how variable the climatic condition of a region will be. The most important of these climatic variables, which help to determine the climatic condition in an area, are temperature and precipitation. This research deals with long-term climatic variability in Nigeria. Variables examined in this analysis include near-surface temperature, near-surface minimum temperature, maximum temperature, relative humidity, vapour pressure, precipitation, wet-day frequency and cloud cover, using data ranging between 1901 and 2010. Analyses were carried out using regression and EOF analysis. Results show that the annual average, minimum and maximum near-surface temperatures all gradually increased from 1901 to 2010, both in the wet season and the dry season. The linear trends of the minimum near-surface temperature are significant for the annual, wet-season and dry-season means. However, the diurnal temperature range decreased over the recent 100 years, implying that the minimum near-surface temperature has increased more than the maximum. Both precipitation and wet-day frequency declined over the period of analysis, demonstrating that Nigeria has become drier in terms of rainfall. Temperature and precipitation variability became very high during these periods, especially in the northern areas. Areas which had excessive rainfall were confronted with flooding and other related issues, while areas that had less precipitation were confronted with drought. More practical issues are also presented.
Keywords: climate, variability, flooding, excessive rainfall
Procedia PDF Downloads 384
556 Exploration of Various Metrics for Partitioning of Cellular Automata Units for Efficient Reconfiguration of Field Programmable Gate Arrays (FPGAs)
Authors: Peter Tabatt, Christian Siemers
Abstract:
Using FPGA devices to improve the behavior of time-critical parts of embedded systems has been a proven concept for years. With reconfigurable FPGA devices, the logical blocks can be partitioned and grouped into static and dynamic parts, and the dynamic parts can be reloaded 'on demand' at runtime. This work uses cellular automata, which are constructed through compilation from (partially restricted) ANSI-C sources, to determine the suitability of various metrics for optimal partitioning. Significant metrics in this case are, for example, the area on the FPGA device for the partition, the pass count for loop constructs, and the communication characteristics to other partitions. With successful partitioning, it is possible to use smaller FPGA devices for the same requirements as with non-reconfigurable FPGA devices or, vice versa, to use the same FPGAs for larger programs.
Keywords: reconfigurable FPGA, cellular automata, partitioning, metrics, parallel computing
Procedia PDF Downloads 271
555 Investigating the Form of the Generalised Equations of Motion of the N-Bob Pendulum and Computing Their Solution Using MATLAB
Authors: Divij Gupta
Abstract:
Pendular systems have a range of both mathematical and engineering applications, from modelling the behaviour of a continuous mass-density rope to use as tuned mass dampers (TMD). Thus, it is of interest to study the differential equations governing the motion of such systems. Here we attempt to generalise these equations of motion for the plane compound pendulum with a finite number N of point masses. A Lagrangian approach is taken, and we attempt to find the generalised form of the Euler-Lagrange equations of motion for the i-th bob of the N-bob pendulum. The coordinates are parameterized as angular quantities to reduce the number of degrees of freedom from 2N to N and simplify the form of the equations. We analyse the form of these equations up to N = 4 to determine their general form. We also develop a MATLAB program to compute a solution to the system for a given input value of N and a given set of initial conditions.
Keywords: classical mechanics, differential equation, lagrangian analysis, pendulum
Procedia PDF Downloads 208
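The derivation can be reproduced symbolically; the sketch below (in Python/SymPy rather than the paper's MATLAB) builds the Lagrangian of an N-bob pendulum in the N angular coordinates and forms the Euler-Lagrange equations for a chosen N.

```python
import sympy as sp
from sympy.physics.mechanics import dynamicsymbols, LagrangesMethod

N = 3                                   # number of bobs (try 2..4)
t, g = sp.symbols('t g')
m = sp.symbols(f'm1:{N + 1}')           # point masses m1..mN
l = sp.symbols(f'l1:{N + 1}')           # rod lengths l1..lN
th = dynamicsymbols(f'theta1:{N + 1}')  # the N angular coordinates

# Cartesian position of bob i, accumulated along the chain.
x = [sum(l[j] * sp.sin(th[j]) for j in range(i + 1)) for i in range(N)]
y = [-sum(l[j] * sp.cos(th[j]) for j in range(i + 1)) for i in range(N)]

# Lagrangian = kinetic minus potential energy.
T = sum(sp.Rational(1, 2) * m[i] * (sp.diff(x[i], t) ** 2 +
                                    sp.diff(y[i], t) ** 2) for i in range(N))
V = sum(m[i] * g * y[i] for i in range(N))
LM = LagrangesMethod(sp.simplify(T - V), th)
eoms = LM.form_lagranges_equations()    # the N Euler-Lagrange equations
print(sp.simplify(eoms[0]))             # equation of motion for the first bob
```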
554 Low-Cost Fog Edge Computing for Smart Power Management and Home Automation
Authors: Belkacem Benadda, Adil Benabdellah, Boutheyna Souna
Abstract:
The Internet of Things (IoT) is an unprecedented creation. Electronic objects are now able to interact, share, respond and adapt to their environment on a much larger basis. The actual spread of these modern means of connectivity, and of solutions with high data-volume exchange, is affecting our ways of life. Accommodation is becoming an intelligent living space, suited not only to the occupants' circumstances and desires but also to system constraints, making daily life simpler and cheaper, increasing possibilities, and achieving a higher level of services and luxury (Internet access, teleworking, consumption monitoring, information search, etc.). This paper addresses the design and integration of a smart home and proposes an IoT solution that allows smart power consumption based on measurements from the power grid and deep learning analysis.
Keywords: array sensors, IoT, power grid, FPGA, embedded
Procedia PDF Downloads 116
553 Drivers on Climate in a Neotropical City: Urbanizations and Natural Variability
Authors: Nuria Vargas, Frances Rodriguez
Abstract:
Neotropical medium-sized cities have opportunities to develop well. Xalapa City (the capital of Veracruz, Mexico) and its metropolitan region, near the Gulf of Mexico, already has close to 1 million inhabitants, a medium city size, but it is growing rapidly, as are several cities in Latin America. The city emerges, with an irregular topography, within a landscape that was once cloud forest and coffee land. The rapid growth of urbanization and the loss of vegetation have resulted in a change in the climate parameters. Frequent warm spells, floods and landslides have caused impacts over the last two decades, and a higher incidence of dengue and diarrhea is also reported in the region. Therefore, the analysis of hydrometeorological events is crucial to understand the role they play in this problem. Urbanization and other radiative forcings have created a modulation that can explain the decadal climate changes in the Xalapa region. The Atlantic Multidecadal Oscillation directly influences the temperature and precipitation of the region, even more than climate change does. The total effect of these drivers can create a significant context of increased risk. However, most policies consider only climate change as the principal factor, yet other drivers are important to consider and evaluate when implementing actions that improve our environment and cities in a context of climate change. Medium-sized cities could create better conditions for future citizens through urban planning that considers possible risks associated with weather and climate.
Keywords: natural variability, urbanization, atlantic multidecadal oscillation, land use changes
Procedia PDF Downloads 64
552 Wearable Music: Generation of Costumes from Music and Generative Art and Wearing Them by 3-Way Projectors
Authors: Noriki Amano
Abstract:
The final goal of this study is to create another way for people to enjoy music, through the performance of 'Wearable Music'. Concretely speaking, we generate colorful costumes in real time from music and realize their wearing by projecting them onto a person. For this purpose, we propose three methods in this study: first, a method of giving color to music in a three-dimensional way; second, a method of generating images of costumes from music; and third, a method of wearing the images of music. In particular, this study stands out from other related work in that we generate images of unique costumes from music and make it possible to wear them. We use the technique of generative art to generate the images of unique costumes and project the images, using projectors, from three directions onto fog generated around a person. From this study, we obtain a way to enjoy music as something 'wearable'. Furthermore, it opens the prospect of unconventional entertainment based on the fusion of music and costumes.
Keywords: entertainment computing, costumes, music, generative programming
Procedia PDF Downloads 173
551 Performance Analysis and Optimization for Diagonal Sparse Matrix-Vector Multiplication on Machine Learning Unit
Authors: Qiuyu Dai, Haochong Zhang, Xiangrong Liu
Abstract:
Diagonal sparse matrix-vector multiplication is a well-studied topic in the fields of scientific computing and big data processing. However, when diagonal sparse matrices are stored in DIA format, there can be a significant number of padded zero elements and scattered points, which can degrade the performance of the current DIA kernel and lead to excessive consumption of computational and memory resources. In order to address these issues, the authors propose the DIA-Adaptive scheme and its kernel, which leverage the parallel instruction sets on the Machine Learning Unit (MLU). The researchers analyze the effect of allocating a varying number of threads, clusters, and hardware architectures on the performance of SpMV using different formats. The experimental results indicate that the proposed DIA-Adaptive scheme performs well and offers excellent parallelism.
Keywords: adaptive method, DIA, diagonal sparse matrices, MLU, sparse matrix-vector multiplication
Procedia PDF Downloads 134
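For readers unfamiliar with the DIA layout and its padding, a reference NumPy/SciPy version of the kernel (following SciPy's column-aligned storage convention) is sketched below; it illustrates the format only, not the authors' MLU implementation.

```python
import numpy as np
from scipy.sparse import dia_matrix

def dia_spmv(data, offsets, x):
    """Reference DIA-format SpMV (y = A @ x). In SciPy's convention,
    data[d, j] holds the element of diagonal offsets[d] lying in column j,
    so padded zeros occupy the out-of-range ends of each row of data."""
    N = x.shape[0]
    y = np.zeros_like(x, dtype=float)
    for d, off in enumerate(offsets):
        i = np.arange(max(0, -off), min(N, N - off))  # valid row range
        y[i] += data[d, i + off] * x[i + off]
    return y

N = 6
offsets = np.array([-1, 0, 2])                 # scattered diagonals -> padding
data = np.arange(1.0, 3 * N + 1).reshape(3, N)
A = dia_matrix((data, offsets), shape=(N, N))
x = np.ones(N)
assert np.allclose(dia_spmv(data, offsets, x), A @ x)
```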
550 Cracks Detection and Measurement Using VLP-16 LiDAR and Intel Depth Camera D435 in Real-Time
Authors: Xinwen Zhu, Xingguang Li, Sun Yi
Abstract:
Cracks are among the most common forms of damage in buildings, bridges, roads and other structures of various materials, and they may pose safety hazards. Traditional methods of manual detection and measurement, which are known to be subjective, time-consuming, and labor-intensive, are gradually unable to meet the needs of modern development. In addition, crack detection and measurement must be performed safely, considering space limitations and danger; intelligent crack detection has therefore become a necessary area of research. In this paper, an efficient method for crack detection and quantification using a 3D sensor, a LiDAR, and a depth camera is proposed. This method works even in a dark environment, which is common in real-world applications. The LiDAR rapidly spins to scan the surrounding environment and discovers cracks with laser returns thousands of times per second, providing a rich 3D point cloud in real time. The LiDAR provides quite accurate depth information: the distance of each point can be determined to within around ±3 cm, and top-of-the-range models can see beyond 100 m. However, this accuracy is still insufficient for some high-precision structural measurements. To measure crack depth much more accurately, the depth camera is needed, and the cracks are scanned by the depth camera at the same time. Finally, all data from the LiDAR and depth camera are analyzed, and the size of the cracks can be quantified successfully. The comparison shows that the minimum and mean absolute percentage errors between measured and calculated width are about 2.22% and 6.27%, respectively. The experiments and results are presented in this paper.
Keywords: LiDAR, depth camera, real-time, detection and measurement
Procedia PDF Downloads 224
549 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression
Authors: Anne M. Denton, Rahul Gomes, David W. Franzen
Abstract:
High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest, and any higher resolution is lost in this resampling. When the topographic features are computed through regression performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point; the number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance: any doubling of window size in each direction takes only a single pass over the data, corresponding to logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic of the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope; the relevant length scale is taken to be half of the window size of the window over which the minimum variance was achieved. The resulting process was evaluated on 1-meter DEM data and on artificial data constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm: the resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within the region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 images in ESRI ArcMap for each resolution. In summary, the proposed approach extracts the slope and aspect of DEMs at the length scales that are characteristic locally. The result is of higher resolution and less affected by noise than with existing techniques.
Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression
Procedia PDF Downloads 129
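A compact NumPy sketch of the additive-aggregation idea: per-pixel moment sums are pooled over non-overlapping 2x2 blocks, and a least-squares plane fit recovers the slope at each level. The variance-based scale selection is omitted here, and the toy DEM is an assumption.

```python
import numpy as np

def initial_sums(z):
    """Per-pixel moment sums for a 1x1 window; all of these are additive,
    so doubling the window is just a 2x2 sum of the previous level."""
    H, W = z.shape
    yy, xx = np.mgrid[0:H, 0:W].astype(float)
    ones = np.ones_like(z)
    return np.stack([ones, xx, yy, z, xx*xx, yy*yy, xx*yy, xx*z, yy*z])

def pool2x2(S):
    """Aggregate non-overlapping 2x2 blocks of every moment channel."""
    return S.reshape(S.shape[0], S.shape[1]//2, 2, S.shape[2]//2, 2).sum(axis=(2, 4))

def plane_slope(S):
    """Least-squares plane fit z = a + b*x + c*y from the moment sums;
    returns the slope magnitude sqrt(b^2 + c^2) per window."""
    n, Sx, Sy, Sz, Sxx, Syy, Sxy, Sxz, Syz = S
    out = np.zeros(n.shape)
    for idx in np.ndindex(n.shape):
        A = np.array([[n[idx],  Sx[idx],  Sy[idx]],
                      [Sx[idx], Sxx[idx], Sxy[idx]],
                      [Sy[idx], Sxy[idx], Syy[idx]]])
        rhs = np.array([Sz[idx], Sxz[idx], Syz[idx]])
        a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
        out[idx] = np.hypot(b, c)
    return out

# Toy 1 m DEM: a tilted plane plus noise; true slope is sqrt(.03^2+.04^2)=0.05.
rng = np.random.default_rng(0)
z = 0.03 * np.arange(64)[None, :] + 0.04 * np.arange(64)[:, None]
z = z + rng.normal(scale=0.05, size=(64, 64))
S = initial_sums(z)
for _ in range(3):             # window sizes 2, 4, 8 pixels per side
    S = pool2x2(S)
print(plane_slope(S).mean())   # approximately 0.05
```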
548 Managers’ Mobile Information Behavior in an Openness Paradigm Era
Authors: Abd Latif Abdul Rahman, Zuraidah Arif, Muhammad Faizal Iylia, Mohd Ghazali, Asmadi Mohammed Ghazali
Abstract:
Mobile information is a significant access point for human information activities. Theories and models of human information behavior have developed over several decades but have not yet considered the role of the user’s computing device in digital information interactions. This paper reviews the literature leading to the development of a conceptual framework for a study on managers’ mobile information behavior. Based on the literature review, dimensions of mobile information behavior are identified, namely information needs, information access, information retrieval, and information use. The study is significant, first, for understanding the nature of librarians’ behavior in searching, retrieving and using information via mobile devices; second, it would provide suggestions about the various kinds of mobile applications organizations can provide for their staff to improve their services.
Keywords: mobile information behavior, information behavior, mobile information, mobile devices
Procedia PDF Downloads 349
547 Multi-Scaled Non-Local Means Filter for Medical Images Denoising: Empirical Mode Decomposition vs. Wavelet Transform
Authors: Hana Rabbouch
Abstract:
In recent years, there has been considerable growth in denoising techniques, many devoted to medical imaging. This important evolution is due not only to the progress of computing techniques but also to the emergence of multi-resolution analysis (MRA) on both mathematical and algorithmic bases. In this paper, a comparative study is conducted between the two best-known MRA-based decomposition techniques: the Empirical Mode Decomposition (EMD) and the Discrete Wavelet Transform (DWT). The comparison is carried out in a framework of multi-scale denoising, where a Non-Local Means (NLM) filter is applied scale by scale to a sample of benchmark medical images. The results prove the effectiveness of the multi-scale denoising, especially when the NLM filtering is coupled with the EMD.
Keywords: medical imaging, non local means, denoising, multiscaled analysis, empirical mode decomposition, wavelets
Procedia PDF Downloads 141
546 Using Geospatial Analysis to Reconstruct the Thunderstorm Climatology for the Washington DC Metropolitan Region
Authors: Mace Bentley, Zhuojun Duan, Tobias Gerken, Dudley Bonsal, Henry Way, Endre Szakal, Mia Pham, Hunter Donaldson, Chelsea Lang, Hayden Abbott, Leah Wilcynzski
Abstract:
Air pollution has the potential to modify the lifespan and intensity of thunderstorms and the properties of lightning. Using data mining and geovisualization, we investigate how background climate and weather conditions shape variability in urban air pollution and how this, in turn, shapes thunderstorms as measured by the intensity, distribution, and frequency of cloud-to-ground lightning. A spatiotemporal analysis was conducted in order to identify thunderstorms using high-resolution lightning detection network data. Over seven million lightning flashes were used to identify more than 196,000 thunderstorms that occurred between 2006 and 2020 in the Washington, DC Metropolitan Region. Each lightning flash in the dataset was grouped into thunderstorm events by means of a temporal and spatial clustering algorithm. Once the thunderstorm event database was constructed, hourly wind direction, wind speed, and atmospheric thermodynamic data were added to the initiation and dissipation times and locations of the 196,000 identified thunderstorms. Hourly aerosol and air quality data for the thunderstorm initiation times and locations were also incorporated into the dataset. Developing thunderstorm climatologies using a lightning tracking algorithm and lightning detection network data was found to be useful for visualizing the spatial and temporal distribution of urban-augmented thunderstorms in the region.
Keywords: lightning, urbanization, thunderstorms, climatology
Procedia PDF Downloads 75
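A minimal sketch of such spatiotemporal flash clustering, assuming DBSCAN with the time axis rescaled to distance units; the algorithm choice, thresholds, and synthetic flash records are stand-ins for the paper's method and data.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_flashes(lon, lat, t_seconds, eps_km=15.0, eps_minutes=20.0):
    """Group lightning flashes into storm events with DBSCAN by scaling
    time so that eps_minutes of separation equals eps_km of distance
    (an assumed stand-in for the paper's clustering algorithm)."""
    km_x = 111.0 * np.cos(np.radians(lat)) * lon   # crude local projection
    km_y = 111.0 * lat
    t_scaled = (t_seconds / 60.0) * (eps_km / eps_minutes)
    X = np.column_stack([km_x, km_y, t_scaled])
    return DBSCAN(eps=eps_km, min_samples=10).fit_predict(X)

# Hypothetical flash records: two bursts separated in space and time.
rng = np.random.default_rng(1)
lon = np.r_[rng.normal(-77.0, 0.05, 500), rng.normal(-77.4, 0.05, 500)]
lat = np.r_[rng.normal(38.9, 0.05, 500), rng.normal(39.1, 0.05, 500)]
t = np.r_[rng.uniform(0, 3600, 500), rng.uniform(7200, 10800, 500)]
labels = cluster_flashes(lon, lat, t)
print("storms found:", labels.max() + 1)
```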
545 Cryptography and Cryptosystem a Panacea to Security Risk in Wireless Networking
Authors: Modesta E. Ezema, Chikwendu V. Alabekee, Victoria N. Ishiwu, Ifeyinwa NwosuArize, Chinedu I. Nwoye
Abstract:
The advent of wireless networking in computing technology cannot be overemphasized: it opened up easy accessibility to information resources, made networking easier, and brought internet accessibility to our doorsteps. Despite all this, however, it also introduced problems that are causing mayhem in today's overall information security, as cyber criminals will always compromise the integrity of a message that is not encrypted or that is encrypted with a weak algorithm. To correct this, the study focuses on cryptosystems and cryptography, which ensure end-to-end encrypted messaging. The study of various cryptographic algorithms, as well as the techniques and applications of cryptography for efficiency, were all considered in the work; present and future applications of cryptography were dealt with, and quantum cryptography was presented as the current and future direction in the development of cryptography. An empirical study was conducted to collect data from network users.
Keywords: algorithm, cryptography, cryptosystem, network
Procedia PDF Downloads 349
544 Automatic Number Plate Recognition System Based on Deep Learning
Authors: T. Damak, O. Kriaa, A. Baccar, M. A. Ben Ayed, N. Masmoudi
Abstract:
In the last few years, Automatic Number Plate Recognition (ANPR) systems have become widely used in safety, security, and commercial applications, and several methods and techniques have been developed to achieve better accuracy and real-time execution. This paper proposes a computer vision algorithm for Number Plate Localization (NPL) and Characters Segmentation (CS). In addition, it proposes an improved method for Optical Character Recognition (OCR) based on Deep Learning (DL) techniques. To identify the number on the plate detected by the NPL and CS steps, a Convolutional Neural Network (CNN) algorithm is proposed. A DL model is developed using four convolution layers, two max-pooling layers, and six fully connected layers. The model was trained on a number-image database on the NVIDIA Jetson TX2 target and achieved an accuracy of 95.84%.
Keywords: ANPR, CS, CNN, deep learning, NPL
Procedia PDF Downloads 306
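A hedged Keras sketch of a network with the stated topology (four convolution layers, two max-pooling layers, six fully connected layers); the filter counts, kernel sizes, input shape, and class count are assumptions, not the paper's values.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_anpr_cnn(input_shape=(32, 32, 1), n_classes=10):
    """Sketch matching the stated topology: 4 conv, 2 max-pool, 6 dense
    layers. All hyperparameters below are assumed, not from the paper."""
    m = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation='relu', padding='same'),
        layers.Conv2D(32, 3, activation='relu', padding='same'),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation='relu', padding='same'),
        layers.Conv2D(64, 3, activation='relu', padding='same'),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(256, activation='relu'),
        layers.Dense(128, activation='relu'),
        layers.Dense(64, activation='relu'),
        layers.Dense(64, activation='relu'),
        layers.Dense(32, activation='relu'),
        layers.Dense(n_classes, activation='softmax'),  # 6th dense layer
    ])
    m.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
    return m

model = build_anpr_cnn()
model.summary()
```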
543 The UAV Feasibility Trajectory Prediction Using Convolution Neural Networks
Authors: Adrien Marque, Daniel Delahaye, Pierre Maréchal, Isabelle Berry
Abstract:
Wind direction and uncertainty are crucial for aircraft and unmanned aerial vehicle trajectories. By computing wind covariance matrices at each spatial grid point, these spatial grids can be treated as images whose elements are symmetric positive definite matrices. A data pre-processing step, a specific convolution layer, a specific max-pooling layer, and a specific flatten layer are implemented to process such images. The neural network is then applied to spatial grids, whose elements are wind covariance matrices, to solve classification problems related to the feasibility of unmanned aerial vehicles based on wind direction and wind uncertainty.
Keywords: wind direction, uncertainty level, unmanned aerial vehicle, convolution neural network, SPD matrices
Procedia PDF Downloads 49
542 Application of Deep Learning in Top Pair and Single Top Quark Production at the Large Hadron Collider
Authors: Ijaz Ahmed, Anwar Zada, Muhammad Waqas, M. U. Ashraf
Abstract:
We demonstrate the performance of a very efficient tagger, based on deep neural network algorithms, applied to hadronically decaying top quark pairs as signal and compared with QCD multi-jet background events. A significant enhancement of performance in boosted top quark events is observed with our limited computing resources. We also compare modern machine learning approaches and perform a multivariate analysis of boosted top-pair as well as single top quark production through the weak interaction at a √s = 14 TeV proton-proton collider. The most relevant known background processes are incorporated. Through the techniques of Boosted Decision Trees (BDT), likelihood, and the Multilayer Perceptron (MLP), the analysis is trained and its performance is compared with the conventional cut-and-count approach.
Keywords: top tagger, multivariate, deep learning, LHC, single top
Procedia PDF Downloads 111