Search results for: and model-based techniques
6301 Smart Online Library Catalog System with Query Expansion for the University of the Cordilleras
Authors: Vincent Ballola, Raymund Dilan, Thelma Palaoag
Abstract:
The Smart Online Library Catalog System with Query Expansion seeks to address the low usage of the library caused by the emergence of the Internet. Library users are not accustomed to catalog systems that require a query to contain the exact words, without any mistakes, before decent results appear. The graphical user interface of the current system also has a steep learning curve. With a simple graphical user interface inspired by Google, users can search quickly just by entering their query and hitting the search button. Because of the query expansion techniques incorporated into the new system, such as stemming, thesaurus search, and weighted search, users obtain more relevant results from their queries. The system adds the root words of the user's query to the query itself, which is then cross-referenced against a thesaurus database to find synonyms that are also added to the query. The results are then ranked by the number of times each term has been searched. Online queries are also added to the results for additional references. Users showed notable increases in efficiency and usability thanks to the familiar interface and the query expansion techniques incorporated in the system. The simple yet familiar design led to a better user experience. Users also said that they would be more inclined to use the library because of the new system. The incorporation of query expansion techniques gives users a notable increase in results, which in turn gives them a wider range of resources found in the library. Used books mean more knowledge imparted to the users.
Keywords: query expansion, catalog system, stemming, weighted search, usability, thesaurus search
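The expansion pipeline described (stem each query term, then add thesaurus synonyms for the roots) can be sketched as follows. The crude suffix stripper and the tiny thesaurus dictionary here are simplified stand-ins for illustration, not the system's actual stemmer or thesaurus database:

```python
# Hypothetical sketch of the query expansion step; a real system would use
# a proper stemmer (e.g. Porter) and a full thesaurus database.
SUFFIXES = ("ing", "ed", "es", "s")
THESAURUS = {"book": ["volume", "publication"], "search": ["query", "lookup"]}

def stem(word):
    """Crude suffix stripping as a stand-in for a real stemmer."""
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def expand_query(query):
    """Add root words and thesaurus synonyms to the original terms."""
    terms = query.lower().split()
    expanded = list(terms)
    for term in terms:
        root = stem(term)
        if root not in expanded:
            expanded.append(root)
        for synonym in THESAURUS.get(root, []):
            if synonym not in expanded:
                expanded.append(synonym)
    return expanded
```

The expanded term list would then feed the weighted search, with popular terms ranked higher as the abstract describes.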
Procedia PDF Downloads 388
6300 Person Re-Identification Using Siamese Convolutional Neural Network
Authors: Sello Mokwena, Monyepao Thabang
Abstract:
In this study, we propose a comprehensive approach to address the challenges in person re-identification models. By combining a centroid tracking algorithm with a Siamese convolutional neural network model, our method excels in detecting, tracking, and capturing robust person features across non-overlapping camera views. The algorithm efficiently identifies individuals in the camera network, while the neural network extracts fine-grained global features for precise cross-image comparisons. The approach's effectiveness is further accentuated by leveraging the camera network topology for guidance. Our empirical analysis on benchmark datasets highlights its competitive performance, particularly evident when background subtraction techniques are selectively applied, underscoring its potential in advancing person re-identification techniques.
Keywords: camera network, convolutional neural network topology, person tracking, person re-identification, Siamese
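In a Siamese setup, the cross-image comparison reduces to a distance test between embeddings produced by two identical network branches. A minimal sketch of that verification step follows; the embedding network itself and the threshold value are assumptions, not the authors' trained model:

```python
import numpy as np

def same_person(embedding_a, embedding_b, threshold=1.0):
    """Verify a match by thresholding the Euclidean distance between
    two L2-normalised feature embeddings (smaller distance = same ID).
    The threshold here is an illustrative assumption."""
    a = embedding_a / np.linalg.norm(embedding_a)
    b = embedding_b / np.linalg.norm(embedding_b)
    return float(np.linalg.norm(a - b)) < threshold
```

In the described system, such a test would link detections handed over by the centroid tracker across non-overlapping camera views.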
Procedia PDF Downloads 72
6299 Presenting a Knowledge Mapping Model According to a Comparative Study on Applied Models and Approaches to Map Organizational Knowledge
Authors: Ahmad Aslizadeh, Farid Ghaderi
Abstract:
Mapping organizational knowledge is an innovative concept and a useful instrument for representing, capturing, and visualizing implicit and explicit knowledge. There is a diversity of methods, instruments, and techniques presented by different researchers for mapping organizational knowledge to reach determined goals. To apply these methods, it is necessary to know their requirements and the conditions under which they can be used. Integrating the identified methods of knowledge mapping and comparing them would help knowledge managers select the appropriate methods. This research was conducted to present a model and framework for mapping organizational knowledge. First, knowledge maps, their applications, and their necessity are introduced in order to extract a comparative framework and detect their structure. Next, the techniques of researchers such as Eppler, Kim, Egbu, Tandukar, and Ebner are presented and surveyed as knowledge mapping models. Finally, they are compared and a superior model is introduced.
Keywords: knowledge mapping, knowledge management, comparative study, business and management
Procedia PDF Downloads 403
6298 The Impact of Household Income on Students' Financial Literacy
Authors: Dorjana Nano
Abstract:
Financial literacy has become the focus of many research studies. Family household income is found to influence students' financial literacy. The purpose of this study is to explore whether the financial literacy of Albanian students is associated with their family household income. The main objectives of this research are: i) to evaluate how financially literate Albanian university students are; ii) to examine whether financial literacy differs based on the level of students' family income; and iii) to draw some conclusions and recommendations in order to improve students' financial literacy. An instrument comprising personal finance and personal characteristics items was administered to 637 students in Albania. The consistency of the survey was tested using dimension reduction and factor analysis techniques. One-way Welch ANOVA and multiple comparison techniques were utilized to analyze the data. The results indicate that students' financial literacy is influenced by their family income.
Keywords: financial literacy, household income, smart decisions, university students
Procedia PDF Downloads 272
6297 Rural Water Management Strategies and Irrigation Techniques for Sustainability: Nigeria Case Study, Kwara State
Authors: Faith Eweluegim Enahoro-Ofagbe
Abstract:
Water is essential for sustaining life. As a limited resource, effective water management is vital. Water scarcity has become more common due to the effects of climate change, land degradation, deforestation, and population growth, especially in rural communities, which are more susceptible to water-related issues such as water shortages and water-borne diseases due to the unsuccessful implementation of water policies and projects in Nigeria. Since rural communities generate the majority of agricultural products, they significantly impact water management for sustainability. Developing methods to advance this goal for residential and agricultural use, now and in the future, is a challenge for rural residents. This study evaluated rural water supply systems and irrigation management techniques for conserving water in Kwara State, North-Central Nigeria. It suggests measures to conserve water resources for sustainability, off-season farming, and socio-economic security that would remedy water degradation and unemployment, one of the causes of insecurity in the country, by considering the use of fabricated or locally made irrigation equipment that rural farmers can afford, among other recommendations. Questionnaires were distributed to respondents in the study area for a quantitative evaluation of irrigation practices. For physicochemical investigation, samples were also gathered from the available water sources. According to the study's findings, 30 percent of farmers adopted smart irrigation management techniques to conserve water resources, saving 45% of the water previously used for irrigation, while 70% of farmers practice seasonal farming. Irrigation water is drawn from river channels, streams, and unlined, unprotected wells. 60% of these rural residents rely on private boreholes for their water needs, while 40% rely on government-supplied rural water.
Therefore, the government must develop additional water projects, raise awareness, and offer irrigation techniques that are simple to adopt for water management, increasing socio-economic productivity, security, and water sustainability.
Keywords: water resource management, sustainability, irrigation, rural water management, irrigation management technique
Procedia PDF Downloads 111
6296 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm
Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin
Abstract:
Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created using different techniques, such as a detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. The disadvantage of this technique is that the reconstructed image often has poor quality due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique. In this method, the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with a known amplitude distribution, often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by using this method.
Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform
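The iteration described, enforcing unit amplitude in the hologram plane and the target amplitude in the Fourier plane while keeping the phase each time, can be sketched in a few lines of NumPy. The square target and iteration count below are illustrative choices, not the paper's test images:

```python
import numpy as np

def gerchberg_saxton(target_amp, iterations=50, seed=0):
    """Optimise a kinoform: find a phase-only hologram whose Fourier
    transform approximates the target amplitude distribution."""
    rng = np.random.default_rng(seed)
    # Start from a random phase estimate in the hologram plane.
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iterations):
        # Hologram plane: impose unit amplitude (phase-only constraint).
        field = np.exp(1j * phase)
        # Propagate to the reconstruction (Fourier) plane.
        recon = np.fft.fft2(field)
        # Impose the target amplitude, keep the reconstructed phase.
        recon = target_amp * np.exp(1j * np.angle(recon))
        # Propagate back and keep only the phase.
        phase = np.angle(np.fft.ifft2(recon))
    return phase
```

Each pass cannot increase the amplitude error (the error-reduction property of GS), which is what steadily improves the kinoform's reconstruction quality.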
Procedia PDF Downloads 533
6295 Automatic Calibration of Agent-Based Models Using Deep Neural Networks
Authors: Sima Najafzadehkhoei, George Vega Yon
Abstract:
This paper presents an approach for calibrating Agent-Based Models (ABMs) efficiently, utilizing Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks. These machine learning techniques are applied to Susceptible-Infected-Recovered (SIR) models, which are a core framework in the study of epidemiology. Our method recovers parameter values from observed trajectory curves, enhancing the accuracy of predictions when compared to traditional calibration techniques. Through the use of simulated data, we train the models to predict epidemiological parameters more accurately. Two primary approaches were explored: one where the numbers of susceptible, infected, and recovered individuals are fully known, and another using only the number of infected individuals. Our method shows promise for application in other ABMs where calibration is computationally intensive and expensive.
Keywords: ABM, calibration, CNN, LSTM, epidemiology
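The training data for such a calibrator comes from simulated SIR trajectories. The paper's models are agent-based, so the minimal deterministic discrete-time SIR stepper below is a simplified stand-in for the kind of curves the networks learn to invert, not the authors' simulator:

```python
def simulate_sir(beta, gamma, s0, i0, steps):
    """Discrete-time SIR in population fractions: beta is the
    transmission rate, gamma the recovery rate per time step."""
    s, i, r = s0, i0, 1.0 - s0 - i0
    trajectory = [(s, i, r)]
    for _ in range(steps):
        new_infections = beta * s * i
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        trajectory.append((s, i, r))
    return trajectory
```

Calibration then amounts to recovering (beta, gamma) from such a trajectory, which is exactly the mapping the CNN/LSTM is trained to approximate.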
Procedia PDF Downloads 24
6294 The Study of Dengue Fever Outbreak in Thailand Using Geospatial Techniques, Satellite Remote Sensing Data and Big Data
Authors: Tanapat Chongkamunkong
Abstract:
The objective of this paper is to present a practical use of Geographic Information Systems (GIS) in public health, based on the spatial correlation between multiple factors and dengue fever outbreaks. Meteorological, demographic, and environmental factors are compiled using GIS techniques along with Global Satellite Mapping remote sensing (RS) data. We use monthly dengue fever cases, population density, precipitation, and Digital Elevation Model (DEM) data. The study covers an area under the climate influence of the El Niño–Southern Oscillation (ENSO), indicated by sea surface temperature (SST), across 12 provinces of Thailand, with RS data from January 2007 to December 2014.
Keywords: dengue fever, sea surface temperature, Geographic Information System (GIS), remote sensing
Procedia PDF Downloads 198
6293 Impact of Minimalism in Dance Education on the Development of Aesthetic Sensibilities
Authors: Meghamala Nugehally
Abstract:
This paper hypothesises and draws inferences about the impact of minimalism in dance education on the development of artistic and aesthetic sensibilities in individuals aged 5 to 18 years. The research and conclusions are set within the context of Indian classical dance, which is based on Indian theories of aesthetics drawn from the Natyashastra, an ancient treatise on Indian dance and drama. The research employs training methods handed down through a strict one-on-one teacher-student tradition known as the Guru-Shishya Parampara. The aesthetic principles used are defined, and basic theories from the Natyashastra are explained to provide background for the research design. The paper also discusses dance curriculum design and training methodology design within the context of these aesthetic theories. The scope of the research is limited to two genres of Indian classical dance: Bharatanatyam and Odissi. A brief description of these dance forms is given as background, and dance aesthetics specific to these forms are described. The research design includes individual case studies of the subjects, independent predetermined attributes for observation, and a qualitative scoring methodology devised for the purpose of the study. The study describes the training techniques used and contrasts minimal solo training techniques with the more elaborate group training techniques. How the study groups were divided, and the basis for the division, are discussed. Study observations are recorded and presented as evidence. The results inform the conclusion and set the stage for further research in this area.
Keywords: dance aesthetics, dance education, Indian classical dance, minimalism
Procedia PDF Downloads 228
6292 Training a Neural Network Using Input Dropout with Aggressive Reweighting (IDAR) on Datasets with Many Useless Features
Authors: Stylianos Kampakis
Abstract:
This paper presents a new algorithm for neural networks called "Input Dropout with Aggressive Re-weighting" (IDAR), aimed specifically at datasets with many useless features. IDAR combines two techniques (dropout of input neurons and aggressive re-weighting) in order to eliminate the influence of noisy features. The technique can be seen as a generalization of dropout. The algorithm is tested on two different benchmark data sets: a noisy version of the iris dataset and the MADELON data set. Its performance is compared against three other popular techniques for dealing with useless features: L2 regularization, LASSO, and random forests. The results demonstrate that IDAR can be an effective technique for handling data sets with many useless features.
Keywords: neural networks, feature selection, regularization, aggressive reweighting
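The input-dropout half of IDAR builds on standard (inverted) dropout applied to the input layer, sketched below. The aggressive re-weighting of features is the paper's own scheme and is not reproduced here:

```python
import numpy as np

def input_dropout(X, keep_prob, rng):
    """Inverted dropout on input features: each value is kept with
    probability keep_prob and rescaled by 1/keep_prob so the expected
    activation seen by the next layer is unchanged."""
    mask = rng.random(X.shape) < keep_prob
    return X * mask / keep_prob
```

IDAR's generalization, as the abstract describes it, would steer the per-feature drop probabilities so that noisy features are suppressed rather than dropped uniformly at random.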
Procedia PDF Downloads 455
6291 A Case Study of Physical and Psychological Forces in the Nigerian Criminal and Military Interrogations
Authors: Onimisi Ekuh Abdullahi, Lasbat Omoshalewa Akinsemoyin
Abstract:
In Nigeria, for over two decades now, there has been a steady increase in the insecurity of human lives and physical property. In South-South Nigeria, there is acute insecurity from militants destroying oil pipelines and from kidnapping; in the Middle Belt zone, insecurity centres on kidnapping, and in a few states crises between herdsmen and farmers rage like wildfire; in the South-West zone, kidnapping is rife; in the North-East zone, the issue of Boko Haram has become a worldwide concern; and in the North-West zone, cattle rustlers and religious crises are of great concern. At the initial stage, the Nigerian Police Force was called upon to quell the crisis. It soon became obvious that the dimension of the crisis was beyond the police force. The Nigerian Armed Forces were called in to maintain peace and order because the magnitude of the crisis was threatening national unity and cohesion. The main objective of this paper was to examine the military's techniques for interrogating criminals in Nigeria: specifically, to examine the use of physical and psychological force and of abusive techniques and tactics, and to suggest modern psychological techniques of interrogating criminals that are acceptable to human rights activists and the rule of law. The aim is to create room for behaviour and practices that carefully monitor the trustworthiness and reliability of admissions produced by psychologically manipulative processes in Nigeria.
Keywords: military, Nigerian criminal, physical, psychological force
Procedia PDF Downloads 160
6290 Perceptions on Development of the Deaf in Higher Education Level: The Case of Special Education Students in Tiaong, Quezon, Philippines
Authors: Ashley Venerable, Rosario Tatlonghari
Abstract:
This study identified how deaf college students of the Bartimaeus Center for Alternative Learning in Tiaong, Quezon, Philippines view development, using visual communication techniques and generating themes from their responses. Complete enumeration was employed. Guided by the constructivist theory of perception, past experiences and stored information influenced perception. These themes of development emerged: social development; pleasant environment; interpersonal relationships; availability of resources; employment; infrastructure development; values; and peace and security. Using the National Economic and Development Authority development indicators, findings showed the deaf students' views on development were similar to mainstream views. Responses also became more meaningful through visual communication techniques.
Keywords: deaf, development, perception, development indicators, visual communication
Procedia PDF Downloads 431
6289 Novel Formal Verification Based Coverage Augmentation Technique
Authors: Surinder Sood, Debajyoti Mukherjee
Abstract:
Formal verification techniques have become widely popular in pre-silicon verification as an alternative to constrained-random simulation-based techniques. This paper proposes a novel formal verification-based coverage augmentation technique for verifying complex RTL functionality faster. The proposed approach relies on augmenting coverage analysis coming from simulation with that from formal verification. Besides this, the functional qualification framework not only helps in improving coverage at a faster pace but also aids in maturing and qualifying the formal verification infrastructure. The proposed technique has helped achieve faster verification sign-off, resulting in faster time-to-market. The design chosen had a complex control and data path and many configurable options to meet multiple specification needs. The flow is generic and tool-independent, making it much easier to leverage across projects and designs.
Keywords: COI (cone of influence), coverage, formal verification, fault injection
Procedia PDF Downloads 124
6288 An Outsourcing System Model for the Thai Electrical Appliances Industry
Authors: Sudawan Somjai
Abstract:
The purpose of this paper was to find an appropriate outsourcing system model for the Thai electrical appliances industry. The objective was to increase the competitive capability of the industry through an outsourcing system. The population for this study was the staff of 10 selected companies in the Thai electrical appliances industry located in Bangkok and the eastern part of Thailand. Data collecting techniques included in-depth interviews, focus groups, and storytelling techniques. The data was collected from 5 key informants from each company, making a total of 50 informants. The findings revealed that an outsourcing model would consist of important factors including the outsourcing system, labor flexibility, capability of business processes, manpower management efficiency, cost reduction, business risk elimination, core competency, and competitiveness. Different suggestions were made as well in this research paper.
Keywords: outsourcing system, model, Thailand, electrical appliances industry
Procedia PDF Downloads 590
6287 Understanding and Improving Neural Network Weight Initialization
Authors: Diego Aguirre, Olac Fuentes
Abstract:
In this paper, we present a taxonomy of weight initialization schemes used in deep learning. We survey the most representative techniques in each class and compare them in terms of overhead cost, convergence rate, and applicability. We also introduce a new weight initialization scheme. In this technique, we perform an initial feedforward pass through the network using an initialization mini-batch. Using statistics obtained from this pass, we initialize the weights of the network so that the following properties are met: 1) weight matrices are orthogonal; 2) ReLU layers produce a predetermined number of non-zero activations; 3) the output produced by each internal layer has unit variance; 4) weights in the last layer are chosen to minimize the error on the initial mini-batch. We evaluate our method on three popular architectures, and faster convergence rates are achieved on the MNIST, CIFAR-10/100, and ImageNet datasets when compared to state-of-the-art initialization techniques.
Keywords: deep learning, image classification, supervised learning, weight initialization
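Property 1 (orthogonal weight matrices) is commonly obtained by QR-orthogonalising a Gaussian draw; a minimal sketch follows. The data-dependent steps 2-4, which use the initialization mini-batch statistics, are specific to the paper and not shown:

```python
import numpy as np

def orthogonal_init(n_in, n_out, rng):
    """Draw a Gaussian matrix and orthogonalise its columns via QR,
    giving W with W.T @ W = I (requires n_in >= n_out)."""
    a = rng.standard_normal((n_in, n_out))
    q, r = np.linalg.qr(a)
    # Sign correction makes the factorisation unique (uniform over
    # orthogonal matrices).
    q = q * np.sign(np.diag(r))
    return q
```

Orthogonal columns preserve the norm of forward activations, which is one reason such initializations improve early convergence.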
Procedia PDF Downloads 135
6286 Anomaly Detection of Log Analysis Using Data Visualization Techniques for Digital Forensics Audit and Investigation
Authors: Mohamed Fadzlee Sulaiman, Zainurrasyid Abdullah, Mohd Zabri Adil Talib, Aswami Fadillah Mohd Ariffin
Abstract:
In common digital forensics cases, an investigation may rely on the analysis of specific and relevant exhibits. Usually the investigating officer defines and advises the digital forensic analyst about the goals and objectives to be achieved in reconstructing the trail of evidence while maintaining the specific scope of the investigation. With the growth of technology, people are starting to realize the importance of cyber security to their organization, and this new perspective creates awareness that digital forensics auditing must be in place in order to measure possible threats or attacks on their cyber-infrastructure. Instead of performing investigations on an incident basis, auditing may broaden the scope of investigation to anomaly detection in the daily operation of an organization's cyber space. When handling huge amounts of data such as log files, performing a digital forensics audit for a large organization proves to be an onerous task for the analyst, whether in analyzing the huge files or in translating the findings in a way the stakeholders can clearly understand. Data visualization can be emphasized in conducting digital forensic audits and investigations to address both needs. This study will identify the important factors that should be considered when applying data visualization techniques in order to detect anomalies that meet the digital forensic audit and investigation objectives.
Keywords: digital forensics, data visualization, anomaly detection, log analysis, forensic audit, visualization techniques
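As a minimal illustration of the kind of count-based anomaly detection that would feed such a visualization (a generic stand-in, not the study's method), daily log event counts can be flagged with a simple z-score test before being plotted:

```python
import statistics

def flag_anomalies(counts, z_thresh=3.0):
    """Return indices of days whose event count deviates from the mean
    by more than z_thresh population standard deviations."""
    mean = statistics.fmean(counts)
    sd = statistics.pstdev(counts)
    if sd == 0:
        return []  # no variation, nothing to flag
    return [i for i, c in enumerate(counts) if abs(c - mean) / sd > z_thresh]
```

The flagged indices would then be highlighted in the visualization so stakeholders can see at a glance which days warrant deeper forensic analysis.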
Procedia PDF Downloads 287
6285 Validation of Asymptotic Techniques to Predict Bistatic Radar Cross Section
Authors: M. Pienaar, J. W. Odendaal, J. C. Smit, J. Joubert
Abstract:
Simulations are commonly used to predict the bistatic radar cross section (RCS) of military targets, since characterization measurements can be expensive and time-consuming. It is thus important to accurately predict the bistatic RCS of targets. Computational electromagnetic (CEM) methods can be used for bistatic RCS prediction. CEM methods are divided into full-wave and asymptotic methods. Full-wave methods are numerical approximations to the exact solution of Maxwell's equations. These methods are very accurate but are computationally very intensive and time-consuming. Asymptotic techniques make simplifying assumptions in solving Maxwell's equations and are thus less accurate, but require less computational resources and time. Asymptotic techniques can therefore be very valuable for predicting the bistatic RCS of electrically large targets, due to the decreased computational requirements. This study extends previous work by validating the accuracy of asymptotic techniques for predicting bistatic RCS through comparison with full-wave simulations as well as measurements. Validation is done with canonical structures as well as complex, realistic aircraft models, instead of only looking at a complex slicy structure. The slicy structure is a combination of canonical structures, including cylinders, corner reflectors, and cubes. Validation is done over large bistatic angles and at different polarizations. Bistatic RCS measurements were conducted in a compact range at the University of Pretoria, South Africa. The measurements were performed at different polarizations from 2 GHz to 6 GHz. Fixed bistatic angles of β = 30.8°, 45° and 90° were used. The measurements were calibrated with an active calibration target. The EM simulation tool FEKO was used to generate simulated results. The full-wave multi-level fast multipole method (MLFMM) simulated results, together with the measured data, were used as the reference for validation.
The accuracy of physical optics (PO) and geometrical optics (GO) was investigated. Differences relating to amplitude, lobing structure, and null positions were observed between the asymptotic, full-wave, and measured data. PO and GO were more accurate at angles close to the specular scattering directions, and the accuracy seemed to decrease as the bistatic angle increased. At large bistatic angles PO did not perform well, due to the shadow regions not being treated appropriately. PO also did not perform well for canonical structures where multi-bounce was the main scattering mechanism. PO and GO do not account for diffraction, but these inaccuracies tended to decrease as the electrical size of the objects increased. It was evident that both asymptotic techniques do not properly account for bistatic structural shadowing. Specular scattering was calculated accurately even when targets did not meet the electrically large criterion. It was evident that the bistatic RCS prediction performance of PO and GO depends on incident angle, frequency, target shape, and observation angle. The improved computational efficiency of the asymptotic solvers yields a major advantage over full-wave solvers and measurements; however, there is still much room for improvement in the accuracy of these asymptotic techniques.
Keywords: asymptotic techniques, bistatic RCS, geometrical optics, physical optics
Procedia PDF Downloads 258
6284 Estimation of Coefficients of Ridge and Principal Components Regressions with Multicollinear Data
Authors: Rajeshwar Singh
Abstract:
Multicollinearity commonly arises when several explanatory variables are handled simultaneously, because they exhibit a linear relationship among themselves. A great problem then arises in understanding the impact of the explanatory variables on the dependent variable, and the method of least squares gives imprecise estimates. In this case, it is advisable to detect its presence first before proceeding further. Ridge regression reduces the degree to which multicollinearity affects the estimates, while principal components regression gives good estimates in this situation. This paper discusses the well-known techniques of ridge and principal components regression and applies both to obtain estimates of the coefficients. In addition, this paper also discusses the conflicting claims on the discovery of the method of ridge regression, based on the available documents.
Keywords: conflicting claim on credit of discovery of ridge regression, multicollinearity, principal components and ridge regressions, variance inflation factor
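For reference, the ridge estimator stabilises the coefficients by adding a penalty lam to the diagonal of X'X before inverting, which is exactly what makes it usable when X'X is near-singular under multicollinearity. A minimal closed-form sketch:

```python
import numpy as np

def ridge_coefficients(X, y, lam):
    """Closed-form ridge estimator: (X'X + lam*I)^(-1) X'y.
    lam = 0 recovers ordinary least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

Increasing lam shrinks the coefficient vector toward zero, trading a little bias for a large reduction in variance when the explanatory variables are collinear.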
Procedia PDF Downloads 419
6283 Longitudinal Static and Dynamic Stability of a Typical Reentry Body in Subsonic Conditions Using Computational Fluid Dynamics
Authors: M. Jathaveda, Joben Leons, G. Vidya
Abstract:
Reentry from orbit is a critical phase of the entry trajectory. For a non-propulsive ballistic entry, static and dynamic stability play an important role in the trajectory, especially for the safe deployment of parachutes, which typically occurs at subsonic Mach numbers. The static stability of flight vehicles is routinely estimated through CFD techniques. Advances in CFD software as well as computational facilities have enabled the estimation of the dynamic stability derivatives through CFD techniques as well. The longitudinal static and dynamic stability of a typical reentry body at a subsonic Mach number of 0.6 is predicted using the commercial software CFD++ and presented here. Steady-state simulations are carried out for α = 2° on an unstructured grid using the SST k-ω model. Transient simulation using the forced oscillation method is used to compute the pitch damping derivatives.
Keywords: stability, typical reentry body, subsonic, static and dynamic
Procedia PDF Downloads 116
6282 A Sui Generis Technique to Detect Pathogens in Post-Partum Breast Milk Using Image Processing Techniques
Authors: Yogesh Karunakar, Praveen Kandaswamy
Abstract:
Mother's milk is the most superior source of nutrition for a child; there is no substitute for it. Postpartum secretions like breast milk can be analyzed on the go to test for the presence of any harmful pathogen before a mother feeds the child or donates the milk to a milk bank. Since breast feeding is one of the main routes of transmission of diseases to the newborn, it is mandatory to test the secretions. In this paper, we describe the detection of pathogens like E. coli, Human Immunodeficiency Virus (HIV), Hepatitis B (HBV), Hepatitis C (HCV), Cytomegalovirus (CMV), Zika, and Ebola virus through an innovative method in which we are developing a unique chip for testing the mother's milk sample. The chip will contain an antibody specific to the target pathogen that shows a color change if enough pathogens are present in the fluid to be considered dangerous. A smartphone camera will then acquire an image of the strip, and using various image processing techniques we will detect the color development due to the antigen-antibody interaction within 5 minutes, thereby adding no delay before the newborn is fed or before the collection of the milk for the milk bank. If the target pathogen comes back positive through this method, the health care provider can provide adequate treatment to bring down the number of pathogens. This will reduce the postpartum-related mortality and morbidity that arise from feeding infectious breast milk to one's own child.
Keywords: postpartum, fluids, camera, HIV, HCV, CMV, Zika, Ebola, smart-phones, breast milk, pathogens, image processing techniques
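A minimal sketch of the colour-development check on the captured strip image follows. The channel-ratio rule and the threshold value are illustrative assumptions for a reddish colour change, not the chip's actual chemistry or the authors' algorithm:

```python
import numpy as np

def strip_positive(rgb_roi, red_ratio_thresh=1.3):
    """Classify a test-strip region of interest (H x W x 3 RGB array) as
    positive when its mean red channel clearly dominates the green
    channel, a crude proxy for antigen-antibody colour development.
    The ratio rule and threshold are hypothetical."""
    mean_red = rgb_roi[..., 0].mean()
    mean_green = rgb_roi[..., 1].mean()
    return bool(mean_red / max(mean_green, 1e-9) > red_ratio_thresh)
```

A production pipeline would also locate the strip in the photo, correct for illumination, and calibrate the threshold against known pathogen concentrations.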
Procedia PDF Downloads 222
6281 Robust Features for Impulsive Noisy Speech Recognition Using Relative Spectral Analysis
Authors: Hajer Rahali, Zied Hajaiej, Noureddine Ellouze
Abstract:
The goal of speech parameterization is to extract the relevant information about what is being spoken from the audio signal. In speech recognition systems, Mel-Frequency Cepstral Coefficients (MFCC) and Relative Spectral Mel-Frequency Cepstral Coefficients (RASTA-MFCC) are the two main techniques used. This paper presents some modifications to the original MFCC method. In our work, the effectiveness of the proposed changes to MFCC, called Modified Function Cepstral Coefficients (MODFCC), was tested and compared against the original MFCC and RASTA-MFCC features. Prosodic features such as jitter and shimmer are added to the baseline spectral features. The above-mentioned techniques were tested on impulsive signals under various noisy conditions within the AURORA databases.
Keywords: auditory filter, impulsive noise, MFCC, prosodic features, RASTA filter
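The prosodic features mentioned, jitter and shimmer, are standard perturbation measures: the relative cycle-to-cycle variation of pitch periods and of peak amplitudes, respectively. A minimal sketch of the local variants follows; extracting the pitch periods and peak amplitudes from the waveform is assumed and not shown:

```python
import statistics

def jitter(periods):
    """Local jitter: mean absolute difference between consecutive
    pitch periods, relative to the mean period."""
    diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
    return statistics.fmean(diffs) / statistics.fmean(periods)

def shimmer(amplitudes):
    """Local shimmer: the analogous measure on cycle peak amplitudes."""
    diffs = [abs(a - b) for a, b in zip(amplitudes, amplitudes[1:])]
    return statistics.fmean(diffs) / statistics.fmean(amplitudes)
```

Appending these two scalars (per frame or per utterance) to the cepstral feature vector is one common way of combining prosodic and spectral information.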
Procedia PDF Downloads 425
6280 Evaluation of the Performance of Solar Stills as an Alternative for Brine Treatment Applying the Monte Carlo Ray Tracing Method
Authors: B. E. Tarazona-Romero, J. G. Ascanio-Villabona, O. Lengerke-Perez, A. D. Rincon-Quintero, C. L. Sandoval-Rodriguez
Abstract:
Desalination offers solutions to the world's water shortage; however, the process of eliminating salts generates a by-product known as brine, generally released into the environment using techniques that mitigate its impact. Brine treatment techniques are vital to developing an environmentally sustainable desalination process. Consequently, this document evaluates three different geometric configurations of solar stills as an alternative for brine treatment to be integrated into a small-scale desalination process. The geometric scenarios studied were selected because they have characteristics that fit the concept of appropriate technology: low cost, labor and material resources suited to local manufacturing, modularity, and simplicity of construction. Additionally, the conceptual design of the collectors was carried out, and the ray tracing methodology was applied through the open-access software SolTrace and Tonatiuh. The simulation process used 600.00 rays and varied two input parameters: direct normal irradiance (DNI) and reflectance. In summary, for the scenarios evaluated, the ladder-type distiller presented higher efficiency values than the pyramid-type and single-slope collectors. Finally, the efficiency of the collectors studied was directly related to their geometry; that is, larger geometries receive a greater number of solar rays along various paths, affecting the efficiency of the device.
Keywords: appropriate technology, brine treatment techniques, desalination, monte carlo ray tracing
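The Monte Carlo idea behind tools like SolTrace and Tonatiuh can be illustrated with a toy example. The geometry below (random sun rays intercepting a tilted flat absorber, with the intercepted fraction scaling as the cosine of the tilt) is a deliberately simplified stand-in, not the still geometries simulated in the paper.

```python
import math
import random

# Toy Monte Carlo ray-tracing step: fire random rays at a unit aperture
# and count those landing on the projected width of an absorber tilted
# by `tilt_deg` degrees. The projected width shrinks by cos(tilt).

def intercept_fraction(tilt_deg, n_rays=100_000, seed=42):
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    tilt = math.radians(tilt_deg)
    hits = 0
    for _ in range(n_rays):
        x = rng.random()              # ray's landing point on the aperture
        if x < math.cos(tilt):        # within the projected absorber width
            hits += 1
    return hits / n_rays

print(intercept_fraction(0.0))    # ~1.0: rays arrive normal to the absorber
print(intercept_fraction(60.0))   # ~0.5, i.e. cos(60 deg)
```

Real ray tracers add reflection, absorption, and surface-error statistics per ray, but the estimate-by-counting structure is the same.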
Procedia PDF Downloads 71
6279 Numerical Modeling for Water Engineering and Obstacle Theory
Authors: Mounir Adal, Baalal Azeddine, Afifi Moulay Larbi
Abstract:
Numerical analysis is a branch of mathematics devoted to the development of iterative matrix calculation techniques. The objective is to optimize these operations so that systems of equations of order n can be solved with savings in time and energy on computers tasked with analyzing big data through matrix equations. Furthermore, this scientific discipline produces results with a margin of approximation error characterized by convergence rates. Thus, the results obtained from numerical analysis techniques run in software such as MATLAB or Simulink offer a preliminary diagnosis of the situation of the environment or of the target space. In this way, we can provide the technical procedures needed for engineering or scientific studies and make them exploitable by water engineers.
Keywords: numerical analysis methods, obstacles solving, engineering, simulation, numerical modeling, iteration, computer, MATLAB, water, underground, velocity
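A concrete instance of the iterative matrix techniques the abstract refers to is the Jacobi method, which trades a direct O(n^3) solve of A x = b for cheap repeated sweeps that converge on diagonally dominant systems. The sketch below, including the 2x2 example system, is generic and not taken from the paper.

```python
# Jacobi iteration for A x = b: each sweep recomputes every unknown
# from the previous iterate, using only the other unknowns' old values.

def jacobi(A, b, iterations=100):
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

# Diagonally dominant example: 4x + y = 9 and x + 3y = 5, solution (2, 1).
solution = jacobi([[4.0, 1.0], [1.0, 3.0]], [9.0, 5.0])
print(solution)  # converges to approximately [2.0, 1.0]
```

In practice the loop would stop once successive iterates agree to a tolerance (that tolerance is exactly the "margin of approximation error" the abstract mentions) rather than after a fixed iteration count.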
Procedia PDF Downloads 462
6278 Quantitative Comparisons of Different Approaches for Rotor Identification
Authors: Elizabeth M. Annoni, Elena G. Tolkacheva
Abstract:
Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia and a known prognostic marker for stroke, heart failure, and death. Reentrant mechanisms of rotor formation, which are stable electrical sources of cardiac excitation, are believed to cause AF. No existing commercial mapping system has been demonstrated to consistently and accurately predict rotor locations outside of the pulmonary veins in patients with persistent AF. There is a clear need for robust spatio-temporal techniques that can consistently identify rotors from unique characteristics of the electrical recordings at the pivot point and that can be applied to clinical intracardiac mapping. Recently, we have developed four new signal analysis approaches – Shannon entropy (SE), Kurtosis (Kt), multi-scale frequency (MSF), and multi-scale entropy (MSE) – to identify the pivot points of rotors. These proposed techniques utilize cardiac signal characteristics other than local activation to uncover the intrinsic complexity of the electrical activity in the rotors, which current mapping methods do not take into account. We validated these techniques using high-resolution optical mapping experiments in which direct visualization and identification of rotors in ex-vivo Langendorff-perfused hearts were possible. Episodes of ventricular tachycardia (VT) were induced using burst pacing, and two rotor examples were used: a 3-sec episode of a single stationary rotor and a 3-sec episode of figure-8 reentry with one stationary and one meandering rotor. Movies were captured at 600 frames per second for 3 sec at 64x64-pixel resolution. These optical mapping movies were used to evaluate the performance and robustness of the SE, Kt, MSF and MSE techniques with respect to the following clinical limitations: shorter recordings, lower spatial resolution, and the presence of meandering rotors.
To quantitatively compare the results, the SE, Kt, MSF and MSE techniques were compared to the “true” rotor(s) identified using the phase map. Accuracy was calculated for each approach as the duration of the time series and the spatial resolution were reduced. The time-series duration was decreased from its original length of 3 sec down to 2, 1, and 0.5 sec. The spatial resolution of the original VT episodes was decreased from 64x64 pixels to 32x32, 16x16, and 8x8 pixels by uniformly removing pixels from the optical mapping video. Our results demonstrate that Kt, MSF and MSE were able to accurately identify the pivot point of the rotor under all three clinical limitations. The MSE approach demonstrated the best overall performance, but Kt was the best at identifying the pivot point of the meandering rotor. Artifacts mildly affected the performance of the Kt, MSF and MSE techniques but had a strong negative impact on the performance of SE. The results of our study motivate further validation of the SE, Kt, MSF and MSE techniques using intra-atrial electrograms from paroxysmal and persistent AF patients to see if these approaches can identify pivot points in a clinical setting. More accurate rotor localization could significantly increase the efficacy of catheter ablation to treat AF, resulting in a higher success rate for single procedures.
Keywords: atrial fibrillation, optical mapping, signal processing, rotors
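One of the four measures, Shannon entropy, can be sketched for a single-pixel time series: at a rotor's pivot the signal is irregular, so its amplitude histogram spreads across many bins and entropy is high, whereas far from the pivot the periodic signal concentrates in few bins. The binning scheme below is a plausible but assumed detail, since the abstract does not specify one.

```python
import math
from collections import Counter

# Shannon entropy of one pixel's amplitude distribution, via a simple
# equal-width histogram over the signal's own range.

def shannon_entropy(signal, bins=8):
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0   # guard against a constant signal
    idx = [min(int((v - lo) / width), bins - 1) for v in signal]
    counts = Counter(idx)
    n = len(signal)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

flat = [1.0] * 16                      # constant signal: zero entropy
spread = [float(i % 8) for i in range(16)]  # fills all 8 bins evenly
print(shannon_entropy(flat), shannon_entropy(spread))
```

A pivot map would apply this per pixel across the movie and take the location of the maximum as the candidate pivot point.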
Procedia PDF Downloads 324
6277 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, while having been studied for several decades, continues to be an active area of research. The goal is to find correspondences between elements in a pair of stereoscopic images. From these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how well each representation reduces the cost of the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
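The cost-function idea at the core of the comparison can be sketched with the simplest representation, a grayscale scanline, and a sum-of-absolute-differences (SAD) cost with winner-takes-all disparity selection. The synthetic rows are invented for the example; swapping in another representation (RGB, gradients, census) would change only how the cost compares the data.

```python
# Local stereo matching on a single scanline with a SAD cost.

def sad_cost(left, right, x, d, half=1):
    """SAD between a window centred at x in `left` and at x-d in `right`."""
    return sum(abs(left[x + k] - right[x - d + k])
               for k in range(-half, half + 1))

def best_disparity(left, right, x, max_d, half=1):
    """Winner-takes-all: pick the disparity with the lowest matching cost."""
    return min(range(max_d + 1),
               key=lambda d: sad_cost(left, right, x, d, half))

# Synthetic scanline: the left row is the right row shifted by 2 pixels.
right_row = [10, 10, 50, 90, 50, 10, 10, 10]
left_row  = [10, 10, 10, 10, 50, 90, 50, 10]
print(best_disparity(left_row, right_row, x=5, max_d=3))  # -> 2
```

A full disparity map repeats this at every pixel; the paper's question is which pixel representation makes the correct disparity's cost stand out most clearly against the wrong ones.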
Procedia PDF Downloads 370
6276 Iterative Segmentation and Application of Hausdorff Dilation Distance in Defect Detection
Authors: S. Shankar Bharathi
Abstract:
Inspection of surface defects on metallic components has always been challenging due to their specular property. Defects such as scratches, rust, and pitting commonly arise on metallic surfaces during the manufacturing process. If unchecked, these defects can hamper the performance and reduce the lifetime of such components. Many conventional image processing algorithms for detecting surface defects involve segmentation techniques based on thresholding, edge detection, watershed segmentation, and textural segmentation, and then employ other suitable algorithms based on morphology, region growing, shape analysis, or neural networks for classification. In this paper the work is focused only on detecting scratches. Global and other thresholding techniques were used to extract the defects but proved inaccurate at extracting the defects alone. This paper does not, however, compare different segmentation techniques; rather, it describes a novel approach combining segmentation with the Hausdorff dilation distance. The proposed algorithm is based on the distribution of intensity levels, that is, whether a certain gray level is concentrated or evenly distributed, and works by extracting such concentrated pixels. Defective images showed a high concentration of some gray levels, whereas in non-defective images the gray levels were evenly distributed, with no concentration. This formed the basis for detecting defects in the proposed algorithm. The Hausdorff dilation distance, based on mathematical morphology, was used to strengthen the segmentation of the defects.
Keywords: metallic surface, scratches, segmentation, hausdorff dilation distance, machine vision
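The set-comparison idea behind the Hausdorff dilation distance can be sketched as follows. This uses the equivalent point-set (max-min) formulation of the symmetric Hausdorff distance rather than explicit morphological dilations (dilating one set until it covers the other yields the same value), and the pixel masks are invented for the example.

```python
# Symmetric Hausdorff distance between two segmented pixel sets.

def directed_hausdorff(a, b):
    """Max over points of `a` of the distance to the nearest point of `b`."""
    return max(min(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
                   for bx, by in b)
               for ax, ay in a)

def hausdorff(a, b):
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

mask1 = {(0, 0), (0, 1), (0, 2)}           # thin scratch segment
mask2 = {(0, 0), (0, 1), (0, 2), (3, 2)}   # same segment plus an outlier
print(hausdorff(mask1, mask2))  # 3.0: driven entirely by the outlier pixel
```

A small distance between the mask from one segmentation pass and the next indicates the iterative segmentation has converged on the same scratch region, which is how such a measure can strengthen the segmentation.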
Procedia PDF Downloads 427
6275 Software Evolution Based Activity Diagrams
Authors: Zine-Eddine Bouras, Abdelouaheb Talai
Abstract:
During the last two decades, the software evolution community has intensively tackled the software merging issue, whose main objective is to merge different versions of software in a consistent way in order to obtain a new version. Well-established approaches, mainly based on dependence analysis techniques, have been used to provide suitable solutions. These approaches operate on source code or software architectures; however, at those levels the solutions are expensive due to complexity and size. In this paper, we overcome this problem by operating at a higher level of abstraction. The objective of this paper is to investigate software merging at the level of UML activity diagrams, which is a new and interesting issue: its purpose is to merge activity diagrams instead of source code. The proposed approach, based on dependence analysis techniques, is illustrated through an appropriate case study.
Keywords: activity diagram, activity diagram slicing, dependency analysis, software merging
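The dependence-analysis step can be sketched as a backward slice over a dependence graph extracted from an activity diagram: starting from a criterion activity, collect everything it transitively depends on. The graph below and its activity names are invented for the example; merging two diagram versions can then compare and combine such slices.

```python
from collections import deque

# Backward slicing over an activity dependence graph. Edges point from
# an activity to the activities it directly depends on.

def backward_slice(deps, criterion):
    """All activities the criterion node transitively depends on
    (including the criterion itself)."""
    seen = {criterion}
    queue = deque([criterion])
    while queue:
        node = queue.popleft()
        for dep in deps.get(node, ()):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

# Hypothetical order-processing activity diagram, as a dependence graph.
deps = {
    "ship_order": ["pack_items", "print_label"],
    "pack_items": ["pick_items"],
    "print_label": ["validate_address"],
    "pick_items": [],
    "validate_address": [],
}
print(sorted(backward_slice(deps, "ship_order")))
```

Slicing at the diagram level keeps the graphs small, which is precisely the cost advantage over slicing source code that the abstract claims.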
Procedia PDF Downloads 327
6274 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design
Authors: Qing K. Zhu
Abstract:
Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore's law. Designers must sift through numerous report files from timing and noise analysis tools during design iterations. This paper presents our work using data mining techniques, combined with HTML tables, to extract and represent critical timing and noise data. Running speed matters when applying this data-mining tool in real applications, so the software employs table look-up techniques to achieve reasonable running speed, as confirmed by performance testing results. We also added several advanced features for the application to one industrial chip design.
Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise
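The extract-and-tabulate step might look like the following sketch. The two-column report format, the function names, and the HTML layout are all invented for the example, since the paper does not give its actual formats; the table look-up here is simply a dictionary keyed by path name.

```python
# Scrape worst slack per timing path from plain-text report lines and
# keep the results in a dict for fast look-up, then render violations
# (negative slack) as HTML table rows.

def parse_timing_report(lines):
    """Map path name -> worst slack (ns) from lines like 'path  -0.12'."""
    table = {}
    for line in lines:
        parts = line.split()
        if len(parts) == 2:
            name, slack = parts
            # keep the most negative slack seen for each path
            table[name] = min(float(slack), table.get(name, float("inf")))
    return table

def violations_as_html(table):
    rows = [f"<tr><td>{name}</td><td>{slack:.2f}</td></tr>"
            for name, slack in sorted(table.items()) if slack < 0.0]
    return "<table>" + "".join(rows) + "</table>"

report = ["clk_to_q   0.35", "mem_rd_path  -0.12", "mem_rd_path  -0.08"]
print(violations_as_html(parse_timing_report(report)))
```

Across the thousands of paths in a real report, the dictionary look-up keeps each update O(1), which is the kind of table-look-up speed-up the abstract describes.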
Procedia PDF Downloads 254
6273 Convolutional Neural Networks versus Radiomic Analysis for Classification of Breast Mammogram
Authors: Mehwish Asghar
Abstract:
Breast Cancer (BC) is a common type of cancer among women. Screening for it is usually performed using imaging modalities such as magnetic resonance imaging, mammography, X-ray, CT, etc. Among these modalities, the mammogram is considered a powerful tool for the diagnosis and screening of breast cancer. Sophisticated machine learning approaches have shown promising results in complementing human diagnosis. Generally, these methods fall into two major classes: Radiomics analysis (RA), in which image features are extracted manually, and convolutional neural networks (CNN), in which the computer learns to recognize image features on its own. This research aims to improve the rate of early detection, and thus reduce the mortality caused by breast cancer, through the latest advancements in computer science in general and machine learning in particular. It also aims to ease the burden on doctors by improving and automating the process of breast cancer detection. The work comprises a comparative analysis of different techniques and models for detecting and classifying breast cancer, with the main goal of providing a detailed view of the results and performance of the different techniques. The purpose of this paper is to explore the potential of a convolutional neural network (CNN) both as a feature extractor and as a classifier, and to add a Radiomics module so that its results can be compared with those of the deep learning techniques.
Keywords: breast cancer (BC), machine learning (ML), convolutional neural network (CNN), radiomics, magnetic resonance imaging, artificial intelligence
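The hand-crafted side of the comparison can be illustrated with a few classic first-order radiomic features computed from a flattened grayscale patch. This generic sketch is not the paper's feature set, and the patch values are invented; the CNN branch would learn its features from the pixels instead.

```python
import math
from collections import Counter

# Three classic first-order radiomic features of an intensity patch:
# mean, variance, and gray-level (Shannon) entropy.

def first_order_features(pixels):
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    counts = Counter(pixels)
    entropy = -sum(c / n * math.log2(c / n) for c in counts.values())
    return {"mean": mean, "variance": var, "entropy": entropy}

patch = [0, 0, 0, 0, 128, 128, 255, 255]  # invented mammogram patch values
print(first_order_features(patch))
```

In a radiomics pipeline these feature vectors (usually with many more texture and shape descriptors) feed a conventional classifier, which is what gets compared against the end-to-end CNN.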
Procedia PDF Downloads 225
6272 Non-Invasive Techniques for Management of Carious Primary Dentition Using Silver Diamine Fluoride and Moringa Extract as a Modification of the Hall Technique
Authors: Rasha F. Sharaf
Abstract:
Treatment of dental caries in young children is a great challenge for all dentists, especially with uncooperative children. Recently, non-invasive techniques have been highlighted because they obviate the need for local anesthesia and other painful procedures during the management of carious teeth and, at the same time, increase the success rate of the treatment. Silver Diamine Fluoride (SDF) is one of the most effective cariostatic materials: it arrests the progression of carious lesions and aids in remineralizing the demineralized tooth structure. Both fluoride and silver ions have proven antibacterial action and aid in the precipitation of an insoluble layer that prevents further decay. Moringa extract has likewise proven effective against different types of bacteria, so it too can be used as a non-invasive technique for the management of caries in children. One important strategy for caries control is to deprive the cariogenic bacteria of nutrients, causing their starvation and death; this can be achieved by applying a stainless steel crown to primary molars with carious lesions that do not involve the pulp, a technique known as the Hall technique. Its success rate can be increased by first arresting the carious lesion with either SDF or Moringa, gaining the benefit of their antibacterial action. Multiple clinical cases with 1-year follow-up will be presented, comparing different treatment options and using various materials and techniques for the non-invasive, painless management of carious primary teeth.
Keywords: SDF, hall technique, carious primary teeth, moringa extract
Procedia PDF Downloads 96