Search results for: continuous speed profile data
25619 Discrimination of Artificial Intelligence
Authors: Iman Abu-Rub
Abstract:
This research paper examines whether Artificial Intelligence is, in fact, racist. Studies from around the world, covering different communities, were analyzed to further understand AI's true implications for those communities. The Black, Asian, and Muslim communities were analyzed and discussed in the paper to determine whether AI is biased or unbiased towards them. It was found that the biggest problem AI faces is bias in the distribution of the data it is built on: most of the data fed into AI systems describe white males, which significantly disadvantages other communities in terms of reliable cultural, political, or medical research. Nonetheless, various studies have been done that not only raise awareness of this issue but could also resolve it completely if applied correctly. Governments and big corporations are able to implement different strategies in their AI systems to avoid racist results, which would otherwise cause cultural hatred as well as unreliable data in fields such as medicine. Overall, Artificial Intelligence is not racist per se, but the way data are implemented and the current racist culture online manipulate AI into becoming racist.
Keywords: social media, artificial intelligence, racism, discrimination
Procedia PDF Downloads 116
25618 A Neural Network Modelling Approach for Predicting Permeability from Well Logs Data
Authors: Chico Horacio Jose Sambo
Abstract:
Recently, neural networks have gained popularity in solving complex nonlinear problems. Permeability is a fundamental reservoir characteristic that is anisotropically distributed and behaves in a nonlinear manner. For this reason, permeability prediction from well log data is well suited to neural networks and other computer-based techniques. The main goal of this paper is to predict reservoir permeability from well log data using a neural network approach. A multi-layer perceptron trained by the back-propagation algorithm was used to build the predictive model. The performance of the model was measured by the correlation coefficient, evaluated on the training, testing, validation and combined data sets. The results show that the neural network was capable of reproducing permeability accurately in all cases: the calculated correlation coefficients for training, testing and validation were 0.96273, 0.89991 and 0.87858, respectively. The results can be generalized to other fields after examining new data, and a regional study of reservoir properties might be possible with cheap and very quickly constructed models.
Keywords: neural network, permeability, multilayer perceptron, well log
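To make the modelling step concrete, the following is a minimal sketch of the workflow the abstract describes, using a generic scikit-learn multi-layer perceptron. The synthetic log curves, network size and 60/20/20 split are illustrative assumptions, not the paper's actual data or architecture.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))   # stand-ins for log curves (e.g. GR, RHOB, NPHI, DT)
    y = X @ np.array([0.5, -1.2, 0.8, 0.3]) + 0.1 * rng.normal(size=500)  # permeability proxy

    X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
    X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

    scaler = StandardScaler().fit(X_train)
    mlp = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
    mlp.fit(scaler.transform(X_train), y_train)   # back-propagation training

    # Correlation coefficient per data set, as reported in the abstract.
    for name, Xs, ys in [("training", X_train, y_train),
                         ("validation", X_val, y_val),
                         ("testing", X_test, y_test)]:
        r = np.corrcoef(mlp.predict(scaler.transform(Xs)), ys)[0, 1]
        print(f"{name} correlation coefficient: {r:.4f}")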
Procedia PDF Downloads 40325617 The Contribution of Density Fluctuations in Ultrasound Scattering in Cancellous Bone
Authors: A. Elsariti, T. Evans
Abstract:
An understanding of the interaction between acoustic waves and cancellous bone is needed in order to realize the full clinical potential of ultrasonic bone measurements. Scattering is likely to be of central importance but has received little attention to date. Few theoretical approaches have been described to explain the scattering of ultrasound by bone. In this study, a scattering model based on velocity and density fluctuations in a binary mixture (marrow fat and cortical matrix) was used to estimate the ultrasonic attenuation in cancellous bone as a function of volume fraction. Predicted attenuation and backscatter coefficients were obtained for a range of porosities and scatterer sizes. At 600 kHz and for different scatterer sizes, the attenuation predicted from combined velocity and density fluctuations was approximately 60% higher than that predicted from velocity fluctuations alone.
Keywords: ultrasound scattering, sound speed, density fluctuations, attenuation coefficient
Procedia PDF Downloads 326 [sic]
Authors: Salihah Alghamdi, Surajit Ray
Abstract:
Space-time data can be observed over irregularly shaped manifolds, which might have complex boundaries or interior gaps. Most existing methods do not consider the shape of the data, and as a result it is difficult to model irregularly shaped data while accommodating the complex domain. We used a method that can deal with space-time data distributed over non-planar regions. The method is based on partial differential equations and finite element analysis. The model can be estimated using a penalized least squares approach with a regularization term that controls over-fitting. The model is regularized using two roughness penalties, which treat the spatial and temporal regularities separately: the integrated square of the second derivative of the basis function is used as the temporal penalty, while the spatial penalty consists of the integrated square of the Laplace operator, integrated exclusively over the domain of interest, which is determined using the finite element technique. In this paper, we applied a spatio-temporal regression model with partial differential equation regularization (ST-PDE) to analyze remote sensing data measuring the greenness of vegetation, expressed by the enhanced vegetation index (EVI). The EVI data consist of measurements taking values between -1 and 1, reflecting the level of greenness of a region over a period of time. We applied the ST-PDE approach to an irregularly shaped region of the EVI data. The approach efficiently accommodates irregularly shaped regions by taking the complex boundaries into account rather than smoothing across them. Furthermore, the approach succeeds in capturing the temporal variation in the data.
Keywords: irregularly shaped domain, partial differential equations, finite element analysis, complex boundary
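As a minimal sketch of the penalized least-squares objective implied by the description above, written in generic LaTeX notation (the symbols are assumptions for illustration, not the paper's own): with observations z_i at spatial locations p_i and times t_i, an unknown field f over the irregular domain Ω and time window T, and smoothing parameters λ_S and λ_T,

    \hat{f} = \arg\min_{f} \sum_{i=1}^{n} \big( z_i - f(\mathbf{p}_i, t_i) \big)^2
              + \lambda_S \int_T \int_{\Omega} \big( \Delta f(\mathbf{p}, t) \big)^2 \, d\mathbf{p} \, dt
              + \lambda_T \int_{\Omega} \int_T \left( \frac{\partial^2 f(\mathbf{p}, t)}{\partial t^2} \right)^2 dt \, d\mathbf{p}

The λ_S term penalizes spatial roughness through the Laplacian, integrated only over the irregular domain, while the λ_T term penalizes temporal roughness through the second time derivative, matching the two penalties described in the abstract.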
Procedia PDF Downloads 140
25615 Utilising an Online Data Collection Platform for the Development of a Community Engagement Database: A Case Study on Building Inter-Institutional Partnerships at UWC
Authors: P. Daniels, T. Adonis, P. September-Brown, R. Comalie
Abstract:
The community engagement unit at the University of the Western Cape was tasked with establishing a community engagement database. The database would store information on all community engagement projects related to the university. The wealth of knowledge obtained from the various disciplines would be used to facilitate interdisciplinary collaboration within the university, as well as community-university partnership opportunities. The purpose of this qualitative study was to explore electronic data collection through the development of a database. Two types of electronic data collection platforms were used, namely an online questionnaire and email. The semi-structured questionnaire was used to collect data related to community engagement projects from different faculties and departments at the university. There are many benefits to using an electronic data collection platform, such as reduced costs and time, ease of reaching large numbers of potential respondents, and the possibility of providing anonymity to participants. Despite all these advantages, there were just as many challenges, as depicted in our findings. The findings suggest that certain barriers existed when using an electronic platform for data collection, even in an academic environment where knowledge and resources were in abundance. One of the challenges experienced in this process was the lack of dissemination of information via email to staff within faculties. The online software used for the questionnaire had its own limitations, such as the questionnaire only being accessible from the same electronic device. In a few cases, academics only completed the questionnaire after a telephonic prompt or a face-to-face meeting, which raises the question: is higher education in South Africa ready to embrace electronic platforms for data collection?
Keywords: community engagement, database, data collection, electronic platform, electronic tools, knowledge sharing, university
Procedia PDF Downloads 264
25614 Women Entrepreneurial Resiliency Amidst COVID-19
Authors: Divya Juneja, Sukhjeet Kaur Matharu
Abstract:
Purpose: The paper is aimed at identifying the challenging factors experienced by women entrepreneurs in India in operating their enterprises amidst the challenges posed by the COVID-19 pandemic. Methodology: The sample for the study comprised 396 women entrepreneurs from different regions of India. A purposive sampling technique was adopted for data collection. Data were collected through a self-administered questionnaire, and analysis was performed using the SPSS package for quantitative data analysis. Findings: The results of the study indicate that entrepreneurial characteristics, resourcefulness, networking, adaptability, and continuity have a positive influence on the resiliency of women entrepreneurs when faced with a crisis situation. Practical Implications: The findings of the study have important implications for women entrepreneurs, organizations, governments, and other institutions extending support to entrepreneurs.
Keywords: women entrepreneurs, analysis, data analysis, positive influence, resiliency
Procedia PDF Downloads 114
25613 Partial Least Squares Regression for High-Dimensional and Highly Correlated Data
Authors: Mohammed Abdullah Alshahrani
Abstract:
The research focuses on investigating the use of partial least squares (PLS) methodology for addressing challenges associated with high-dimensional correlated data. Recent technological advancements have led to experiments producing data characterized by a large number of variables compared to observations, with substantial inter-variable correlations. Such data patterns are common in chemometrics, where near-infrared (NIR) spectrometer calibrations record chemical absorbance levels across hundreds of wavelengths, and in genomics, where copy number alterations (CNA) in thousands of genomic regions are recorded from cancer patients. PLS serves as a widely used method for analyzing high-dimensional data, functioning as a regression tool in chemometrics and a classification method in genomics. It handles data complexity by creating latent variables (components) from the original variables. However, applying PLS can present challenges. The study investigates key areas to address these challenges, including unifying interpretations across the three main PLS algorithms and exploring unusual negative shrinkage factors encountered during model fitting. The research presents an alternative approach to the challenge of interpreting the predictor weights associated with PLS. Sparse estimation of the predictor weights is employed using a penalty function combining a lasso penalty for sparsity and a Cauchy distribution-based penalty to account for variable dependencies. The results demonstrate sparse and grouped weight estimates, aiding interpretation and prediction tasks in genomic data analysis. High-dimensional data scenarios, where predictors outnumber observations, are common in regression analysis applications. Ordinary least squares (OLS) regression, the standard method, performs inadequately with high-dimensional and highly correlated data. Copy number alterations in key genes have been linked to disease phenotypes, highlighting the importance of accurate classification of gene expression data in bioinformatics and biology using regularized methods like PLS for regression and classification.
Keywords: partial least squares regression, genetics data, negative filter factors, high-dimensional data, highly correlated data
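As a minimal illustration of the latent-component idea described above, the sketch below fits a PLS regression in a p >> n setting with strongly correlated predictors, loosely analogous to the NIR and CNA examples; the simulated data and component count are assumptions, not the paper's.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    n, p = 50, 500                        # far more correlated predictors than observations
    latent = rng.normal(size=(n, 3))      # a few underlying factors drive everything
    X = latent @ rng.normal(size=(3, p)) + 0.05 * rng.normal(size=(n, p))
    y = latent @ np.array([1.0, -0.5, 0.25]) + 0.1 * rng.normal(size=n)

    pls = PLSRegression(n_components=3)   # latent components built from the original variables
    pls.fit(X, y)
    print("training R^2:", pls.score(X, y))
    print("predictor weights:", pls.x_weights_.shape)  # the weights a sparsity penalty would act on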
Procedia PDF Downloads 49
25612 The Use of Voice in Online Public Access Catalog as Faster Searching Device
Authors: Maisyatus Suadaa Irfana, Nove Eka Variant Anna, Dyah Puspitasari Sri Rahayu
Abstract:
Technological developments provide convenience to everyone. Nowadays, human-computer communication is typically done via text; with the development of technology, it can also be conducted by voice, much like communication between human beings. This provides an easy facility for many people, especially those who have special needs. Voice search technology is applied to searching the book collections in the OPAC (Online Public Access Catalog), so that library visitors can find the books they need faster and more easily. Integration with Google is needed to convert the voice into text. To optimize search time and results, the server downloads all the book data available in the server database, and the data are then converted into JSON format. In addition, several algorithms are incorporated, including decomposition (parsing) of the JSON into arrays, index building, and analysis of the results. This aims to make the search process much faster than the usual OPAC search, because the data need not be fetched from the database for every search request. A data update menu is provided to enable users to perform their own data updates and obtain the latest information.
Keywords: OPAC, voice, searching, faster
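A minimal sketch of the parse-and-index step described above follows; the JSON field names are illustrative assumptions, and the Google speech-to-text step is omitted, with the transcribed query passed in as a plain string.

    import json
    from collections import defaultdict

    books_json = '''[
      {"id": 1, "title": "Introduction to Information Retrieval"},
      {"id": 2, "title": "Modern Information Systems"}
    ]'''

    index = defaultdict(set)
    for book in json.loads(books_json):          # decomposition (parse) of the JSON array
        for token in book["title"].lower().split():
            index[token].add(book["id"])         # index building

    def search(transcribed_query):
        """Return ids of books matching every token of the transcribed query."""
        tokens = transcribed_query.lower().split()
        hits = [index.get(t, set()) for t in tokens]
        return set.intersection(*hits) if hits else set()

    print(search("information"))          # {1, 2}
    print(search("modern information"))   # {2}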
Procedia PDF Downloads 344
25611 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models
Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu
Abstract:
A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping and disaster management. To express the Earth's surface as a mathematical model, an infinite number of point measurements would be needed. Since this is impossible, points at regular intervals are measured to characterize the surface, and a DTM of the Earth is generated. Hitherto, classical measurement techniques and photogrammetry have been in widespread use for the construction of DTMs. At present, RADAR, LiDAR and stereo satellite images are also used. In recent years, especially because of its advantages, airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications; a 3D point cloud is created with LiDAR technology by acquiring numerous point data. More recently, developments in image mapping methods and the use of unmanned aerial vehicles (UAVs) for photogrammetric data acquisition have increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors, such as the data collection method, the distribution of elevation points, the point density, the properties of the surface and the interpolation method. In this study, the random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets using a random algorithm, representing 75, 50, 25 and 5% of the original data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original point cloud is compared with DTMs interpolated from the reduced data sets by the kriging interpolation method. The results show that the random data reduction method can be used to reduce image-based point cloud data sets to the 50% density level while still maintaining DTM quality.
Keywords: DTM, Unmanned Aerial Vehicle (UAV), uniform, random, kriging
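A minimal sketch of the random reduction step follows; the synthetic point cloud stands in for the UAV-derived data, while the fractions match those used in the study.

    import numpy as np

    rng = np.random.default_rng(42)
    points = rng.uniform(0, 1000, size=(100_000, 3))   # x, y, z in metres (synthetic)

    def reduce_random(cloud, fraction, rng):
        """Keep a random `fraction` of the points, sampled without replacement."""
        k = int(len(cloud) * fraction)
        idx = rng.choice(len(cloud), size=k, replace=False)
        return cloud[idx]

    subsets = {f: reduce_random(points, f, rng) for f in (0.75, 0.50, 0.25, 0.05)}
    for f, cloud in subsets.items():
        print(f"{f:.0%} subset: {len(cloud)} points")
    # Each subset would then be interpolated (e.g. by kriging) and the resulting
    # DTM compared against the one built from the full cloud.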
Procedia PDF Downloads 155
25610 Fully Printed Strain Gauges: A Comparison of Aerosoljet-Printing and Micropipette-Dispensing
Authors: Benjamin Panreck, Manfred Hild
Abstract:
Strain sensors based on a change in resistance are well established for the measurement of forces, stresses, or material fatigue. Within the scope of this paper, fully additively manufactured strain sensors were produced using an ink of silver nanoparticles, and their behavior was evaluated by periodic tensile tests. Printed strain sensors exhibit two advantages: their measuring grid is adaptable to the use case, and they do not need a carrier foil, as the measuring structure can be printed directly onto a thin sprayed varnish layer on the aluminum specimen. In order to compare quality characteristics, the sensors were manufactured using two different technologies, namely aerosoljet-printing and micropipette-dispensing. Both processes produce structures with continuous features (in contrast to what can be achieved with droplets during inkjet printing). Briefly summarized, the results show that aerosoljet-printing is the preferable technology for specimens with non-planar surfaces, whereas both technologies are suitable for flat specimens.
Keywords: aerosoljet-printing, micropipette-dispensing, printed electronics, printed sensors, strain gauge
Procedia PDF Downloads 203
25609 Theoretical Density Study of Winding Yarns on Spool
Authors: Bachir Chemani, Rachid Halfaoui
Abstract:
The aim of this work is to define the distribution density of yarn wound on cylindrical and conical bobbins. It is known that parallel winding gives greater density and a more regular distribution, but unwinding the yarn is much more difficult for the subsequent process. The conical spool has an enormous advantage during unwinding and may contain a large amount of yarn, but the density distribution is not regular because of the difference in diameters. The variation of specific density over the reel height is generally explained by the sudden change in winding speed due to the change in the direction of yarn movement. We determined the conditions for uniform winding and developed a computational model of the change in specific density of the wound yarn over the entire spool height.
Keywords: textile, cylindrical bobbins, conical bobbins, parallel winding, cross winding
Procedia PDF Downloads 377
25608 Leaching of Flotation Concentrate of Oxide Copper Ore from Sepon Mine, Lao PDR
Authors: C. Rattanakawin, S. Vasailor
Abstract:
Acid leaching of a flotation concentrate of oxide copper ore consisting mainly of malachite was performed in a standard agitation tank with various parameters. The effects of solid-to-liquid ratio, sulfuric acid concentration, agitation speed, leaching temperature and time were examined to determine the proper conditions. The best conditions are a 1:8 solid-to-liquid ratio, 10% acid concentration by weight, 250 rev/min, 30 °C and a 5-min leaching time, respectively. About 20% Cu grade, assayed by the atomic absorption technique, with 98% copper recovery was obtained under these combined optimum conditions. The dissolution kinetics of the concentrate were approximated by a logarithmic function; as a result, a first-order reaction rate is suggested by this leaching study.
Keywords: agitation leaching, dissolution kinetics, flotation concentrate, oxide copper ore, sulfuric acid
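A short sketch of the first-order kinetics suggested above, in the standard rate-law form (the symbols are generic, not the paper's): with x the fraction of copper dissolved at time t and k the rate constant,

    \frac{dx}{dt} = k\,(1 - x) \quad \Longrightarrow \quad -\ln(1 - x) = k\,t

so a plot of -ln(1 - x) against t is a straight line of slope k, which is consistent with the logarithmic approximation of the dissolution kinetics reported in the abstract.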
Procedia PDF Downloads 119
25607 Heterogeneity, Asymmetry and Extreme Risk Perception; Dynamic Evolution Detection From Implied Risk Neutral Density
Authors: Abderrahmen Aloulou, Younes Boujelbene
Abstract:
The current paper presents a new method of extracting information content from option prices by eliminating biases caused by the daily variation of contract maturity. Based on kernel regression, this non-parametric technique serves to obtain a spectrum of interpolated options with constant maturity horizons from contracts negotiated on the S&P TSX 60 index. This method makes it plausible to compare daily risk-neutral densities, from which extracting time-continuous indicators allows the detection of the evolution of traders' attitudes, such as belief homogeneity, asymmetry and extreme risk perception. Our findings indicate that the applied method contributes to developing effective trading strategies and to adjusting monetary policies by tracking traders' reactions to economic and monetary news.
Keywords: risk neutral densities, kernel, constant maturity horizons, homogeneity, asymmetry and extreme risk perception
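A minimal sketch of the kernel-interpolation idea follows, using Nadaraya-Watson regression to evaluate an option-implied quantity at a fixed maturity horizon; the data, the choice of implied volatility as the smoothed quantity, and the bandwidth are illustrative assumptions, not the paper's exact procedure.

    import numpy as np

    def nadaraya_watson(x_query, x_obs, y_obs, bandwidth):
        """Gaussian-kernel weighted average of y_obs evaluated at x_query."""
        w = np.exp(-0.5 * ((x_query - x_obs) / bandwidth) ** 2)
        return np.sum(w * y_obs) / np.sum(w)

    # Maturities (years) and implied volatilities of one day's traded contracts.
    maturities = np.array([0.04, 0.12, 0.21, 0.29, 0.46, 0.71, 0.96])
    implied_vol = np.array([0.22, 0.20, 0.19, 0.19, 0.18, 0.18, 0.17])

    # Interpolating to a constant 3-month horizon, day after day, removes the
    # maturity drift so that successive risk-neutral densities are comparable.
    print(nadaraya_watson(0.25, maturities, implied_vol, bandwidth=0.1))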
Procedia PDF Downloads 486
25606 Private and Public Health Sector Difference on Client Satisfaction: Results from Secondary Data Analysis in Sindh, Pakistan
Authors: Wajiha Javed, Arsalan Jabbar, Nelofer Mehboob, Muhammad Tafseer, Zahid Memon
Abstract:
Introduction: Researchers globally have striven to explore the diverse factors that augment the continuation and uptake of family planning methods. Client satisfaction is one of the core determinants facilitating continuation of family planning methods. There is major debate yet scant evidence contrasting the public and private sectors with respect to client satisfaction. The objective of this study is to compare the quality of care provided by the public and private sectors of Pakistan through a client-satisfaction lens. Methods: We used the Pakistan Demographic and Health Survey 2012-13 dataset (Sindh province) on a total of 3133 married women of reproductive age (MWRA) aged 15-49 years. The source of family planning (public/private sector) was the main exposure variable. The outcome variable was client satisfaction, judged by ten different dimensions. Means and standard deviations were calculated for continuous variables, while frequencies and percentages were computed for categorical variables. For univariate analysis, the Chi-square/Fisher exact test was used to find an association between client satisfaction and the public and private sectors. Ten different multivariate models were built. Variables were checked for multicollinearity, confounding and interaction, and then advanced logistic regression was used to explore the relationship between client satisfaction and the outcome after adjusting for all known confounding factors; results are presented as OR and AOR (95% CI). Results: Multivariate analyses showed that clients were less satisfied with contraceptive provision from the private sector compared to the public sector (AOR 0.92, 95% CI 0.63-1.68), although the result was not statistically significant. Clients were more satisfied with the private sector than the public sector with respect to the other determinants of quality of care: follow-up care (AOR 3.29, 95% CI 1.95-5.55), infection prevention (AOR 2.41, 95% CI 1.60-3.62), counseling services (AOR 2.01, 95% CI 1.27-3.18), timely treatment (AOR 3.37, 95% CI 2.20-5.15), attitude of staff (AOR 2.23, 95% CI 1.50-3.33), punctuality of staff (AOR 2.28, 95% CI 1.92-4.13), timely referring (AOR 2.34, 95% CI 1.63-3.35), staff cooperation (AOR 1.75, 95% CI 1.22-2.51) and complications handling (AOR 2.27, 95% CI 1.56-3.29).
Keywords: client satisfaction, family planning, public private partnership, quality of care
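A minimal sketch of the adjusted-odds-ratio computation described in the methods follows, using statsmodels on simulated data; the variable names and effect sizes are illustrative assumptions, not the survey's.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 1000
    df = pd.DataFrame({
        "private_sector": rng.integers(0, 2, n),   # 1 = private, 0 = public
        "age": rng.integers(15, 50, n),
        "urban": rng.integers(0, 2, n),
    })
    true_logit = -0.5 + 0.8 * df["private_sector"] + 0.02 * df["age"] + 0.3 * df["urban"]
    df["satisfied"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

    X = sm.add_constant(df[["private_sector", "age", "urban"]])
    fit = sm.Logit(df["satisfied"], X).fit(disp=0)

    aor = np.exp(fit.params)        # adjusted odds ratios
    ci = np.exp(fit.conf_int())     # 95% CI on the odds-ratio scale
    print(pd.concat([aor.rename("AOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))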
Procedia PDF Downloads 419
25605 Nonlocal Phenomena in Quantum Mechanics
Authors: Kazim G. Atman, Hüseyin Sirin
Abstract:
In theoretical physics, nonlocal phenomena have always been a subject of debate. However, in the conventional mathematical approach, where the evolution of physical systems is investigated using standard mathematical tools, nonlocal effects are not taken into account. In order to investigate nonlocality in quantum mechanics and the fractal property of space, fractional derivative operators are employed in this study. In this manner, fractional creation and annihilation operators are introduced, and the Einstein coefficients are considered as an application of the concomitant formalism in quantum field theory: each energy mode of the photon field is treated as a fractional quantized harmonic oscillator, whereby the Einstein coefficients are obtained. The wave functions and energy eigenvalues of the fractional quantum mechanical harmonic oscillator are obtained via the fractional derivative order α, which is a measure of the influence of nonlocal effects. In the case α = 1, where space becomes homogeneous and continuous, the standard physical conclusions are recovered.
Keywords: Einstein's coefficients, fractional calculus, fractional quantum mechanics, nonlocal theories
Procedia PDF Downloads 170
25604 Exploring Influence Range of Tainan City Using Electronic Toll Collection Big Data
Authors: Chen Chou, Feng-Tyan Lin
Abstract:
Big data has attracted a lot of attention in many fields for analyzing research issues based on large volumes of source data. Electronic Toll Collection (ETC) is one of the Intelligent Transportation System (ITS) applications in Taiwan, used to record the starting point, end point, distance and travel time of vehicles on the national freeway. This study, taking advantage of ETC big data combined with urban planning theory, attempts to explore various phenomena of inter-city transportation activities. ETC data, part of the government's open data, are voluminous, complete and quickly updated. One may recall that the living area has traditionally been delimited by location, population, area and subjective consciousness. However, these factors cannot appropriately reflect people's movement paths in daily life. In this study, the concept of "Living Area" is replaced by "Influence Range" to show its dynamics and variation with the time and purpose of activities. This study uses data mining with Python and Excel, and visualizes the number of trips with GIS, to explore the influence range of Tainan City and the purposes of trips, and to discuss how living areas are currently delimited. It creates a dialogue between the concepts of "Central Place Theory" and "Living Area", presents a new point of view, and integrates the application of big data, urban planning and transportation. The findings will be valuable for resource allocation and land apportionment in spatial planning.
Keywords: Big Data, ITS, influence range, living area, central place theory, visualization
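A minimal sketch of the kind of trip aggregation this study performs in Python follows; the column names and city-level origin-destination coding are assumptions for illustration, as the real ETC open-data schema records gantry-to-gantry passages.

    import pandas as pd

    trips = pd.DataFrame({
        "origin":      ["Tainan", "Tainan", "Kaohsiung", "Taichung", "Tainan"],
        "destination": ["Kaohsiung", "Taichung", "Tainan", "Tainan", "Kaohsiung"],
        "distance_km": [45.0, 130.2, 44.8, 131.0, 45.3],
        "travel_min":  [32, 85, 33, 88, 31],
    })

    # Count trips to/from Tainan per origin-destination pair -- the raw material
    # for mapping the city's influence range in GIS.
    tainan = trips[(trips["origin"] == "Tainan") | (trips["destination"] == "Tainan")]
    summary = (tainan.groupby(["origin", "destination"])
                     .agg(n_trips=("distance_km", "size"),
                          mean_travel_min=("travel_min", "mean"))
                     .reset_index())
    print(summary)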
Procedia PDF Downloads 279
25603 Performance Analysis of Hierarchical Agglomerative Clustering in a Wireless Sensor Network Using Quantitative Data
Authors: Tapan Jain, Davender Singh Saini
Abstract:
Clustering is a useful mechanism in wireless sensor networks which helps to cope with scalability and data transmission problems. The basic aim of our research work is to provide efficient clustering using hierarchical agglomerative clustering (HAC). If the distance between the sensing nodes is calculated from their locations, the clustering is quantitative HAC. This paper compares the various agglomerative clustering techniques applied in a wireless sensor network using quantitative data. The simulations are done in MATLAB, and the comparisons between the different protocols are made using dendrograms.
Keywords: routing, hierarchical clustering, agglomerative, quantitative, wireless sensor network
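A minimal sketch of quantitative HAC on node coordinates follows. The study works in MATLAB; this is an equivalent Python/SciPy illustration with random node locations standing in for a real deployment.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster, dendrogram

    rng = np.random.default_rng(3)
    nodes = rng.uniform(0, 100, size=(30, 2))    # (x, y) positions of 30 sensor nodes

    # Compare several agglomerative linkage criteria on the node locations.
    for method in ("single", "complete", "average", "ward"):
        Z = linkage(nodes, method=method)        # distances computed from node locations
        labels = fcluster(Z, t=5, criterion="maxclust")   # cut the tree into 5 clusters
        print(method, "cluster sizes:", np.bincount(labels)[1:])

    # dendrogram(Z) plots the hierarchy, analogous to MATLAB's dendrogram().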
Procedia PDF Downloads 615
25602 A Novel Hybrid Deep Learning Architecture for Predicting Acute Kidney Injury Using Patient Record Data and Ultrasound Kidney Images
Authors: Sophia Shi
Abstract:
Acute kidney injury (AKI) is the sudden onset of kidney damage in which the kidneys cannot filter waste from the blood, requiring emergency hospitalization. The AKI mortality rate is high in the ICU, and the condition is virtually impossible for doctors to predict because it is so unexpected. Currently, there is no hybrid model predicting AKI that takes advantage of two types of data. De-identified patient data from the MIMIC-III database and de-identified kidney images with corresponding patient records from the Beijing Hospital of the Ministry of Health were collected. Using data features including serum creatinine among others, two numeric models were built from the MIMIC and Beijing Hospital data, and an image-only model was built from the hospital ultrasounds. Convolutional neural networks (CNNs) were used: VGG and ResNet for the numeric data and ResNet for the image data. They were combined into a hybrid model by concatenating the feature maps of both types of models to create a new input, which enters another CNN block and then two fully connected layers, ending in a binary output after a softmax layer. The hybrid model successfully predicted AKI; its highest AUROC was 0.953, with an accuracy of 90% and an F1-score of 0.91. This model can be implemented in urgent clinical settings such as the ICU and aid doctors by assessing the risk of AKI shortly after the patient's admission, so that doctors can take preventative measures and diminish mortality risks and severe kidney damage.
Keywords: acute kidney injury, convolutional neural network, hybrid deep learning, patient record data, ResNet, ultrasound kidney images, VGG
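A minimal sketch of the feature-level fusion described above follows, in PyTorch; the stand-in branches, feature sizes and layer widths are illustrative assumptions, not the paper's exact VGG/ResNet configuration.

    import torch
    import torch.nn as nn

    class HybridAKINet(nn.Module):
        def __init__(self, num_record_features=32, image_feature_dim=64):
            super().__init__()
            # Stand-in branches for the paper's VGG/ResNet backbones.
            self.record_branch = nn.Sequential(
                nn.Linear(num_record_features, 64), nn.ReLU(), nn.Linear(64, 64))
            self.image_branch = nn.Sequential(
                nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
                nn.Flatten(), nn.Linear(8 * 4 * 4, image_feature_dim))
            # Fusion head: concatenated features -> FC layers -> 2-class output.
            self.head = nn.Sequential(
                nn.Linear(64 + image_feature_dim, 32), nn.ReLU(), nn.Linear(32, 2))

        def forward(self, record_x, image_x):
            fused = torch.cat([self.record_branch(record_x),
                               self.image_branch(image_x)], dim=1)  # feature-level concat
            return self.head(fused)  # softmax is applied inside CrossEntropyLoss

    model = HybridAKINet()
    logits = model(torch.randn(4, 32), torch.randn(4, 1, 64, 64))
    print(logits.shape)  # torch.Size([4, 2])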
Procedia PDF Downloads 131
25601 X-Ray Detector Technology Optimization in Computed Tomography
Authors: Aziz Ikhlef
Abstract:
Most multi-slice computed tomography (CT) scanners are built with detectors composed of scintillator-photodiode arrays. The photodiode arrays are mainly based on front-illuminated technology for detectors under 64 slices and on back-illuminated photodiodes for systems of 64 slices or more. Designs based on back-illuminated photodiodes were investigated for CT machines to overcome the challenge of the higher number of runs and connections required by front-illuminated diodes. In back-lit diodes, the electronic noise is improved because of the reduction in load capacitance that comes with reduced routing. This translates into better image quality in low-signal applications, improving low-dose imaging in large patient populations. With the fast development of multi-detector-row CT (MDCT) scanners and the increasing number of examinations, both the medical and regulatory communities have raised significant concerns about the radiation dose received by the patient. In order to reduce individual exposure, and in response to the recommendations of the International Commission on Radiological Protection (ICRP), which suggests that all exposures should be kept as low as reasonably achievable (ALARA), every manufacturer is trying to implement strategies and solutions to optimize dose efficiency and image quality based on x-ray emission and scanning parameters. Added demands on CT detector performance also come from the increased utilization of spectral or dual-energy CT, in which projection data at two different tube potentials are collected. One of the approaches utilizes a technology called fast-kVp switching, in which the tube voltage is switched between 80 kVp and 140 kVp in a fraction of a millisecond. To reduce the cross-contamination of signals, the temporal response of the scintillator-based detector has to be extremely fast to minimize the residual signal from previous samples. In addition, this paper will present an overview of detector technologies and image chain improvements which have been investigated in the last few years to improve the signal-to-noise ratio and dose efficiency of CT scanners in regular examinations and in energy discrimination techniques. Several parameters of the image chain in general, and of the detector technology in particular, contribute to the optimization of the final image quality. We will go through the properties of the post-patient collimation to improve the scatter-to-primary ratio, the scintillator material properties such as light output, afterglow, primary speed and crosstalk to improve spectral imaging, the photodiode design characteristics, and the data acquisition system (DAS) to optimize for crosstalk, noise and temporal/spatial resolution.
Keywords: computed tomography, X-ray detector, medical imaging, image quality, artifacts
Procedia PDF Downloads 194
25600 Effect of the Initial Billet Shape Parameters on the Final Product in a Backward Extrusion Process for Pressure Vessels
Authors: Archana Thangavelu, Han-Ik Park, Young-Chul Park, Joon-Hong Park
Abstract:
In this numerical study, we propose a method for evaluating the backward extrusion process of a steel pressure vessel. Demand for lighter and stiffer products has been increasing in recent years, especially in automotive engineering. Through detailed finite element analysis, effective stress, strain and velocity profiles have been obtained within optimal ranges. The process design of a forward- and backward-extruded axisymmetric part has been studied. Forging is mainly carried out because forged products are highly reliable and possess superior mechanical properties compared to normal products. Computational simulations of 3D hot forging with various billet dimensions and optimization of weight are carried out using the Taguchi orthogonal array (OA) optimization technique. The technique used in this study can be applied to newly developed materials to investigate their forgeability for more complicated shapes in the closed hot-die forging process.
Keywords: backward extrusion, hot forging, optimization, finite element analysis, Taguchi method
Procedia PDF Downloads 309
25599 Qualitative Data Analysis for Health Care Services
Authors: Taner Ersoz, Filiz Ersoz
Abstract:
This study was designed to enable the application of multivariate techniques in the interpretation of categorical data for measuring health care service satisfaction in Turkey. The data were collected from a total of 17726 respondents. The establishment of the sample group and the collection of the data were carried out by a joint team from the Ministry of Health and the Turkish Statistical Institute (TurkStat) of Turkey. Multiple correspondence analysis (MCA) was applied to the data of the 2882 respondents who answered the questionnaire in full. The multiple correspondence analysis indicated that, in the evaluation of health services, females, public employees, and younger and more highly educated individuals were more concerned and more likely to complain than males, private-sector employees, and older and less educated individuals. Overall, 53% of the respondents were pleased with the improvements in health care services in the past three years. This study demonstrates the public consciousness of health services and health care satisfaction in Turkey. It was found that most of the respondents were pleased with the improvements in health care services over the past three years. Awareness of health service quality increases with education level. Older individuals and males would appear to have lower expectations of health services.
Keywords: multiple correspondence analysis, multivariate categorical data, health care services, health satisfaction survey
Procedia PDF Downloads 242
25598 Modified RSA in Mobile Communication
Authors: Nagaratna Rajur, J. D. Mallapur, Y. B. Kirankumar
Abstract:
Security in mobile communication is very different from that of the internet or telecommunications because of the poor user interface and limited processing capacity of mobile devices, as well as the combination of complex network protocols. Hence, it poses a challenge for security systems that must run with low memory usage and low computational demands. Security involves all the activities undertaken to protect the value and ongoing usability of assets and the integrity and continuity of operations. An effective network security strategy requires identifying threats and then choosing the most effective set of tools to combat them. Cryptography is a simple and efficient way to provide security in communication. RSA is an asymmetric-key approach that is highly reliable and widely used in internet communication. However, it has not been efficiently implemented in mobile communication due to its computational complexity and large memory utilization. The proposed algorithm modifies the current RSA to be useful in mobile communication by reducing its computational complexity and memory utilization.
Keywords: M-RSA, sensor networks, sensor applications, security
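For context, the following is a minimal sketch of the textbook RSA cycle that such a modification starts from; the paper's actual modification is not reproduced here, and the toy-sized primes are for illustration only (real deployments need large primes, padding and constant-time arithmetic).

    from math import gcd

    p, q = 61, 53                  # toy primes; real keys use primes of 1024+ bits
    n = p * q                      # public modulus
    phi = (p - 1) * (q - 1)        # Euler's totient of n

    e = 17                         # public exponent, must be coprime with phi
    assert gcd(e, phi) == 1
    d = pow(e, -1, phi)            # private exponent: modular inverse of e mod phi

    def encrypt(m): return pow(m, e, n)
    def decrypt(c): return pow(c, d, n)

    message = 42
    cipher = encrypt(message)
    print(cipher, decrypt(cipher))  # decrypt(encrypt(m)) == m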
Procedia PDF Downloads 342
25597 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions
Authors: Valerii Dashuk
Abstract:
The use of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of a random variable and its deviation with the help of this technique allows the shift and the probability of that shift (i.e., portfolio risks) to be checked simultaneously. Another application is based on the normal distribution, which is fully defined by its mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). The absolute difference in probabilities at each 'point' of the domain of these distributions is then calculated. This measure is transformed into a function of cumulative distribution functions and compared to the critical values. The critical values table was designed from simulations. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strengths of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At the moment, the expansion to the two-dimensional case is complete, allowing up to 5 parameters to be tested jointly. The derived technique is therefore equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on large amounts of data.
Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function
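A minimal sketch of the density-difference measure described above follows; the grid, parameters and the simple integrated statistic are illustrative assumptions, and the paper's transformation to cumulative distribution functions and its simulated critical values are not reproduced.

    import numpy as np
    from scipy import stats
    from scipy.integrate import trapezoid

    mu1, sd1 = 0.0, 1.0            # first (mean, std) parameter set
    mu2, sd2 = 0.3, 1.2            # second set to compare against

    x = np.linspace(-8, 8, 4001)   # common evaluation grid over the domain
    gap = np.abs(stats.norm.pdf(x, mu1, sd1) - stats.norm.pdf(x, mu2, sd2))

    # Integrated absolute density gap: 0 when the parameter sets coincide,
    # growing toward 1 as they separate (a total-variation-style statistic).
    statistic = 0.5 * trapezoid(gap, x)
    print(f"statistic = {statistic:.4f}")  # would be compared to simulated critical values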
Procedia PDF Downloads 175
25596 Chatter Suppression in Boring Process Using Passive Damper
Authors: V. Prasannavenkadesan, A. Elango, S. Chockalingam
Abstract:
During the machining process, chatter is an unavoidable phenomenon. Boring bars have a cantilever shape and are therefore subject to chatter. The adverse effects of chatter include an increase in temperature, which leads to excess tool wear. To overcome these problems, in this investigation cartridge brass (70% Cu and 30% Zn) is passively fixed on the boring bar, and clearance is provided in order to reduce displacement, tool wear and cutting temperature. A conventional all-geared lathe is fitted with a vibrometer, and a pyrometer is used to measure displacement and temperature. The influence of input parameters such as cutting speed, depth of cut and clearance on temperature, tool wear and displacement is investigated for various cutting conditions. From the results, the optimum conditions for better damping and chatter reduction in the boring process are identified.
Keywords: boring, chatter, mass damping, passive damping
Procedia PDF Downloads 350
25595 Development of a Numerical Model to Predict Wear in Grouted Connections for Offshore Wind Turbine Generators
Authors: Paul Dallyn, Ashraf El-Hamalawi, Alessandro Palmeri, Bob Knight
Abstract:
In order to better understand the long-term implications of the grout wear failure mode in large-diameter plain-sided grouted connections, a numerical model has been developed and calibrated that can take advantage of existing operational plant data to predict the wear accumulation for the actual load conditions experienced over a given period, thus limiting the need for expensive monitoring systems. This model has been derived and calibrated based on site structural condition monitoring (SCM) data and supervisory control and data acquisition (SCADA) data for two operational wind turbine generator substructures afflicted with this challenge, along with experimentally derived wear rates.
Keywords: grouted connection, numerical model, offshore structure, wear, wind energy
Procedia PDF Downloads 453
25594 Multimodal Deep Learning for Human Activity Recognition
Authors: Ons Slimene, Aroua Taamallah, Maha Khemaja
Abstract:
In recent years, human activity recognition (HAR) has been a key area of research due to its diverse applications, and it has garnered increasing attention in the field of computer vision. HAR plays an important role in people's daily lives, as it can learn advanced knowledge about human activities from data. In HAR, activities are usually represented by exploiting different types of sensors, such as embedded sensors or visual sensors. However, these sensors have limitations, such as local obstacles, image-related obstacles, sensor unreliability, and consumer concerns. Recently, several deep learning-based approaches have been proposed for HAR; these approaches are classified into two categories based on the type of data used: vision-based approaches and sensor-based approaches. This research paper highlights the importance of multimodal data fusion, combining skeleton data obtained from videos with data generated by embedded sensors, using deep neural networks to achieve HAR. We propose a deep multimodal fusion network based on a two-stream architecture. Each stream uses a convolutional neural network combined with a bidirectional LSTM (CNN-BiLSTM) to process the skeleton data and the data generated by embedded sensors, and fusion at the feature level is considered. The proposed model was evaluated on the public OPPORTUNITY++ dataset and achieved an accuracy of 96.77%.
Keywords: human activity recognition, action recognition, sensors, vision, human-centric sensing, deep learning, context-awareness
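A minimal sketch of the two-stream CNN-BiLSTM fusion follows, in PyTorch; sequence lengths, channel counts and layer widths are illustrative assumptions, not the paper's exact configuration.

    import torch
    import torch.nn as nn

    class CNNBiLSTMStream(nn.Module):
        """1D CNN over time followed by a bidirectional LSTM; returns the last hidden state."""
        def __init__(self, in_channels, hidden=32):
            super().__init__()
            self.cnn = nn.Sequential(nn.Conv1d(in_channels, 16, 5, padding=2), nn.ReLU())
            self.lstm = nn.LSTM(16, hidden, batch_first=True, bidirectional=True)

        def forward(self, x):            # x: (batch, time, channels)
            z = self.cnn(x.transpose(1, 2)).transpose(1, 2)   # conv expects channels first
            out, _ = self.lstm(z)
            return out[:, -1, :]         # (batch, 2*hidden) feature vector

    class TwoStreamHAR(nn.Module):
        def __init__(self, skel_channels=36, sens_channels=12, n_classes=17):
            super().__init__()
            self.skeleton_stream = CNNBiLSTMStream(skel_channels)
            self.sensor_stream = CNNBiLSTMStream(sens_channels)
            self.classifier = nn.Linear(2 * 64, n_classes)    # feature-level fusion

        def forward(self, skel, sens):
            fused = torch.cat([self.skeleton_stream(skel),
                               self.sensor_stream(sens)], dim=1)
            return self.classifier(fused)

    model = TwoStreamHAR()
    logits = model(torch.randn(8, 100, 36), torch.randn(8, 100, 12))
    print(logits.shape)  # torch.Size([8, 17])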
Procedia PDF Downloads 101
25593 Improving the Residence Time of a Rectangular Contact Tank by Varying the Geometry Using Numerical Modeling
Authors: Yamileth P. Herrera, Ronald R. Gutierrez, Carlos, Pacheco-Bustos
Abstract:
This research aims at the numerical modeling of a rectangular contact tank in order to improve the hydrodynamic behavior and the retention time of the water to be treated with the disinfecting agent. The methodology includes a hydraulic analysis of the tank to observe the fluid velocities, which will reveal low-speed areas that may promote the incubation of pathogenic agents, and high-velocity areas that may decrease the optimal contact time between the disinfecting agent and the microorganisms to be eliminated. Based on the results of the numerical model, the efficiency of the tank under the geometric and hydraulic conditions considered will be analyzed. This would allow the performance of the tank to be improved before starting a construction process, thus avoiding unnecessary costs.
Keywords: contact tank, numerical models, hydrodynamic modeling, residence time
Procedia PDF Downloads 168
25592 Experimental Study Analysis of Flow over Pickup Truck’s Cargo Area Using Bed Covers
Authors: Jonathan Rodriguez, Dominga Guerrero, Surupa Shaw
Abstract:
Automobiles are modeled in various forms, and they interact with air when in motion. Aerodynamics is the study of such interactions, where solid bodies affect the way air moves around them. The shape of a solid body affects the ease with which it moves through the air, so any additional freightage, or load, impacts a vehicle's aerodynamics. It is important to transport people and cargo safely, yet despite various safety measures, there are a large number of vehicle-related accidents. This study explores precisely the effects an automobile experiences with added cargo and covers. The addition of these items changes the original vehicle shape and the approved design for safe driving. This paper showcases the effects of the changed vehicle shape and design via experimental testing conducted on a physical 1:27-scale model and a CAD model of an F-150 pickup truck, the most common pickup truck in the United States, with differently shaped loads and weights traveling at a constant speed. The additional freightage produces unwanted drag or lift, resulting in lower fuel efficiency and unsafe driving conditions. This study employs an adjustable external shell on the F-150 pickup truck to create a controlled aerodynamic geometry that combats the detrimental effects of additional freightage. The experiments utilize colored powder, which acts as a visual medium for the interaction of air with the vehicle, to highlight the impact of the additional freight on the automobile's external shell. This is complemented by simulation models, built in Altair CFD software, of twelve cases regarding the effects of an added load on an F-150 pickup truck. This paper is an attempt toward standardizing the geometric design of the external shell, given the uniqueness of every load and its placement on the vehicle, while providing real-time data to be compared to simulation results from the existing literature.
Keywords: aerodynamics, CFD, freightage, pickup cover
Procedia PDF Downloads 168
25591 Surveying Energy Dissipation in Stepped Spillway Using Finite Element Modeling
Authors: Mehdi Fuladipanah
Abstract:
A stepped spillway includes several steps from the crest to the toe. The steps dissipate energy by distributing it along the length of the spillway and also reduce the outlet velocity. The aim of this study was to simulate a stepped spillway combined with a stilling basin-step using the Fluent model, with the turbulent free-surface flow modeled by the RNG k-ε model. The free surface of the flow was tracked with the VOF model. The velocity and the depth of the flow at the tailwater were measured in the numerical model, and the dissipated energy was then calculated along the spillway. The results indicated that the stilling basin-step complex may increase energy dissipation in the stepped spillway. The numerical model is also suggested as an effective method of predicting the circulating and complicated flows in stepped spillways.
Keywords: stepped spillway, Fluent model, VOF model, k-ε model, energy distribution
Procedia PDF Downloads 372
25590 Modeling Child Development Factors for the Early Introduction of ICTs in Schools
Authors: K. E. Oyetade, S. D. Eyono Obono
Abstract:
One of the fundamental characteristics of Information and Communication Technology (ICT) has been the ever-changing nature of continuously released ICT models, with their impact on the academic, social, and psychological benefits of introducing ICT in schools. However, there seems to be a growing concern about its negative impact on students when introduced early in schools for teaching and learning. This study aims to design a model of the child development factors affecting the early introduction of ICTs in schools, in an attempt to improve the understanding of child development and of the introduction of ICTs in schools. The proposed model is based on a sound theoretical framework and was designed following a literature review of child development theories and child development factors. The child development theoretical framework that best fitted all the child development factors was then chosen as the basis for the proposed model. This study hence found that Jean Piaget's cognitive development theory is the most adequate theoretical framework for modeling child development factors for ICT introduction in schools.
Keywords: child development factors, child development theories, ICTs, theory
Procedia PDF Downloads 413