Search results for: approximate computing
732 Optimization of Robot Motion Planning Using Biogeography-Based Optimization (BBO)
Authors: Jaber Nikpouri, Arsalan Amralizadeh
Abstract:
In robotic manipulators, the trajectory should be optimal so that the torque of the robot can be minimized in order to save power. This paper presents an optimal path planning scheme for a robotic manipulator. Recently, techniques based on the metaheuristics of natural computing, mainly evolutionary algorithms (EA), have been successfully applied to a large number of robotic applications. In this paper, an improved BBO algorithm is used to minimize the objective function in the presence of different obstacles. The simulations show that the proposed optimal path planning method has satisfactory performance.
Keywords: biogeography-based optimization, path planning, obstacle detection, robotic manipulator
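To make the metaheuristic concrete, the core migration-and-mutation loop of biogeography-based optimization can be sketched as follows. This is a generic illustration on a stand-in objective (a sphere function), not the authors' improved BBO or their torque objective; the population size, rates, and bounds are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bbo_minimize(cost, dim=4, pop=20, iters=100, pmut=0.05, lo=-5.0, hi=5.0):
    """Minimal BBO sketch: habitats are candidate solutions; fitter habitats
    have higher emigration rates and share features with less fit ones."""
    H = rng.uniform(lo, hi, size=(pop, dim))
    for _ in range(iters):
        order = np.argsort([cost(h) for h in H])
        H = H[order]                              # best habitat first
        ranks = np.arange(pop)
        lam = (ranks + 1) / pop                   # immigration rate: worse -> higher
        mu = 1.0 - lam                            # emigration rate: better -> higher
        new = H.copy()
        for i in range(pop):
            for d in range(dim):
                if rng.random() < lam[i]:         # habitat i immigrates feature d
                    j = rng.choice(pop, p=mu / mu.sum())
                    new[i, d] = H[j, d]           # ...from a habitat picked by mu
                if rng.random() < pmut:           # random mutation keeps diversity
                    new[i, d] = rng.uniform(lo, hi)
        new[0] = H[0]                             # elitism: never lose the best
        H = new
    return min(H, key=cost)

best = bbo_minimize(lambda x: float(np.sum(x ** 2)))
print(best)
```

In a path planning setting, the "habitat" vector would encode joint angles or via-points, and `cost` would evaluate torque or path length plus obstacle penalties.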
Procedia PDF Downloads 301
731 Geochemistry Identification of Volcanic Rocks Product of Krakatau Volcano Eruption for Catastrophic Eruption Mitigation Planning
Authors: Agil Gemilang Ramadhan, Novian Triandanu
Abstract:
Since its first appearance above sea level in 1929, the Anak Krakatau volcano has grown relatively quickly; over the 80 years up to 2010, it reached a height of 320 meters above sea level. A catastrophic explosive eruption could happen again if the chemical composition of the erupted rocks changes from alkaline magma to acidic magma. Until now, Anak Krakatau has remained quite active, as evidenced by the frequency of eruptions that produce ash- to bomb-sized pyroclastic deposits. The purpose of this study was to identify changes in the rock geochemistry of each eruption of Anak Krakatau, in order to track changes in the silica content of the magma, which governs the type of volcanic eruption. The results of this study are presented as diagrams of the changing chemical composition of Anak Krakatau rocks, with the change in silica content of each eruption illustrated on a graph. If the silica percentage increases consistently, and this increase of a few percent is extrapolated over time, the approximate time at which the silica content reaches 68% (an acidic composition that would produce an explosive eruption) can be estimated. All factors driving an increased threat of danger to the public should be taken into account. Mitigation of a catastrophic eruption can then be planned early, so that when such a disaster occurs later, casualties can be minimized.
Keywords: Krakatau volcano, rock geochemistry, catastrophic eruption, mitigation
Procedia PDF Downloads 281
730 Analysis of Histogram Asymmetry for Waste Recognition
Authors: Janusz Bobulski, Kamila Pasternak
Abstract:
Despite many years of effort and research, the problem of waste management remains open. So far, no fully effective waste management system has been developed. Many programs and projects improve the statistics on the percentage of waste recycled every year. In these efforts, it is worth using modern computer vision techniques supported by artificial intelligence. In this article, we present a method of identifying plastic waste based on an asymmetry analysis of the histogram of the image containing the waste. The method is simple but effective (94% accuracy), which allows it to be implemented on devices with low computing power, in particular on microcomputers. Such devices can be used both at home and in waste sorting plants.
Keywords: waste management, environmental protection, image processing, computer vision
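The asymmetry measure the abstract relies on can be illustrated with histogram skewness. The sketch below is a generic illustration of the idea, not the authors' classifier; the synthetic images and the sign convention are assumptions for demonstration only.

```python
import numpy as np

def histogram_skewness(gray, bins=256):
    """Fisher skewness of the intensity histogram of a grayscale image:
    positive when mass clusters at low intensities, negative at high ones."""
    hist, edges = np.histogram(gray, bins=bins, range=(0, 256))
    centers = (edges[:-1] + edges[1:]) / 2.0
    p = hist / hist.sum()                      # normalized histogram
    mean = np.sum(centers * p)
    var = np.sum((centers - mean) ** 2 * p)
    third = np.sum((centers - mean) ** 3 * p)
    return third / (var ** 1.5 + 1e-12)

# Synthetic stand-ins for waste photos: a mostly bright frame and a mostly
# dark one; the sign of the skewness separates the two.
bright = np.full((32, 32), 240, dtype=np.uint8)
bright[:4, :4] = 10
dark = np.full((32, 32), 20, dtype=np.uint8)
dark[:4, :4] = 250
print(histogram_skewness(bright) < 0, histogram_skewness(dark) > 0)
```

Because the measure reduces to a handful of sums over a 256-bin histogram, it is cheap enough for the microcomputer deployment the abstract mentions.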
Procedia PDF Downloads 119
729 An Efficient Backward Semi-Lagrangian Scheme for Nonlinear Advection-Diffusion Equation
Authors: Soyoon Bak, Sunyoung Bu, Philsu Kim
Abstract:
In this paper, a backward semi-Lagrangian scheme combined with the second-order backward difference formula is designed to compute numerical solutions of nonlinear advection-diffusion equations. The primary aims of this paper are to remove any iteration process and to obtain an efficient algorithm with second-order accuracy in time. To achieve these objectives, we use the second-order central finite difference to approximate the diffusion term, and B-spline approximations of degree 2 and 3 for the spatial discretization. For the temporal discretization, the second-order backward difference formula is applied. To calculate the numerical solution at the starting points of the characteristic curves, we use the error-correction methodology recently developed by the authors. The proposed algorithm turns out to be completely iteration-free, which resolves the main weakness of the conventional backward semi-Lagrangian method. The adaptability of the proposed method is demonstrated by numerical simulations of Burgers' equations. Throughout these simulations, the numerical results are in good agreement with the analytic solution, and the present scheme offers better accuracy than other existing numerical schemes.
Keywords: semi-Lagrangian method, iteration-free method, nonlinear advection-diffusion equation, second-order backward difference formula
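For reference, the second-order backward difference formula applied along characteristic curves takes the following generic form. This is a sketch in the usual backward semi-Lagrangian notation; the authors' exact discretization may differ in its details.

```latex
\frac{3\,u^{n+1}(x_i) \;-\; 4\,u^{n}\!\left(x_i^{*}\right) \;+\; u^{n-1}\!\left(x_i^{**}\right)}{2\,\Delta t}
\;=\; \nu\,\partial_x^2\, u^{n+1}(x_i),
```

where $x_i^{*}$ and $x_i^{**}$ are the departure points at $t^{n}$ and $t^{n-1}$ of the characteristic curve arriving at $(x_i, t^{n+1})$, and $\nu$ is the diffusion coefficient. The departure points are precisely the quantities that the error-correction method computes without iteration.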
Procedia PDF Downloads 321
728 Co-Precipitation Method for the Fabrication of Charge-Transfer Molecular Crystal Nanocapsules
Authors: Rabih Al-Kaysi
Abstract:
When quasi-stable solutions of 9-methylanthracene (a pi-electron donor, 0.0005 M) and 1,2,4,5-tetracyanobenzene (a pi-electron acceptor, 0.0005 M) in aqueous sodium dodecyl sulfate (SDS, 0.025 M) were gently mixed, uniformly shaped rectangular charge-transfer nanocrystals precipitated out. These red-colored charge-transfer (CT) crystals were composed of a 1:1 mole ratio of acceptor/donor and are highly insoluble in water/SDS solution. The rectangular crystal morphology is semi-hollow, with symmetrical twin pockets reminiscent of nanocapsules. For a typical crop of nanocapsules, the dimensions are 21 x 6 x 0.5 microns, with an approximate hollow volume of 1.5 x 10⁵ nm³. By varying the concentration of aqueous SDS, the mixing duration, and the incubation temperature, we can control the size and volume of the nanocapsules. The initial number of CT seed nanoparticles, formed by mixing the donor and acceptor solutions, determined the number and dimensions of the nanocapsules obtained after several hours of incubation under still conditions. Prolonged mixing of the donor and acceptor solutions resulted in plenty of initial seeds and hence smaller nanocapsules; short mixing times yield less seed formation and larger micron-sized capsules. The addition of doxorubicin in situ to the quasi-stable solutions while mixing leads to the formation of CT nanocapsules with doxorubicin sealed inside. The doxorubicin can be liberated from the nanocapsules by cracking them using ultrasonication. This method can be extended to other binary CT complex crystals as well.
Keywords: charge-transfer, nanocapsules, nanocrystals, doxorubicin
Procedia PDF Downloads 213
727 Integrating Inference, Simulation and Deduction in Molecular Domain Analysis and Synthesis with Peculiar Attention to Drug Discovery
Authors: Diego Liberati
Abstract:
Standard molecular modeling is traditionally done through the Schroedinger equation, with the help of powerful atom-by-atom tools that often require high-performance computing. Here, a full portfolio of new tools is offered, conjugating statistical inference in the so-called eXplainable Artificial Intelligence framework (in the form of machine learning of understandable rules) with the more traditional modeling, simulation, and control theory of mixed logic-dynamic hybrid processes. The approach is quite general-purpose and is exemplified on a popular set of chemical-physics problems.
Keywords: understandable rules ML, k-means, PCA, PieceWise Affine Auto Regression with eXogenous input
Procedia PDF Downloads 29
726 An Inverse Approach for Determining Creep Properties from a Miniature Thin Plate Specimen under Bending
Authors: Yang Zheng, Wei Sun
Abstract:
This paper describes a new approach that can be used to convert the experimental creep deformation data obtained from a miniaturized thin-plate bending specimen test into the corresponding uniaxial data, based on an inverse application of the reference stress method. The geometry of the thin plate is fully defined by the span of the support, l, the width, b, and the thickness, d. Firstly, analytical solutions for the steady-state, load-line creep deformation rate of thin plates obeying Norton's power law under plane stress (b → 0) and plane strain (b → ∞) conditions were obtained, from which it can be seen that the load-line deformation rate under plane-stress conditions is much higher than that under plane-strain conditions. Since analytical solutions are not available for plates with arbitrary b-values, finite element (FE) analyses are used to obtain the solutions. Based on the FE results obtained for various b/l ratios and creep exponents, n, as well as the analytical solutions under plane stress and plane strain conditions, approximate numerical solutions for the deformation rate are obtained by curve fitting. Using these solutions, a reference stress method is utilised to establish the conversion relationships between the applied load and the equivalent uniaxial stress, and between the creep deformations of the thin plate and the equivalent uniaxial creep strains. Finally, the accuracy of the empirical solution was assessed by using a set of “theoretical” experimental data.
Keywords: bending, creep, thin plate, materials engineering
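The quantities involved can be written compactly. Norton's power law and a generic bending-type reference stress take the following form; the dimensionless factor $\eta$ here is a hypothetical placeholder for the geometry-dependent constant that the paper fits from FE results, not the paper's actual expression.

```latex
\dot{\varepsilon}^{c} = B\,\sigma^{n},
\qquad
\sigma_{\mathrm{ref}} = \eta\,\frac{F\,l}{b\,d^{2}},
```

so that the load-line deformation rate scales as $\dot{\Delta} \propto B\,\sigma_{\mathrm{ref}}^{\,n}$, which is what allows measured plate deflection rates to be converted back into equivalent uniaxial creep strain rates.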
Procedia PDF Downloads 474
725 Efficient Wind Fragility Analysis of Concrete Chimney under Stochastic Extreme Wind Incorporating Temperature Effects
Authors: Soumya Bhattacharjya, Avinandan Sahoo, Gaurav Datta
Abstract:
Wind fragility analysis of chimneys is often carried out disregarding the temperature effect. However, the combined effect of wind and temperature is the most critical limit state for chimney design. Hence, in the present paper, an efficient fragility analysis for a concrete chimney is explored under the combined wind and temperature effect. Wind time histories are generated from Davenport's power spectral density function using the weighted amplitude wave superposition technique. Fragility analysis is often carried out in a full Monte Carlo simulation framework, which requires extensive computational time. Thus, in the present paper, an efficient adaptive metamodelling technique is adopted to judiciously approximate the limit state function, which is subsequently used in the simulation framework. This saves substantial computational time and makes the approach computationally efficient. Uncertainty in wind speed, wind-load-related parameters, and resistance-related parameters is considered. The results of the full simulation approach, the conventional metamodelling approach, and the proposed adaptive metamodelling approach are compared, and the effect of disregarding temperature in wind fragility analysis is highlighted.
Keywords: adaptive metamodelling technique, concrete chimney, fragility analysis, stochastic extreme wind load, temperature effect
Procedia PDF Downloads 214
724 Meat Products Demand in Oyo West Local Government: An Application of Almost Ideal Demand System (LA/AIDS)
Authors: B. A. Adeniyi, S. A. Daud, O. Amao
Abstract:
The study investigates consumer demand for meat products in Oyo West Local Government using a linear approximate almost ideal demand system (LA/AIDS). The questions addressed by the study include: first, what type and quantity of meat products are available to households, and what is their demand pattern? Second, what factors affect the demand pattern for meat products and the proportion of income spent on them? For this purpose, cross-sectional data were collected from 156 households in the study area and analyzed to reveal the functional relationship between meat product consumption and selected socio-economic variables of the household. Results indicated that per capita meat consumption increased as household income and education increased, but decreased with age. It was also found that males tend to consume more meat products than their female counterparts, and that an increase in household size first increases per capita meat consumption but later decreases it. Price also greatly influences the demand pattern for meat products. The elasticities computed from the regression analysis revealed that the own-price elasticities for all meat products were negative, indicating that they were normal products, while the cross-price and expenditure elasticities were positive, which further confirmed that meat products were normal and substitute products. This study therefore concludes that the relevance of these variables poses a great challenge to policy makers and the government, in the sense that more cost-effective methods of meat production technology have to be devised in order to make the consumption of meat products more affordable.
Keywords: meat products, consumption, animal production, technology
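As a sketch of how such elasticities are typically derived from LA/AIDS estimates, the snippet below applies one common linear approximation using budget shares w, price coefficients gamma, and expenditure coefficients beta. The coefficient values are purely illustrative, not the study's estimates.

```python
def aids_elasticities(w, gamma, beta):
    """Approximate Marshallian price and expenditure elasticities for LA/AIDS:
    e_ij = -delta_ij + (gamma_ij - beta_i * w_j) / w_i,  eta_i = 1 + beta_i / w_i."""
    n = len(w)
    price = [[(-1.0 if i == j else 0.0) + (gamma[i][j] - beta[i] * w[j]) / w[i]
              for j in range(n)] for i in range(n)]
    expenditure = [1.0 + beta[i] / w[i] for i in range(n)]
    return price, expenditure

# Two goods with illustrative coefficients: own-price elasticities come out
# negative (normal goods) and cross-price elasticities positive (substitutes),
# matching the qualitative pattern the abstract reports.
w = [0.6, 0.4]
gamma = [[-0.05, 0.05], [0.05, -0.05]]
beta = [0.02, -0.02]
price, expenditure = aids_elasticities(w, gamma, beta)
print(price[0][0], expenditure[0])
```
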
Procedia PDF Downloads 247
723 Thermal and Hydraulic Design of Shell and Tube Heat Exchangers
Authors: Ahmed R. Ballil
Abstract:
Heat exchangers are devices used to transfer heat between two fluids. These devices are utilized in many engineering and industrial applications, such as heating, cooling, condensation, and boiling processes. The fluids may be in direct contact (mixed), or they may be separated by a solid wall to avoid mixing. In the present paper, an interactive computer-aided design of shell and tube heat exchangers is developed using Visual Basic code as a framework. The design is based on the Bell-Delaware method, one of the best-known methods reported in the literature for the design of shell and tube heat exchangers. Physical properties of both the tube-side and shell-side fluids are evaluated internally by calling on a large data bank of more than a hundred fluid compounds, which contributes to the accuracy of the design. The international system of units is used throughout the developed computer program. The design has the added feature of performing modifications based upon a preset design criterion, such that an optimum design is obtained while satisfying constraints set either by the user or by the method itself. The code is also capable of estimating the approximate cost of the heat exchanger based on the surface area predicted by the program. Finally, the thermal and hydraulic design code is tested for accuracy and consistency against existing, approved designs of shell and tube heat exchangers.
Keywords: Bell-Delaware method, heat exchangers, shell and tube, thermal and hydraulic design
Procedia PDF Downloads 148
722 A Unified Approach for Digital Forensics Analysis
Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles
Abstract:
Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the subsequent digital footprints that exist, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT), databases, drones, and cloud computing services), the heterogeneity and volume of data, forensic tool capability, and the investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and many data sources have few or no specialist forensic tools. Increasingly, it also becomes essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner, enabling an investigator to answer high-level questions of the data in a timely manner without having to trawl through the data and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling investigators to investigate cases more efficiently. The paper presents a systematic analysis of major crime categories and identifies what forensic analyses could be used.
For example, in a child abduction, an investigation team might have evidence from a range of sources, including computing devices (mobile phone, PC), CCTV (potentially a large number of cameras), ISP records, and mobile network cell tower data, in addition to third-party databases such as the National Sex Offender registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to digital forensics in a technology-, application-, and data-agnostic manner, providing powerful and automated forensic analysis.
Keywords: digital forensics, evidence correlation, heterogeneous data, forensics tool
Procedia PDF Downloads 196
721 Radionuclide Contents and Exhalation Studies in Soil Samples from Sub-Mountainous Region of Jammu and Kashmir
Authors: Manpreet Kaur
Abstract:
The external and internal exposure in outdoor and indoor environments can be significantly gauged from natural radionuclides. It is therefore important to estimate the level of radionuclide contents in the soil samples of any area and the risks associated with them. The rate of radon emanating from soil is also a prominent parameter for assessing environmental radon levels. In the present study, the natural radionuclide contents, viz. ²³²Th, ²³⁸U and ⁴⁰K, and the radon/thoron exhalation rates were evaluated using a thallium-doped sodium iodide gamma radiation detector and the advanced Smart Rn Duo technique on soil samples from 30 villages of Jammu district, Jammu and Kashmir, India. The radon flux rate was also measured using the surface chamber technique. The results obtained with the two different methods were compared to investigate the cause of the emanation factor in the soil profile. The radon mass exhalation rate in the soil samples varied from 15 ± 0.4 to 38 ± 0.8 mBq kg⁻¹ h⁻¹, while the thoron surface exhalation rate varied from 90 ± 22 to 4880 ± 280 Bq m⁻² h⁻¹. The mean value of the radium equivalent activity (99 ± 27 Bq kg⁻¹) was well within the admissible limit of 370 Bq kg⁻¹ suggested by the Organization for Economic Cooperation and Development (2009) report. Various parameters related to radiological hazards were also calculated, and all were found to be well below the safe limits given by various organizations. The outcomes indicate that, as far as the health risks associated with these radionuclide contents are concerned, the region is safe.
Keywords: absorbed dose rate, exhalation rate, human health, radionuclide
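The radium equivalent activity quoted in the abstract follows the standard weighted-sum definition. A quick computation with illustrative activity concentrations (not the study's measured values):

```python
def radium_equivalent(c_ra, c_th, c_k):
    """Ra_eq (Bq/kg) = C_Ra + 1.43*C_Th + 0.077*C_K, the standard index
    weighting 226Ra (from the 238U series), 232Th, and 40K activities."""
    return c_ra + 1.43 * c_th + 0.077 * c_k

# Illustrative soil activities (Bq/kg); the result is below the 370 Bq/kg limit.
print(radium_equivalent(30.0, 40.0, 400.0))  # 118.0
```

The weights reflect the assumption that 370 Bq/kg of ²²⁶Ra, 259 Bq/kg of ²³²Th, and 4810 Bq/kg of ⁴⁰K each produce the same gamma dose rate.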
Procedia PDF Downloads 136
720 Semirings of Graphs: An Approach Towards the Algebra of Graphs
Authors: Gete Umbrey, Saifur Rahman
Abstract:
Graphs are found to be highly capable structures in computing, and their abstract structures have been applied in specific computations and algorithms, such as phase-encoding controllers, processor microcontrollers, and the synthesis of CMOS switching networks. Motivated by these works, we develop an independent approach to study semiring structures and various properties by defining binary operations which, in fact, seem analogous to an existing definition in some sense, but with a different approach. This work emphasizes specifically the construction of semigroup and semiring structures on the set of undirected graphs, and their properties are investigated therein. It is expected that the investigation done here may have interesting applications in theoretical computer science, networking, and decision making, and also in the joining of two network systems.
Keywords: graphs, join and union of graphs, semiring, weighted graphs
Procedia PDF Downloads 148
719 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R
Authors: Jaya Mathew
Abstract:
Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, with R Services on premises or in the cloud, users can work with R at scale while the data stays in place.
Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R
Procedia PDF Downloads 379
718 Overcoming 4-to-1 Decryption Failure of the Rabin Cryptosystem
Authors: Muhammad Rezal Kamel Ariffin, Muhammad Asyraf Asbullah
Abstract:
The square root modulo problem is a known primitive in designing asymmetric cryptosystems; it was first attempted by Rabin. The decryption failure of the Rabin cryptosystem, caused by its 4-to-1 decryption output, is overcome efficiently in this work. The proposed scheme to overcome the decryption failure issue (known as the AAβ-cryptosystem) is constructed using a simple mathematical structure; it has low computational requirements and would enable communication devices with low computing power to deploy secure communication procedures efficiently.
Keywords: Rabin cryptosystem, 4-to-1 decryption failure, square root modulo problem, integer factorization problem
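The 4-to-1 ambiguity the abstract addresses can be seen directly: with n = pq and p, q ≡ 3 (mod 4), Rabin decryption recovers four square roots of the ciphertext, only one of which is the plaintext. A toy-sized sketch of the classical scheme (not the AAβ construction itself), with deliberately tiny primes:

```python
def rabin_decrypt(c, p, q):
    """Return the four square roots of c modulo n = p*q (for p, q = 3 mod 4),
    illustrating the 4-to-1 decryption ambiguity of the Rabin cryptosystem."""
    n = p * q
    mp = pow(c, (p + 1) // 4, p)          # square root of c modulo p
    mq = pow(c, (q + 1) // 4, q)          # square root of c modulo q
    yp = pow(p, -1, q)                    # p^{-1} mod q (Python 3.8+)
    yq = pow(q, -1, p)                    # q^{-1} mod p
    r = (yq * q * mp + yp * p * mq) % n   # Chinese remainder combination
    s = (yq * q * mp - yp * p * mq) % n
    return {r, n - r, s, n - s}

p, q = 7, 11                              # toy primes, both congruent to 3 mod 4
m = 32                                    # plaintext
c = m * m % (p * q)                       # Rabin encryption: c = m^2 mod n
roots = rabin_decrypt(c, p, q)
print(sorted(roots))                      # m = 32 is only one of four candidates
```

A practical scheme must disambiguate which of the four roots is the message, which is exactly the failure mode the AAβ-cryptosystem is designed to eliminate.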
Procedia PDF Downloads 475
717 The History of Sambipitu Formation Temperature during the Early Miocene Epoch at Kali Ngalang, Nglipar, Gunung Kidul Regency
Authors: R. Harman Dwi, Ryan Avirsa, P. Abraham Ivan
Abstract:
An understanding of past, present, and future temperatures can be obtained by analyzing the abundance of fossil foraminifera. This research was conducted in the Sambipitu Formation, along the Ngalang River, Nglipar, Gunung Kidul Regency. The research method is divided into three stages: 1) literature study, building on previous researchers' work; 2) spatial work, with observation and sampling every 5-10 meters; and 3) descriptive work, analyzing samples of 10 grams each washed with 30% peroxide, followed by biostratigraphic analysis, paleotemperature analysis using the abundance of fossils, diversity analysis using the Simpson diversity index method, and comparison with current temperature data. Two phases are distinguished: the appearance of Globorotalia menardii and Pulleniatina obliqueculata points to a tropical phase, while the appearance of Globigerinoides ruber and Orbulina universa fossils indicates a subtropical phase. The paleotemperature based on the appearance of Globorotalia menardii, Globigerinoides trilobus, Globigerinoides ruber, Orbulina universa, and Pulleniatina obliqueculata points to a warm water area (average surface water temperature of approximately 25°C).
Keywords: abundance, biostratigraphy, Simpson diversity index method, paleotemperature
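The diversity measure named in the method can be computed directly from specimen counts per species. A minimal sketch with hypothetical foraminifera counts (not the study's data):

```python
def simpson_diversity(counts):
    """Simpson's diversity index D = 1 - sum n_i(n_i - 1) / (N(N - 1)),
    where n_i is the count of species i and N the total specimen count."""
    total = sum(counts)
    return 1.0 - sum(n * (n - 1) for n in counts) / (total * (total - 1))

print(simpson_diversity([10, 10, 10]))  # evenly spread sample: high diversity
print(simpson_diversity([28, 1, 1]))    # dominated by one species: low diversity
```

Higher D means a more even assemblage, which is one of the signals used here alongside the indicator species to characterize each stratigraphic interval.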
Procedia PDF Downloads 172
716 Land Cover Mapping Using Sentinel-2, Landsat-8 Satellite Images, and Google Earth Engine: A Study Case of the Beterou Catchment
Authors: Ella Sèdé Maforikan
Abstract:
Accurate land cover mapping is essential for effective environmental monitoring and natural resources management. This study focuses on assessing the classification performance of two satellite datasets and evaluating the impact of different input feature combinations on classification accuracy in the Beterou catchment, situated in the northern part of Benin. Landsat-8 and Sentinel-2 images from June 1, 2020, to March 31, 2021, were utilized. Employing the Random Forest (RF) algorithm on Google Earth Engine (GEE), a supervised classification categorized the land into five classes: forest, savannas, cropland, settlement, and water bodies. GEE was chosen due to its high-performance computing capabilities, mitigating computational burdens associated with traditional land cover classification methods. By eliminating the need for individual satellite image downloads and providing access to an extensive archive of remote sensing data, GEE facilitated efficient model training on remote sensing data. The study achieved commendable overall accuracy (OA), ranging from 84% to 85%, even without incorporating spectral indices and terrain metrics into the model. Notably, the inclusion of additional input sources, specifically terrain features like slope and elevation, enhanced classification accuracy. The highest accuracy was achieved with Sentinel-2 (OA = 91%, Kappa = 0.88), slightly surpassing Landsat-8 (OA = 90%, Kappa = 0.87). This underscores the significance of combining diverse input sources for optimal accuracy in land cover mapping. The methodology presented herein not only enables the creation of precise, expeditious land cover maps but also demonstrates the prowess of cloud computing through GEE for large-scale land cover mapping with remarkable accuracy. The study emphasizes the synergy of different input sources to achieve superior accuracy. 
As a future recommendation, the application of Light Detection and Ranging (LiDAR) technology is proposed to enhance vegetation type differentiation in the Beterou catchment. Additionally, a cross-comparison between Sentinel-2 and Landsat-8 for assessing long-term land cover changes is suggested.
Keywords: land cover mapping, Google Earth Engine, random forest, Beterou catchment
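The classification workflow (a random forest over spectral bands plus terrain features, evaluated by overall accuracy) can be sketched locally, with scikit-learn standing in for GEE's RF implementation. The synthetic data, class means, and forest size below are illustrative assumptions, not the study's setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

class_names = ["forest", "savanna", "cropland", "settlement", "water"]
rng = np.random.default_rng(42)
# Synthetic "pixels": four spectral bands plus slope and elevation, with each
# class drawn around its own mean so the toy problem is cleanly separable.
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(100, 6)) for i in range(5)])
y = np.repeat(np.arange(5), 100)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"overall accuracy: {acc:.2f}")
print("first test pixel classified as:", class_names[clf.predict(X_te[:1])[0]])
```

On GEE the same idea runs server-side (`ee.Classifier.smileRandomForest`) over image composites, which is what avoids downloading the imagery at all.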
Procedia PDF Downloads 63
715 Description of a Structural Health Monitoring and Control System Using Open Building Information Modeling
Authors: Wahhaj Ahmed Farooqi, Bilal Ahmad, Sandra Maritza Zambrano Bernal
Abstract:
In view of structural engineering, the monitoring of structural responses over time is of great importance with respect to recent developments in construction technologies. Recently, advanced computing tools have enabled researchers to better execute structural health monitoring (SHM) and control systems. In the last decade, building information modeling (BIM) has substantially enhanced the workflow of planning and operating engineering structures. Typically, building information can be stored and exchanged via model files based on the Industry Foundation Classes (IFC) standard. In this study, a modeling approach for the semantic modeling of SHM and control systems is integrated into the BIM methodology using the IFC standard. For validation of the modeling approach, a laboratory test structure, a four-story shear frame, is modeled using a conventional BIM software tool. An IFC schema extension is applied to describe information related to the monitoring and control of a prototype SHM and control system installed on the laboratory test structure. The SHM and control system is described by a semantic model applying the Unified Modeling Language (UML), and the semantic model is subsequently mapped into the IFC schema. The test structure is composed of four aluminum slabs, and the plate-to-column connections are fully fixed. In the center of the top story, a semi-active tuned liquid column damper (TLCD) is installed; the TLCD is used to reduce structural responses in the context of dynamic vibration and displacement. The wireless prototype SHM and control system is composed of wireless sensor nodes. For testing the SHM and control system, the acceleration response is automatically recorded by the sensor nodes equipped with accelerometers and analyzed using embedded computing.
As a result, SHM and control systems can be described within open BIM; dynamic responses and information on damage can be stored, documented, and exchanged on the formal basis of the IFC standard.
Keywords: structural health monitoring, open building information modeling, industry foundation classes, unified modeling language, semi-active tuned liquid column damper, nondestructive testing
Procedia PDF Downloads 151
714 Erectile Dysfunction among Bangladeshi Men with Diabetes
Authors: Shahjada Selim
Abstract:
Background: Erectile dysfunction (ED) is an important impediment to men's quality of life. ED is approximately three times more common in diabetic than in non-diabetic men, and diabetic men develop ED earlier than age-matched non-diabetic subjects. Glycemic control and other factors may contribute to the development or deterioration of ED. Aim: The aim of the study was to determine the prevalence of ED and its risk factors in men with type 2 diabetes (T2DM) in Bangladesh. Methods: During 2013-2014, 3980 diabetic men aged 30-69 years were interviewed at the out-patient departments of seven diabetic centers in Dhaka, using the validated Bengali version of the International Index of Erectile Function (IIEF) questionnaire for evaluation of baseline erectile function (EF). The indexes indicate a very high correlation between the items, and the questionnaire is consistently reliable. Data were analyzed with the Chi-squared (χ²) test using SPSS software. P ≤ 0.05 was considered significant. Results: ED was found in 2046 (53.98%) of 3790 T2DM men. The prevalence of ED increased with age, from 10.5% in men aged 30-39 years to 33.6% in those aged over 60 years (P < 0.001). The prevalence of ED was lower in patients with reported diabetes lasting ≤ 5 years (26.4%) than in those with diabetes of 6-11 years (35.3%) or of 12-30 years (42.5%, P < 0.001). ED also increased significantly in those with poor glycemic control: the prevalence of ED in patients with good, fair, and poor glycemic control was 22.8%, 42.5%, and 47.9%, respectively (P = 0.004). Treatment modalities (medical nutrition therapy, oral agents, insulin, and insulin plus oral agents) had a significant association with ED and its severity (P < 0.001). Conclusion: The prevalence of ED is very high among T2DM men in Bangladesh, and the burden can be reduced by improving glycemic status. Glycemic control, duration of diabetes, treatment modalities, and increasing age are associated with ED.
Keywords: erectile dysfunction, diabetes, men, Bangladesh
Procedia PDF Downloads 265
713 Ray Tracing Modified 3D Image Method Simulation of Picocellular Propagation Channel Environment
Authors: Fathi Alwafie
Abstract:
In this paper, we present a simulation of the propagation characteristics of a picocellular propagation channel environment. The first aim has been to find a correct description of the environment for the received wave. The result of the first investigations is that the indoor wave environment changes significantly as the electric parameters of the construction materials change. A modified 3D ray-tracing image method tool has been utilized for the coverage prediction. A detailed analysis of the dependence of the indoor wave on the wide-band characteristics of the channel, namely the root mean square (RMS) delay spread and the mean excess delay, is also presented.
Keywords: propagation, ray tracing, network, mobile computing
Procedia PDF Downloads 400
712 A Novel Combination Method for Computing the Importance Map of Image
Authors: Ahmad Absetan, Mahdi Nooshyar
Abstract:
The importance map is an image-based measure and a core part of resizing algorithms. Importance measures include image gradients, saliency, and entropy, as well as high-level cues such as face detectors, motion detectors, and more. In this work, we propose a new method to calculate the importance map: the map is generated automatically using a novel combination of image edge density and the Harel saliency measurement. Experiments on different types of images demonstrate that our method effectively detects prominent areas and can be used in image resizing applications to preserve important areas while maintaining image quality.
Keywords: content-aware image resizing, visual saliency, edge density, image warping
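A toy version of such a combined importance map is sketched below, with gradient magnitude as the edge-density term and a center-weighted Gaussian standing in for the Harel (graph-based) saliency term, which is considerably more involved; the blend weight `alpha` is an assumption.

```python
import numpy as np

def importance_map(gray, alpha=0.5):
    """Toy importance map: normalized gradient magnitude (edge-density proxy)
    blended with a center-weighted prior standing in for a saliency map."""
    gy, gx = np.gradient(gray.astype(float))
    edges = np.hypot(gx, gy)
    edges /= edges.max() + 1e-12               # normalize edges to [0, 1]
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    saliency = np.exp(-(((yy - cy) / h) ** 2 + ((xx - cx) / w) ** 2) * 8.0)
    return alpha * edges + (1 - alpha) * saliency

img = np.zeros((40, 40))
img[10:30, 10:30] = 255.0                      # white square on black background
m = importance_map(img)
print(m.shape)
```

In a content-aware resizer, pixels with high map values would be protected, and seams or warps would be routed through low-value regions.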
Procedia PDF Downloads 582711 Chebyshev Wavelets and Applications
Authors: Emanuel Guariglia
Abstract:
In this paper we deal with Chebyshev wavelets. We analyze their properties by computing their Fourier transform. Moreover, we discuss the differential properties of Chebyshev wavelets due to the connection coefficients. The differential properties of Chebyshev wavelets, expressed by the connection coefficients (also called refinable integrals), are given by finite series in terms of the Kronecker delta. Moreover, we treat the p-order derivative of Chebyshev wavelets and compute its Fourier transform. Finally, we expand the mother wavelet in a Taylor series, with applications both in fractional calculus and fractal geometry.Keywords: Chebyshev wavelets, Fourier transform, connection coefficients, Taylor series, local fractional derivative, Cantor set
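For reference, one common construction of Chebyshev wavelets (the paper may use a different normalization): for level k, translation n = 1, …, 2^(k-1), and degree m,

```latex
\psi_{n,m}(t) =
\begin{cases}
2^{k/2}\,\tilde{T}_m\!\left(2^{k}t - 2n + 1\right), & \dfrac{n-1}{2^{k-1}} \le t < \dfrac{n}{2^{k-1}},\\[6pt]
0, & \text{otherwise},
\end{cases}
\qquad
\tilde{T}_m =
\begin{cases}
\tfrac{1}{\sqrt{\pi}}, & m = 0,\\[4pt]
\sqrt{\tfrac{2}{\pi}}\,T_m, & m \ge 1,
\end{cases}
```

where the $T_m$ are Chebyshev polynomials of the first kind, normalized so that the family is orthonormal with respect to the Chebyshev weight on each dyadic subinterval.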
Procedia PDF Downloads 123710 Development of the Web-Based Multimedia N-Screen Service System for Cross Platform
Authors: S. Bae, J. Shin, S. Lee
Abstract:
With the development of smart devices such as Smart TVs, smartphones, tablet PCs and laptops, interest in N-Screen services that can be cross-linked across heterogeneous devices is increasing. N-Screen refers to user-centric services that can share and continuously play multimedia contents anytime and anywhere. However, existing N-Screen systems have the limitation that the application must be implemented separately for each platform and device to provide multimedia services. To overcome this limitation, a web-based Multimedia N-Screen Service System that is independent of these different environments is proposed. The combination of web and cloud computing technologies in this study results in increased efficiency and reduced costs.Keywords: N-screen, web, cloud, multimedia
Procedia PDF Downloads 301709 Hybrid Thresholding Lifting Dual Tree Complex Wavelet Transform with Wiener Filter for Quality Assurance of Medical Image
Authors: Hilal Naimi, Amelbahahouda Adamou-Mitiche, Lahcene Mitiche
Abstract:
Image denoising has been the main problem in the area of medical imaging. The greatest challenge in image denoising is to preserve data-carrying structures such as surfaces and edges in order to achieve good visual quality. Different algorithms with different denoising performances have been proposed in previous decades. More recently, models based on deep learning have shown great promise to outperform all traditional approaches. However, these techniques are limited by the necessity of large training sample sizes and high computational costs. This research proposes a denoising approach based on the LDTCWT (Lifting Dual Tree Complex Wavelet Transform) using hybrid thresholding with a Wiener filter to enhance image quality. The LDTCWT is a lifting-based wavelet remodeling that produces complex coefficients by employing a dual tree of lifting wavelet filters to obtain the real and imaginary parts. This allows the transform to achieve approximate shift invariance and directionally selective filters, and reduces computation time (properties lacking in the classical wavelet transform). To develop this approach, a hybrid thresholding function is modeled by integrating the Wiener filter into the thresholding function.Keywords: lifting wavelet transform, image denoising, dual tree complex wavelet transform, wavelet shrinkage, wiener filter
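A minimal sketch of a hybrid rule in this spirit: soft thresholding followed by an empirical Wiener gain on the surviving coefficients. The universal threshold and the ordering of the two steps are assumptions; the paper's exact hybrid function for the LDTCWT coefficients is not reproduced here:

```python
import numpy as np

def soft_threshold(c, t):
    # Classical soft thresholding: shrink magnitudes toward zero by t.
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def wiener_shrink(c, noise_var):
    # Empirical Wiener gain per coefficient: s2 / (s2 + noise_var),
    # with the signal power s2 estimated as max(|c|^2 - noise_var, 0).
    sig = np.maximum(c ** 2 - noise_var, 0.0)
    return c * sig / (sig + noise_var)

def hybrid_denoise(coeffs, noise_sigma):
    # Hypothetical hybrid rule: soft-threshold with the universal threshold
    # sigma*sqrt(2*ln N), then refine the survivors with a Wiener gain.
    n = coeffs.size
    t = noise_sigma * np.sqrt(2.0 * np.log(n))
    return wiener_shrink(soft_threshold(coeffs, t), noise_sigma ** 2)

out = hybrid_denoise(np.array([0.1, -0.2, 5.0, -4.0]), noise_sigma=0.5)
```

Small coefficients are zeroed outright, while large coefficients are only mildly attenuated, which is the behavior a hybrid threshold/Wiener scheme aims for.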
Procedia PDF Downloads 163708 Fixed Point Iteration of a Damped and Unforced Duffing's Equation
Authors: Paschal A. Ochang, Emmanuel C. Oji
Abstract:
The Duffing’s equation is a second-order system that is very important because it is fundamental to the behaviour of higher-order systems and has applications in almost all fields of science and engineering. In the biological area, it is useful in plant stem dependence and natural frequency and in modeling Brain Crash Analysis (BCA). In engineering, it is useful in the study of damping in indoor construction and traffic lights, and to the meteorologist it is used in the prediction of weather conditions. However, most problems that occur in real life are non-linear in nature and may not have analytical solutions except approximations or simulations, so trying to find an exact explicit solution may in general be complicated and sometimes impossible. Therefore, we aim to find out whether it is possible to obtain an analytical fixed point of the non-linear ordinary equation using a fixed point analytical method. We started by exposing the scope of the Duffing’s equation and other related work on it. With a major focus on the fixed point and the fixed point iterative scheme, we tried different iterative schemes on the Duffing’s equation. We were able to identify that one can only find the fixed points of a damped Duffing’s equation and not of an undamped Duffing’s equation, because the cubic nonlinearity term is the determining factor of the Duffing’s equation. We finally arrived at results where we identified the stability of an equation that is damped, forced and second order in nature. Generally, in this research, we approximate the solution of the Duffing’s equation by converting it to a system of first-order ordinary differential equations and using a fixed point iterative approach.
This approach shows that for different versions of the (damped) Duffing’s equation we find fixed points; therefore, the order of computations and running time of applied software in all fields using the Duffing’s equation will be reduced.Keywords: damping, Duffing's equation, fixed point analysis, second order differential, stability analysis
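The conversion to a first-order system can be sketched numerically. A damped, unforced double-well Duffing equation with illustrative parameters (not the paper's) has equilibria where the restoring force vanishes, and with damping the iterated trajectory settles onto one of these fixed points:

```python
# Damped, unforced double-well Duffing equation: x'' + d*x' - x + x^3 = 0,
# rewritten as the first-order system  x' = v,  v' = -d*v + x - x^3.
# Its fixed points satisfy v = 0 and x - x^3 = 0, i.e. x = 0, +1, -1.
def settle(x0, v0, d=0.5, dt=0.01, steps=20000):
    x, v = x0, v0
    for _ in range(steps):
        v += dt * (-d * v + x - x ** 3)   # semi-implicit Euler: update v first
        x += dt * v
    return x

x_star = settle(0.5, 0.0)   # starts inside the right-hand potential well
```

Because the start point lies inside the right-hand well and the damping dissipates energy, the iteration converges to the fixed point x = 1; without the damping term the oscillation would persist and no fixed point would be reached, matching the observation in the abstract.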
Procedia PDF Downloads 292707 Using Groundwater Modeling System to Create a 3-D Groundwater Flow and Solute Transport Model for a Semiarid Region: A Case Study of the Nadhour Saouaf Sisseb El Alem Aquifer, Central Tunisia
Authors: Emna Bahri Hammami, Zammouri Mounira, Tarhouni Jamila
Abstract:
The Nadhour Saouaf Sisseb El Alem (NSSA) system comprises some of the most intensively exploited aquifers in central Tunisia. Since the 1970s, the growth in economic productivity linked to intensive agriculture in this semiarid region has been sustained by increasing pumping rates of the system’s groundwater. Exploitation of these aquifers has increased rapidly, ultimately causing their depletion. With the aim of better understanding the behavior of the aquifer system and predicting its evolution, the paper presents a finite difference model of groundwater flow and solute transport. The model is based on the Groundwater Modeling System (GMS) and was calibrated using data from 1970 to 2010. Groundwater levels observed in 1970 were used for the steady-state calibration, and groundwater levels observed from 1971 to 2010 served to calibrate the transient state. The impact of pumping discharge on the evolution of groundwater levels was studied through three hypothetical pumping scenarios. The first two scenarios produced approximate drawdowns in the aquifer heads of about 17 m and 23 m in the center of the NSSA, following increases in pumping rates of 30% and 50% from their current values, respectively. In the third scenario, pumping was stopped, which could increase groundwater reserves by about 7 Mm3/year. NSSA groundwater reserves could be improved considerably if the pumping rules were taken seriously.Keywords: pumping, depletion, groundwater modeling system GMS, Nadhour Saouaf
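As a minimal illustration of the finite-difference approach (not the GMS model itself), a one-dimensional transient confined-aquifer sketch with a single pumping sink; every parameter value is hypothetical and unrelated to the NSSA system:

```python
# 1-D transient confined-aquifer flow, explicit finite differences:
#   S * dh/dt = T * d2h/dx2 - q
# with fixed heads at both boundaries and one pumping sink in the middle.
# Stability requires T*dt / (S*dx^2) <= 0.5 (here it is 0.25).
def simulate(n=51, T=50.0, S=1e-3, dx=100.0, dt=0.05, steps=2000, q_well=1e-3):
    h = [100.0] * n                  # initial head (m), illustrative
    mid = n // 2
    for _ in range(steps):
        new = h[:]
        for i in range(1, n - 1):    # boundary heads stay fixed
            lap = (h[i + 1] - 2.0 * h[i] + h[i - 1]) / dx ** 2
            sink = q_well if i == mid else 0.0
            new[i] = h[i] + dt / S * (T * lap - sink)
        h = new
    return h

heads = simulate()
```

The head profile develops a cone of depression centered on the pumped node, the same drawdown mechanism the NSSA scenarios quantify at basin scale in three dimensions.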
Procedia PDF Downloads 222706 Geophysical Exploration of Aquifer Zones by (Ves) Method at Ayma-Kharagpur, District Paschim Midnapore, West Bengal
Authors: Mayank Sharma
Abstract:
Groundwater has been a matter of great concern in recent years due to the depletion of the water table, which has resulted from the over-exploitation of groundwater resources. Sub-surface exploration is a good way to identify the groundwater potential of an area. Thus, in order to meet the irrigation water needs of the study area, a tube well had to be installed, and a geophysical investigation was carried out to find the most suitable point for drilling and sinking a tube well that encounters an aquifer. Hence, the electrical resistivity method of geophysical exploration was used to delineate the aquifer zones of the area. The Vertical Electrical Sounding (VES) method was employed to determine the subsurface geology of the area. Seven vertical electrical soundings using the Schlumberger electrode array, with a maximum AB electrode separation of 700 m, were carried out at selected points in Ayma, Kharagpur-1 block of Paschim Midnapore district, West Bengal. The VES was done using an IGIS DDR3 resistivity meter down to an approximate depth of 160-180 m. The data were processed, analyzed and interpreted. Based on all the interpretations using the direct method, the geology of the area at the sounding points was established: two deeper clay-sand sections exist in the area, at depths of 50-70 m (resistivity range 40-60 ohm-m) and 70-160 m (resistivity range 25-35 ohm-m). These aquifers will provide a high yield of water, sufficient for the desired irrigation in the study area.Keywords: VES method, Schlumberger method, electrical resistivity survey, geophysical exploration
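For a Schlumberger array, the apparent resistivity follows from the standard geometric factor; a short sketch with a hypothetical field reading (not from the Ayma survey):

```python
import math

def schlumberger_rho_a(ab_half, mn, dv, i):
    """Apparent resistivity (ohm-m) for a Schlumberger array.

    ab_half: half current-electrode spacing AB/2 (m)
    mn:      potential-electrode spacing MN (m)
    dv:      measured potential difference (V);  i: injected current (A)
    """
    k = math.pi * (ab_half ** 2 - (mn / 2.0) ** 2) / mn  # geometric factor (m)
    return k * dv / i

# Hypothetical reading: AB/2 = 100 m, MN = 10 m, 5 mV at 100 mA.
rho = schlumberger_rho_a(ab_half=100.0, mn=10.0, dv=0.005, i=0.1)
```

A sounding curve is built by repeating this calculation as AB/2 is expanded (here up to 350 m for AB = 700 m), and the layered-earth model is then fitted to that curve.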
Procedia PDF Downloads 196705 Instructional Design Strategy Based on Stories with Interactive Resources for Learning English in Preschool
Authors: Vicario Marina, Ruiz Elena, Peredo Ruben, Bustos Eduardo
Abstract:
The development group of Educational Computing of the National Polytechnic Institute (IPN) in Mexico has been developing interactive resources at the preschool level in an effort to improve learning in the Child Development Centers (CENDI). This work describes both a didactic architecture and a strategy for teaching English with digital stories, using interactive resources available through a Web repository designed for use on mobile platforms. It will be accessible initially to 500 children and worldwide by the end of 2015.Keywords: instructional design, interactive resources, digital educational resources, story based English teaching, preschool education
Procedia PDF Downloads 472704 Support Vector Regression for Retrieval of Soil Moisture Using Bistatic Scatterometer Data at X-Band
Authors: Dileep Kumar Gupta, Rajendra Prasad, Pradeep Kumar, Varun Narayan Mishra, Ajeet Kumar Vishwakarma, Prashant K. Srivastava
Abstract:
An approach was evaluated for the retrieval of the soil moisture of a bare soil surface using bistatic scatterometer data in the angular range of 20° to 70° at VV- and HH-polarization. The microwave data were acquired by a specially designed X-band (10 GHz) bistatic scatterometer. A linear regression analysis was done between the scattering coefficients and the soil moisture content to select the most suitable incidence angle for the retrieval of soil moisture content; the 25° incidence angle was found most suitable. Support vector regression was used to approximate the function described by the input-output relationship between the scattering coefficient and the corresponding measured values of soil moisture content. The performance of the support vector regression algorithm was evaluated by comparing the observed and estimated soil moisture content using the statistical performance indices %Bias, root mean squared error (RMSE) and Nash-Sutcliffe efficiency (NSE). At HH-polarization, the values of %Bias, RMSE and NSE were found to be 2.9451, 1.0986 and 0.9214, respectively; at VV-polarization, they were 3.6186, 0.9373 and 0.9428, respectively.Keywords: bistatic scatterometer, soil moisture, support vector regression, RMSE, %Bias, NSE
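The three performance indices reported above can be sketched as follows; the sign convention for %Bias (estimated minus observed, relative to the observed sum) is an assumption, and the data are hypothetical:

```python
import math

def rmse(obs, est):
    # Root mean squared error between observed and estimated values.
    return math.sqrt(sum((o - e) ** 2 for o, e in zip(obs, est)) / len(obs))

def pct_bias(obs, est):
    # Percent bias; sign convention (estimated minus observed) is assumed.
    return 100.0 * sum(e - o for o, e in zip(obs, est)) / sum(obs)

def nse(obs, est):
    # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean.
    mean_o = sum(obs) / len(obs)
    num = sum((o - e) ** 2 for o, e in zip(obs, est))
    den = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - num / den

obs = [10.0, 15.0, 20.0, 25.0]   # hypothetical soil moisture (%)
est = [11.0, 14.0, 21.0, 24.0]   # hypothetical SVR estimates
```

In the study these indices would be computed over the full set of observed versus SVR-estimated soil moisture values at each polarization.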
Procedia PDF Downloads 428703 Exploring the Intersection Between the General Data Protection Regulation and the Artificial Intelligence Act
Authors: Maria Jędrzejczak, Patryk Pieniążek
Abstract:
The European legal reality is on the eve of significant change. In European Union law, there is talk of a “fourth industrial revolution”, driven by massive data resources linked to powerful algorithms and powerful computing capacity. This is closely linked to technological developments in the area of artificial intelligence, which have prompted analyses covering the legal environment as well as the economic and social impact, also from an ethical perspective. The discussion on the regulation of artificial intelligence is one of the most serious and widely held, at both the European Union and Member State level. The literature expects legal solutions to guarantee security for fundamental rights, including privacy, in artificial intelligence systems. There is no doubt that personal data have been increasingly processed in recent years. It would be impossible for artificial intelligence to function without processing large amounts of data (both personal and non-personal). The main driving force behind the current development of artificial intelligence is advances in computing, but also the increasing availability of data. High-quality data are crucial to the effectiveness of many artificial intelligence systems, particularly when using techniques involving model training. The use of computers and artificial intelligence technology allows for an increase in the speed and efficiency of the actions taken, but also creates security risks of an unprecedented magnitude for the data processed. The proposed regulation in the field of artificial intelligence requires analysis in terms of its impact on the regulation on personal data protection. It is necessary to determine what the mutual relationship between these regulations is and which areas of the personal data protection regulation are particularly important for processing personal data in artificial intelligence systems.
The adopted axis of considerations is a preliminary assessment of two issues: 1) which principles of data protection should be applied during the processing of personal data in artificial intelligence systems, and 2) how liability for personal data breaches in such systems is regulated. The need to change the regulations regarding the rights and obligations of data subjects and of entities processing personal data cannot be excluded. It is possible that changes will be required in the provisions regarding the assignment of liability for a breach of personal data processed in artificial intelligence systems. The research process in this case concerns the identification of areas in the field of personal data protection that are particularly important (and may require re-regulation) due to the introduction of the proposed legal regulation on artificial intelligence. The main question the authors want to answer is how the European Union regulation of data protection breaches in artificial intelligence systems is shaping up. The answer to this question will include examples to illustrate the practical implications of these legal regulations.Keywords: data protection law, personal data, AI law, personal data breach
Procedia PDF Downloads 65