Search results for: short-time quaternion offset linear canonical transform
2988 Implementation of Data Science in Field of Homologation
Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande
Abstract:
Homologation is required in many countries for the use and import of keys and ID transmitters, as well as body control modules with radio transmission. The final deliverables in the homologation of a product are certificates. Across the world of homologation, there are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract relevant data from it, such as the expiry date, approval date, etc. It is most important to obtain accurate data from the certificate, as inaccuracy may lead to missing the re-homologation of certificates, resulting in a non-compliance situation. There is scope for automating the reading of certificate data in the field of homologation. We are using deep learning as a tool for automation. We first trained a model using machine learning by providing all countries' basic data. We trained this model only once, feeding PDF and JPG files through an ETL process. The trained model will give increasingly accurate results over time. As an outcome, we obtain the expiry date and approval date of the certificate with a single click. This will eventually help to implement automation features on a broader level in the database where certificates are stored. This automation will help reduce human error to an almost negligible level.
Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract transform loading)
Procedia PDF Downloads 160
2987 Investigating Methanol Interaction on Hexagonal Ceria-BTC Microrods
Authors: Jamshid Hussain, Kuen Song Lin
Abstract:
For prospective applications, chemists and materials scientists are particularly interested in creating 3D micro/nanocomposite structures with particular shapes and unique characteristics. Ceria has recently been produced with a variety of morphologies, including one-dimensional structures (nanoparticles, nanorods, nanowires, and nanotubes). It is anticipated that this material can be used in different fields, such as catalysis, methanol decomposition, carbon monoxide oxidation, optical materials, and environmental protection. Distinct three-dimensional hydrated ceria-BTC (CeO₂-1,3,5-benzenetricarboxylic acid) microstructures were successfully synthesized via a hydrothermal route in an aqueous solution. FE-SEM images and XRD patterns reveal that the ceria-BTC framework diameter and length are approximately 1.45–2.4 and 5.5–6.5 µm, respectively, when synthesized at 130 °C and pH 2 for 72 h. It was demonstrated that the reaction conditions affected the 3D ceria-BTC architecture. Fourier transform infrared spectroscopy confirmed that the hexagonal ceria-BTC microrods comprise organic linkers, which are transformed into hierarchical ceria microrods in the presence of air at 400 °C. The Ce-O bonding of the hierarchical ceria microrod (HCM) species has a bond distance and coordination number of 2.44 and 6.89, respectively, as extracted from the EXAFS spectra. Compared to the ceria powder, the HCMs produced more oxygen vacancies and Ce³⁺, as shown by the XPS and XANES/EXAFS analyses.
Keywords: hierarchical ceria microrod, three-dimensional ceria, methanol decomposition, reaction mechanism, XANES/EXAFS
Procedia PDF Downloads 6
2986 Study of Early Diagnosis of Oral Cancer by Non-invasive Saliva-On-Chip Device: A Microfluidic Approach
Authors: Ragini Verma, J. Ponmozhi
Abstract:
The oral cavity is home to a wide variety of microorganisms that lead to various diseases and even oral cancer. Despite advances in diagnosis and detection at the initial phase, the situation has not improved much. Saliva-on-a-chip is an innovative point-of-care platform for the early diagnosis of oral cancer and other oral diseases in live and dead cells using a microfluidic device. The scientific community has faced major challenges such as real-time imaging of oral cancer microbes, achieving high throughput, and obtaining high spatiotemporal resolution. Integrated microfluidics and microscopy provide powerful approaches to studying the dynamics of oral pathology, microbe interaction, and the oral microenvironment. Here we have developed a saliva-on-chip (salivary microbe) device to monitor the effect on oral cancer. Adhesion of cancer-causing F. nucleatum subsp. nucleatum and Prevotella intermedia in the device was observed. We also observed a significant reduction in the oral cancer growth rate when mortality and morbidity were induced. These results show that this approach has the potential to transform the study of oral cancer and its early diagnosis.
Keywords: microfluidic device, oral cancer microbes, early diagnosis, saliva-on-chip
Procedia PDF Downloads 99
2985 Chlorine Pretreatment Effect on Mechanical Properties of Optical Fiber Glass
Authors: Abhinav Srivastava, Hima Harode, Chandan Kumar Saha
Abstract:
The principal ingredient of an optical fiber is quartz glass. The quality of the optical fiber decreases if impure foreign substances are attached to its preform surface. If the residual strain inside a preform is significant, it cracks with a small impact during drawing or transport. Furthermore, damage and unevenness on the surface of an optical fiber base material break the fiber during drawing. The present work shows that chlorine pretreatment enhances the mechanical properties of optical fiber glass. FTIR (Fourier-transform infrared spectroscopy) results show that chlorine gas chemically modifies the structure of the silica cladding; chlorine is known to soften glass. Metallic impurities on the preform surface likely formed volatile metal chlorides during chlorine pretreatment at elevated temperature. Chlorine also acts as a drying agent, so the preform surface is expected to be water-deficient, which presumably prevents particle adhesion on the glass surface. Weibull analysis of the long-length tensile strength shows a substantial shift in its knee. The higher dynamic fatigue n-value also indicates surface crack healing.
Keywords: mechanical strength, optical fiber glass, FTIR, Weibull analysis
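A minimal sketch of the Weibull analysis step mentioned above, using hypothetical tensile-strength data rather than the authors' measurements; the fitted shape (Weibull modulus) and scale parameters are the quantities behind the reported knee shift.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
# hypothetical tensile strengths (GPa) for untreated and chlorine-pretreated fibres
untreated = rng.weibull(3.0, 200) * 4.5
pretreated = rng.weibull(5.0, 200) * 5.0

for label, strengths in [("untreated", untreated), ("Cl2-pretreated", pretreated)]:
    # fix the location parameter at zero, as is common for strength data
    shape, loc, scale = weibull_min.fit(strengths, floc=0)
    print(f"{label}: Weibull modulus m = {shape:.2f}, scale = {scale:.2f} GPa")
```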
Procedia PDF Downloads 174
2984 Stability Analysis of Three-Lobe Journal Bearing Lubricated with Micropolar Fluids
Authors: Boualem Chetti
Abstract:
The dynamic characteristics of a three-lobe journal bearing lubricated with micropolar fluids are determined by linear stability theory. Lubricating oil containing additives and contaminants is modeled as a micropolar fluid. The modified Reynolds equation is obtained using micropolar lubrication theory, and the finite difference technique has been used to solve it. The dynamic characteristics, in terms of stiffness and damping coefficients, critical mass, and whirl ratio, are determined for various values of the material characteristic length and the coupling number. The computed results show that, compared with Newtonian fluids, micropolar fluids exhibit better stability.
Keywords: three-lobe bearings, micropolar fluid, dynamic characteristics, stability analysis
Procedia PDF Downloads 359
2983 Nurse-Patient Assignment: Case of Pediatrics Department
Authors: Jihene Jlassi, Ahmed Frikha, Wazna Kortli
Abstract:
The objectives of nurse-patient assignment are the minimization of the overall hospital cost and the maximization of nurses' preferences. This paper aims to assess nurses' satisfaction related to the implementation of patient acuity tool-based assignments. To this end, we used an integer linear program that assigns patients to nurses while balancing nurse workloads. The proposed model is then applied to the Paediatrics Department at Kasserine Hospital, Tunisia, where patients present special acuities and require high-level nursing skills and care. Numerical results suggest that the proposed nurse-patient assignment model can achieve a balanced assignment.
Keywords: nurse-patient assignment, mathematical model, logistics, pediatrics department, balanced assignment
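A minimal sketch, not the authors' exact formulation: an integer linear program (written here with the PuLP library) that assigns each patient to exactly one nurse while minimizing the heaviest acuity-weighted workload. The patient acuities and nurse list are hypothetical.

```python
import pulp

patients = {"P1": 3, "P2": 5, "P3": 2, "P4": 4, "P5": 1}   # patient -> acuity score
nurses = ["N1", "N2"]

prob = pulp.LpProblem("nurse_patient_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("assign", (nurses, list(patients)), cat="Binary")
max_load = pulp.LpVariable("max_load", lowBound=0)

prob += max_load                          # objective: minimise the heaviest workload
for p in patients:                        # every patient is assigned to exactly one nurse
    prob += pulp.lpSum(x[n][p] for n in nurses) == 1
for n in nurses:                          # each nurse's acuity-weighted load <= max_load
    prob += pulp.lpSum(patients[p] * x[n][p] for p in patients) <= max_load

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for n in nurses:
    print(n, [p for p in patients if x[n][p].value() > 0.5])
```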
Procedia PDF Downloads 145
2982 F-VarNet: Fast Variational Network for MRI Reconstruction
Authors: Omer Cahana, Maya Herman, Ofer Levi
Abstract:
Magnetic resonance imaging (MRI) is a lengthy medical scan owing to its long acquisition time. This length is mainly due to the traditional sampling theorem, which defines a lower bound on sampling. However, it is still possible to accelerate the scan by using a different approach, such as compressed sensing (CS) or parallel imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. In order to achieve that, two conditions must hold: i) the signal must be sparse in a known transform domain, and ii) the sampling must be incoherent. In addition, a nonlinear reconstruction algorithm needs to be applied to recover the signal. Despite the rapid advances in deep learning (DL), which has demonstrated tremendous success in various computer vision tasks, the field of MRI reconstruction is still at an early stage. In this paper, we present an extension of the state-of-the-art model in MRI reconstruction, VarNet. We extend VarNet with dilated convolutions at different scales, which enlarges the receptive field to capture more contextual information. Moreover, we simplified the sensitivity map estimation (SME), which contains many layers unnecessary for this task. These improvements yield significant decreases in computational cost as well as higher accuracy.
Keywords: MRI, deep learning, variational network, computer vision, compressed sensing
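A minimal sketch of the dilated-convolution idea described above, not the actual F-VarNet code; the channel count, dilation rates, and input size are assumptions. Stacking 3x3 convolutions with increasing dilation widens the receptive field without adding parameters per layer.

```python
import torch
import torch.nn as nn

class DilatedBlock(nn.Module):
    def __init__(self, channels=32, dilations=(1, 2, 4)):
        super().__init__()
        layers = []
        for d in dilations:
            # padding = dilation keeps the spatial size constant for a 3x3 kernel
            layers += [nn.Conv2d(channels, channels, kernel_size=3,
                                 padding=d, dilation=d), nn.ReLU()]
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        return x + self.body(x)   # residual connection

block = DilatedBlock()
y = block(torch.randn(1, 32, 64, 64))    # e.g. a coil-combined feature map
print(y.shape)                           # torch.Size([1, 32, 64, 64])
```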
Procedia PDF Downloads 159
2981 Mathematical Model for Defection between Two Political Parties
Authors: Abdullahi Mohammed Auwal
Abstract:
Formation of political parties and decamping from one party to another have become a common trend in Nigeria. Many party members who could not secure positions or win elections in their parties, or who are not satisfied with the trends in their party's internal democratic principles and mechanisms, change their respective parties. This paper develops and analyzes a nonlinear mathematical model for defection between two political parties using an epidemiological approach. The whole population is assumed to be constant and homogeneously mixed. Equilibria have been analytically obtained and their local and global stability discussed. Conditions for the coexistence of both political parties have been determined in a study of defections between the People's Democratic Party (PDP) and the All Progressives Congress (APC) in Nigeria, using numerical simulations to support the analytical results.
Keywords: model, political parties, defection, stability, equilibrium, epidemiology
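A minimal illustrative sketch, not the authors' exact model: a two-compartment nonlinear system in which members of party A defect to party B at a rate proportional to contact between the two memberships (an epidemiological-style mass-action term), with a constant total population. All parameter values are hypothetical.

```python
import numpy as np
from scipy.integrate import odeint

N = 1.0                  # constant, homogeneously mixed total population
beta, gamma = 0.6, 0.2   # assumed defection and return rates

def rhs(y, t):
    A, B = y
    defect = beta * A * B / N    # A -> B through contact
    back = gamma * B             # B -> A (re-defection)
    return [-defect + back, defect - back]

t = np.linspace(0, 50, 500)
sol = odeint(rhs, [0.7, 0.3], t)
print("final membership shares (A, B):", sol[-1])
```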
Procedia PDF Downloads 633
2980 Physicochemical Attributes of Pectin Hydrogel and Its Wound Healing Activity
Authors: Nor Khaizan Anuar, Nur Karimah Aziz, Tin Wui Wong, Ahmad Sazali Hamzah, Wan Rozita Wan Engah
Abstract:
The physicochemical attributes and wound-healing activity of pectin hydrogel in rat models following partial-thickness thermal injury were investigated. The pectin hydrogel was prepared by the solvent evaporation method with the aid of glutaraldehyde as a crosslinking agent and glycerol as a plasticizer. The physicochemical properties were mainly evaluated using differential scanning calorimetry (DSC) and Fourier transform infrared (FTIR) spectroscopy, while the wound-healing activity was examined through macroscopic images, wound size reduction, and histological evaluation using haematoxylin and eosin (H&E) staining over 14 days. The DSC and FTIR analyses suggested that the pectin hydrogel exhibited a higher extent of polymer-polymer interaction at the O-H functional group in comparison to the unprocessed pectin. This was indicated by the increase of the endothermic enthalpy value from 139.35 ± 13.06 J/g for unprocessed pectin to 156.23 ± 2.86 J/g for pectin hydrogel, as well as the decrease of the FTIR wavenumber corresponding to O-H from 3432.07 ± 0.49 cm⁻¹ for unprocessed pectin to 3412.62 ± 13.06 cm⁻¹ for pectin hydrogel. Rats treated with pectin hydrogel had significantly smaller wound sizes (Student's t-test, p<0.05) compared to the untreated group, starting from day 7 until day 14. H&E staining indicated that wounds that received pectin hydrogel had more fibroblasts, blood vessels, and collagen bundles on day 14 in comparison to the untreated rats.
Keywords: pectin, physicochemical, rats, wound
Procedia PDF Downloads 358
2979 Generator Subgraphs of the Wheel
Authors: Neil M. Mame
Abstract:
We consider only finite graphs without loops or multiple edges. Let G be a graph with E(G) = {e1, e2, …, em}. The edge space of G, denoted by ε(G), is a vector space over the field Z2. The elements of ε(G) are all the subsets of E(G). Vector addition is defined as X + Y = X Δ Y, the symmetric difference of sets X and Y, for X, Y ∈ ε(G). Scalar multiplication is defined as 1.X = X and 0.X = Ø for X ∈ ε(G). The set S ⊆ ε(G) is called a generating set if every element of ε(G) is a linear combination of the elements of S. For a non-empty set X ∈ ε(G), the smallest subgraph with edge set X is called the edge-induced subgraph of G, denoted by G[X]. The set EH(G) = { A ∈ ε(G) : G[A] ≅ H } denotes the uniform set of H with respect to G, and εH(G) denotes the subspace of ε(G) generated by EH(G). If εH(G) is a generating set, then we call H a generator subgraph of G. This paper gives a characterization of the generator subgraphs of the wheel that contain cycles and gives necessary conditions for the acyclic generator subgraphs of the wheel.
Keywords: edge space, edge-induced subgraph, generator subgraph, wheel
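A minimal sketch of the definitions above: elements of the edge space of a small wheel graph (hub plus a 4-cycle rim) are represented as sets of edges, with vector addition given by the symmetric difference and scalars taken from Z2.

```python
hub = 0
rim = [1, 2, 3, 4]
spokes = {frozenset({hub, v}) for v in rim}
rim_edges = {frozenset({rim[i], rim[(i + 1) % len(rim)]}) for i in range(len(rim))}
E = spokes | rim_edges                   # edge set of the wheel (hub + 4-cycle rim)

def add(X, Y):
    """Vector addition in the edge space: X + Y = X Δ Y (symmetric difference)."""
    return frozenset(X) ^ frozenset(Y)

def scale(c, X):
    """Scalar multiplication over Z2: 1.X = X, 0.X = the empty set."""
    return frozenset(X) if c % 2 == 1 else frozenset()

# two triangles of the wheel sharing the spoke {0, 2}
T1 = {frozenset({0, 1}), frozenset({1, 2}), frozenset({0, 2})}
T2 = {frozenset({0, 2}), frozenset({2, 3}), frozenset({0, 3})}
assert T1 <= E and T2 <= E
# adding them cancels the shared spoke and yields a 4-cycle on vertices 0,1,2,3
print(sorted(sorted(e) for e in add(T1, T2)))
```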
Procedia PDF Downloads 463
2978 Adaptive Motion Planning for 6-DOF Robots Based on Trigonometric Functions
Authors: Jincan Li, Mingyu Gao, Zhiwei He, Yuxiang Yang, Zhongfei Yu, Yuanyuan Liu
Abstract:
Building an appropriate motion model is crucial for the trajectory planning of robots and directly determines the operational quality. An adaptive acceleration and deceleration motion planning method based on trigonometric functions for the end-effector of 6-DOF robots in the Cartesian coordinate system is proposed in this paper. This method not only achieves smooth translational and rotational motion by constructing a continuous jerk model, but also automatically adjusts the parameters of the trigonometric functions according to the variable inputs and the kinematic constraints. Computer simulation results show that this method is correct and effective for adaptive motion planning of linear trajectories.
Keywords: kinematic constraints, motion planning, trigonometric function, 6-DOF robots
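A minimal sketch of the idea, not the authors' full 6-DOF planner: a sine-squared velocity profile has continuous jerk at both ends, and its amplitude is scaled so the move covers a requested displacement within a given velocity limit. All numerical values are hypothetical.

```python
import numpy as np

def trig_profile(distance, v_max, T, n=500):
    """Velocity profile v(t) = v_peak * sin^2(pi * t / T) over a move of duration T."""
    t = np.linspace(0.0, T, n)
    v_peak = 2.0 * distance / T          # the mean of sin^2 over [0, T] is 1/2
    if v_peak > v_max:
        raise ValueError("requested move violates the velocity constraint")
    v = v_peak * np.sin(np.pi * t / T) ** 2
    a = np.gradient(v, t)                # acceleration, smooth at both ends
    return t, v, a

t, v, a = trig_profile(distance=0.30, v_max=1.0, T=1.0)
print(f"peak velocity {v.max():.3f} m/s, peak acceleration {a.max():.3f} m/s^2")
```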
Procedia PDF Downloads 270
2977 Transverse Vibration of Non-Homogeneous Rectangular Plates of Variable Thickness Using GDQ
Abstract:
The effect of non-homogeneity on the free transverse vibration of thin rectangular plates of bilinearly varying thickness has been analyzed using the generalized differential quadrature (GDQ) method. The non-homogeneity of the plate material is assumed to arise from linear variations in the Young's modulus and density of the plate material with the in-plane coordinates x and y. Numerical results have been computed for fully clamped and fully simply supported boundary conditions. The solution procedure by means of the GDQ method has been implemented in a MATLAB code. The effect of various plate parameters has been investigated for the first three modes of vibration. A comparison of results with those available in the literature has been presented.
Keywords: rectangular, non-homogeneous, bilinear thickness, generalized differential quadrature (GDQ)
Procedia PDF Downloads 382
2976 Transient Analysis of Central Region Void Fraction in a 3x3 Rod Bundle under Bubbly and Cap/Slug Flows
Authors: Ya-Chi Yu, Pei-Syuan Ruan, Shao-Wen Chen, Yu-Hsien Chang, Jin-Der Lee, Jong-Rong Wang, Chunkuan Shih
Abstract:
This study analyzed the transient signals of the central region void fraction of air-water two-phase flow in a 3x3 rod bundle. Experimental tests were carried out utilizing a vertical rod bundle test section along with a set of air-water supply/flow control systems, and the transient signals of the central region void fraction were collected through electrical conductivity sensors as well as visualized via high-speed photography. By converting the electrical signals, the transient void fraction can be obtained through the voltage ratios. With a fixed superficial water velocity (Jf = 0.094 m/s), two different superficial air velocities (Jg = 0.094 m/s and 0.236 m/s) were tested and presented, corresponding to bubbly flow and cap/slug flow conditions, respectively. The time-averaged central region void fraction was obtained as 0.109-0.122 with a standard deviation of 0.028 for the selected bubbly flow and 0.188-0.221 with a standard deviation of 0.101 for the selected cap/slug flow, respectively. Through Fast Fourier Transform (FFT) analysis, no clear frequency peak was found in the bubbly flow, while two dominant frequencies were identified around 1.6 Hz and 2.5 Hz in the present cap/slug flow.
Keywords: central region, rod bundles, transient void fraction, two-phase flow
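A minimal sketch of the kind of spectral check described above: locating the dominant frequency of a sampled void-fraction signal with a fast Fourier transform. The synthetic trace (with assumed sampling rate and the 1.6 Hz and 2.5 Hz components quoted above) stands in for the conductivity-probe data.

```python
import numpy as np

fs = 100.0                                    # assumed sampling rate, Hz
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(1)
# synthetic cap/slug-like signal: mean void fraction plus two periodic components
alpha = 0.2 + 0.05 * np.sin(2 * np.pi * 1.6 * t) + 0.03 * np.sin(2 * np.pi * 2.5 * t)
alpha += 0.01 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(alpha - alpha.mean())) ** 2
freqs = np.fft.rfftfreq(alpha.size, d=1 / fs)
peak = freqs[np.argmax(spectrum)]
print(f"dominant frequency = {peak:.2f} Hz")
```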
Procedia PDF Downloads 183
2975 Sampling Effects on Secondary Voltage Control of Microgrids Based on Network of Multiagent
Authors: M. J. Park, S. H. Lee, C. H. Lee, O. M. Kwon
Abstract:
This paper studies a secondary voltage control framework for microgrids based on consensus over a multiagent communication network. The proposed control is designed for a communication network with one-way links. The communication network is modeled by a directed graph. The concept of sampling is considered as the communication constraint among the distributed generators in the microgrid. To analyze the sampling effects on the secondary voltage control of the microgrids, a sufficient condition for this problem is established in terms of a linear matrix inequality (LMI) by using Lyapunov theory and some mathematical techniques. Finally, some simulation results are given to illustrate the necessity of considering sampling effects in the secondary voltage control of microgrids.
Keywords: microgrids, secondary control, multiagent, sampling, LMI
Procedia PDF Downloads 331
2974 Electrochemical Response Transductions of Graphenated-Polyaniline Nanosensor for Environmental Anthracene
Authors: O. Tovide, N. Jahed, N. Mohammed, C. E. Sunday, H. R. Makelane, R. F. Ajayi, K. M. Molapo, A. Tsegaye, M. Masikini, S. Mailu, A. Baleg, T. Waryo, P. G. Baker, E. I. Iwuoha
Abstract:
A graphenated–polyaniline (GR-PANI) nanocomposite sensor was constructed and used for the determination of anthracene. The direct electro-oxidation behavior of anthracene on the GR-PANI modified glassy carbon electrode (GCE) was used as the sensing principle. The results indicate that the response profile of the oxidation of anthracene on GR-PANI-modified GCE provides for the construction of sensor systems based on amperometric and potentiometric signal transductions. A dynamic linear range of 0.12–100 µM anthracene and a detection limit of 0.044 µM anthracene were established for the sensor system.
Keywords: electrochemical sensors, environmental pollutants, graphenated-polymers, polyaromatic hydrocarbon
Procedia PDF Downloads 354
2973 Novel Numerical Technique for Dusty Plasma Dynamics (Yukawa Liquids): Microfluidic and Role of Heat Transport
Authors: Aamir Shahzad, Mao-Gang He
Abstract:
Dusty plasmas have recently attracted widespread research interest. Over the last two decades, substantial efforts have been made by the scientific and technological community to investigate the transport properties, and their nonlinear behavior, of three-dimensional and two-dimensional nonideal complex (dusty plasma) liquids (NICDPLs). Different calculations have been made to sustain and utilize strongly coupled NICDPLs because of their remarkable scientific and industrial applications. Understanding the thermophysical properties of complex liquids under various conditions is of practical interest in the field of science and technology. The determination of thermal conductivity is also a demanding question for thermophysical researchers; very few results are available for this significant property. The lack of information on the thermal conductivity of dense and complex liquids at parameters relevant to industrial developments is a major barrier to quantitative knowledge of the heat flux flowing from one medium to another medium or surface. The exact numerical investigation of the transport properties of complex liquids is a fundamental research task in the field of thermophysics, as various transport data are closely related to the setup and confirmation of equations of state. Reliable knowledge of transport data is also important for an optimized design of processes and apparatus in various engineering and science fields (thermoelectric devices); in particular, the provision of precise data for the parameters of heat, mass, and momentum transport is required. One of the promising computational techniques, the homogeneous nonequilibrium molecular dynamics (HNEMD) simulation, is overviewed with special emphasis on its application to transport problems of complex liquids. This work is particularly motivated by modifying, for the first time, the heat conduction problem that leads to polynomial velocity and temperature profiles into an algorithm for investigating transport properties and their nonlinear behavior in NICDPLs. The aim of the proposed work is to implement a NEMD simulation algorithm (Poiseuille flow) and to deepen the understanding of thermal conductivity behavior in Yukawa liquids. The Yukawa system is equilibrated through the Gaussian thermostat in order to maintain a constant system temperature (canonical ensemble, NVT). The output steps are developed between 3.0×10⁵/ωp and 1.5×10⁵/ωp simulation time steps for the computation of λ data. The HNEMD algorithm shows that the thermal conductivity is dependent on the plasma parameters and that the position of its minimum, λmin, shifts toward higher Γ with an increase in κ, as expected. The new investigations give more reliable simulated data for the plasma conductivity than earlier known simulation data, generally differing from the earlier plasma λ₀ by 2%-20%, depending on Γ and κ. It has been shown that the obtained results at the normalized force field are in satisfactory agreement with various earlier simulation results. This shows that the new technique provides more accurate results with fast convergence and small size effects over a wide range of plasma states.
Keywords: molecular dynamics simulation, thermal conductivity, nonideal complex plasma, Poiseuille flow
Procedia PDF Downloads 273
2972 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards
Authors: Golnush Masghati-Amoli, Paul Chin
Abstract:
Over recent years, with rapid increases in data availability and computing power, Machine Learning (ML) techniques have been called on in a range of different industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited due to a special challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes. This is due to the fact that Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. In order to bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model has been developed at Dun & Bradstreet that is focused on blending Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in the prediction of customer risk are identified. Then, these features are employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards. Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which ends up providing an estimate of the WoE for each bin. This capability helps to build powerful scorecards for sparse cases, which cannot be achieved with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The results of the analysis show that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. Therefore, it is concluded that, with the use of the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concerns about the difficulty of explaining the models for regulatory purposes.
Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering
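A minimal sketch of the binning step referred to above, with hypothetical column names and simulated data: the conventional Weight of Evidence is computed per score bin from good/bad counts; the hybrid approach described in the abstract would instead match these bin-level values to the score distribution produced by the machine-learning model.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({"ml_score": rng.uniform(0, 1, 5000)})       # output of the ML model
# simulate a binary outcome whose bad rate rises with the score
df["bad"] = (rng.uniform(0, 1, 5000) < df["ml_score"] * 0.4).astype(int)

df["bin"] = pd.qcut(df["ml_score"], q=5, labels=False)          # five equal-population bins
grouped = df.groupby("bin")["bad"].agg(bad="sum", total="count")
grouped["good"] = grouped["total"] - grouped["bad"]

# WoE per bin: log of the ratio of the bin's share of goods to its share of bads
dist_good = grouped["good"] / grouped["good"].sum()
dist_bad = grouped["bad"] / grouped["bad"].sum()
grouped["woe"] = np.log(dist_good / dist_bad)
print(grouped[["good", "bad", "woe"]])
```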
Procedia PDF Downloads 132
2971 Analysing the Cost of Immigrants to the National Health System in Eastern Macedonia and Thrace
Authors: T. Theodosiou, P. Polychronidou, A. G. Karasavvoglou
Abstract:
In recent years, the number of immigrants in Greece has increased dramatically. Their impact on the National Health System (NHS) has not yet been thoroughly investigated. This paper analyses the cost of immigrants to the NHS hospitals of the region of Eastern Macedonia and Thrace. The data were collected from 2005 to 2011 from five different hospitals and are analysed using linear mixed effects models in order to investigate the effects of nationality and year on the cost of hospitalization and treatment. The results show that, in general, patients of Greek nationality have a higher mean cost of hospitalization than immigrants and that the cost follows an increasing trend, except for the year 2010.
Keywords: cost, Eastern Macedonia and Thrace, immigrants, national health system
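A minimal sketch of the modelling step, with hypothetical column names and simulated data in place of the hospital records: a linear mixed-effects model of hospitalization cost with fixed effects for nationality and year and a random intercept per hospital, fitted with statsmodels.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 600
df = pd.DataFrame({
    "hospital": rng.choice(["H1", "H2", "H3", "H4", "H5"], n),
    "nationality": rng.choice(["Greek", "Immigrant"], n),
    "year": rng.integers(2005, 2012, n),
})
# simulated cost with a nationality effect and a yearly trend
df["cost"] = (800 + 120 * (df["nationality"] == "Greek")
              + 15 * (df["year"] - 2005) + rng.normal(0, 100, n))

model = smf.mixedlm("cost ~ C(nationality) + year", df, groups=df["hospital"])
result = model.fit()
print(result.summary())
```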
Procedia PDF Downloads 245
2970 Conjugate Mixed Convection Heat Transfer and Entropy Generation of Cu-Water Nanofluid in an Enclosure with Thick Wavy Bottom Wall
Authors: Sanjib Kr Pal, S. Bhattacharyya
Abstract:
Mixed convection of a Cu-water nanofluid in an enclosure with a thick wavy bottom wall has been investigated numerically. A coordinate transformation method is used to transform the computational domain into an orthogonal coordinate system. The governing equations in the computational domain are solved through a pressure-correction-based iterative algorithm. The fluid flow and heat transfer characteristics are analyzed for a wide range of Richardson numbers (0.1 ≤ Ri ≤ 5), nanoparticle volume concentrations (0.0 ≤ ϕ ≤ 0.2), amplitudes (0.0 ≤ α ≤ 0.1) of the wavy thick bottom wall, and wave numbers (ω) at a fixed Reynolds number. The obtained results show that the heat transfer rate increases remarkably when nanoparticles are added. The heat transfer rate depends on the wavy wall amplitude and wave number and decreases with increasing Richardson number for a fixed amplitude and wave number. The Bejan number and the entropy generation are determined to analyze the thermodynamic optimization of the mixed convection.
Keywords: conjugate heat transfer, mixed convection, nanofluid, wall waviness
Procedia PDF Downloads 253
2969 Real Estate Trend Prediction with Artificial Intelligence Techniques
Authors: Sophia Liang Zhou
Abstract:
For investors, businesses, consumers, and governments, an accurate assessment of future housing prices is crucial to critical decisions in resource allocation, policy formation, and investment strategies. Previous studies are contradictory about the macroeconomic determinants of housing prices and have largely focused on one or two areas using point prediction. This study aims to develop data-driven models to accurately predict future housing market trends in different markets. This work studied five different metropolitan areas representing different market trends and compared three time-lag situations: no lag, 6-month lag, and 12-month lag. Linear regression (LR), random forest (RF), and artificial neural network (ANN) models were employed to model real estate prices using datasets containing the S&P/Case-Shiller home price index and 12 demographic and macroeconomic features, such as gross domestic product (GDP), resident population, and personal income, in five metropolitan areas: Boston, Dallas, New York, Chicago, and San Francisco. The data from March 2005 to December 2018 were collected from the Federal Reserve Bank, FBI, and Freddie Mac. In the original data, some factors are monthly, some quarterly, and some yearly. Thus, two methods to impute missing values, backfilling and interpolation, were compared. The models were evaluated by accuracy, mean absolute error, and root mean square error. The LR and ANN models outperformed the RF model due to RF's inherent limitations. Both the ANN and LR methods generated predictive models with high accuracy (>95%). It was found that personal income, GDP, population, and measures of debt consistently appeared as the most important factors. It was also shown that the technique used to impute missing values in the dataset and the implementation of time lag can have a significant influence on model performance and require further investigation. The best-performing models varied for each area, but the backfilled 12-month-lag LR models and the interpolated no-lag ANN models showed the best stable performance overall, with accuracies >95% for each city. This study reveals the influence of input variables in different markets. It also provides evidence to support future studies in identifying the optimal time lag and data imputation methods for establishing accurate predictive models.
Keywords: linear regression, random forest, artificial neural network, real estate price prediction
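A minimal sketch of the comparison described above, with synthetic monthly series standing in for the Case-Shiller index and macroeconomic features: 12-month-lagged predictors feed both a linear regression and a random forest, evaluated on a chronological train/test split.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)
n = 166                                   # roughly monthly data, 2005-2018
df = pd.DataFrame({
    "gdp": np.cumsum(rng.normal(0.3, 1.0, n)),
    "income": np.cumsum(rng.normal(0.2, 0.8, n)),
    "population": np.linspace(100, 110, n) + rng.normal(0, 0.1, n),
})
df["hpi"] = 150 + 0.8 * df["gdp"] + 0.5 * df["income"] + rng.normal(0, 1.5, n)

lag = 12                                  # predict the index 12 months ahead
X = df[["gdp", "income", "population"]].iloc[:-lag]
y = df["hpi"].iloc[lag:].to_numpy()
split = int(0.8 * len(X))                 # chronological split, no shuffling

for name, model in [("LR", LinearRegression()),
                    ("RF", RandomForestRegressor(n_estimators=200, random_state=0))]:
    model.fit(X.iloc[:split], y[:split])
    mae = mean_absolute_error(y[split:], model.predict(X.iloc[split:]))
    print(f"{name}: MAE = {mae:.2f}")
```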
Procedia PDF Downloads 102
2968 Synthesis and Characterization of Partially Oxidized Graphite Oxide for Solar Energy Storage Applications
Authors: Ghada Ben Hamad, Zohir Younsi, Fabien Salaun, Hassane Naji, Noureddine Lebaz
Abstract:
Graphene oxide (GO) has attracted much attention for solar energy applications. This paper reports the synthesis and characterization of partially oxidized graphite oxide (GTO). GTO was obtained by a modified Hummers method, which is based on the chemical oxidation of natural graphite. Several samples were prepared with different oxidation degrees by adjusting the amount of oxidizing agent. The effect of the oxidation degree on the chemical structure and on the morphology of GTO was determined by using Fourier transform infrared (FT-IR) spectroscopy, energy dispersive X-ray spectroscopy (EDS), and scanning electron microscopy (SEM). The thermal stability of GTO was evaluated by using a thermogravimetric analyzer (TGA) in a nitrogen atmosphere. The results indicate a high degree of oxidation of the graphite oxide for each sample, proving that the process is efficient. The GTO synthesized by the modified Hummers method shows promising characteristics. Graphene oxide (GO) obtained by exfoliation of GTO is recognized as a good candidate for thermal energy storage and will be used as a solid shell material in the encapsulation of phase change materials (PCMs).
Keywords: modified Hummers method, graphite oxide, oxidation degree, solar energy storage
Procedia PDF Downloads 117
2967 Tokenization of Blue Bonds to Scale Blue Carbon Projects
Authors: Rodrigo Buaiz Boabaid
Abstract:
Tokenization of Blue Bonds is an emerging Green Finance tool that has the potential to scale Blue Carbon Projects to fight climate change. This innovative solution has a huge potential to democratize the green finance market and catalyze innovations in the climate change finance sector. Switzerland has emerged as a leader in the Green Finance space and is well-positioned to drive the adoption of Tokenization of Blue & Green Bonds. This unique approach has the potential to unlock new sources of capital and enable global investors to participate in the financing of sustainable blue carbon projects. By leveraging the power of blockchain technology, Tokenization of Blue Bonds can provide greater transparency, efficiency, and security in the investment process while also reducing transaction costs. Investments are in line with the highest regulations and designed according to the stringent legal framework and compliance standards set by Switzerland. The potential benefits of Tokenization of Blue Bonds are significant and could transform the way that sustainable projects are financed. By unlocking new sources of capital, this approach has the potential to accelerate the deployment of Blue Carbon projects and create new opportunities for investors to participate in the fight against climate change.
Keywords: blue bonds, blue carbon, tokenization, green finance
Procedia PDF Downloads 86
2966 An Evaluation of the Impact of Epoxidized Neem Seed Azadirachta indica Oil on the Mechanical Properties of Polystyrene
Authors: Salihu Takuma
Abstract:
Neem seed oil has a high content of unsaturated fatty acids, which can be converted to epoxy fatty acids. Vegetable oil-based epoxy materials are sustainable, renewable, and biodegradable materials that are replacing petrochemical-based epoxy materials in some applications. Polystyrene is highly brittle, with limited mechanical applications. Raw neem seed oil was obtained from the National Research Institute for Chemical Technology (NARICT), Zaria, Nigeria. The oil was epoxidized at 60 °C for three hours using formic acid generated in situ. The epoxidized oil was characterized using Fourier transform infrared spectroscopy (FTIR). The disappearance of the C=C stretching peak around 3011.7 cm⁻¹ and the formation of a new absorption peak around 943 cm⁻¹ indicate the success of the epoxidation. The epoxidized oil was blended with pure polystyrene in different weight percent compositions using solution casting in chloroform. The tensile properties of the blends demonstrated that the addition of 5 wt% ENO to PS led to an increase in elongation at break but a decrease in tensile strength and modulus. This is in accordance with the common rule that plasticizers can decrease the tensile strength of a polymer.
Keywords: biodegradable, elongation at break, epoxidation, epoxy fatty acids, sustainable, tensile strength and modulus
Procedia PDF Downloads 232
2965 The Determinants of Financing to Deposit Ratio of Islamic Bank in Malaysia
Authors: Achsania Hendratmi, Puji Sucia Sukmaningrum, Fatin Fadhilah Hasib, Nisful Laila
Abstract:
The research aimed to determine the influence of the Capital Adequacy Ratio (CAR), Return on Assets (ROA), and Size on the Financing to Deposit Ratio (FDR) of Islamic banks in Malaysia, using eleven Islamic banks in Indonesia and fifteen Islamic banks in Malaysia in the period 2012 to 2016 as samples. The research used a quantitative approach, and the analysis technique was multiple linear regression. Based on the results of the t-test (partial), CAR, ROA, and Size significantly affect FDR. The results of the F-test (simultaneous) showed that CAR, ROA, and Size have a significant joint effect on FDR.
Keywords: capital adequacy ratio, financing to deposit ratio, return on assets, size
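A minimal sketch of the estimation step, with simulated data standing in for the bank panel: an ordinary least squares regression of FDR on CAR, ROA, and Size whose summary reports the partial t-tests and the simultaneous F-test discussed above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 26 * 5                                 # 26 banks x 5 years (illustrative panel size)
df = pd.DataFrame({
    "CAR": rng.normal(15, 3, n),
    "ROA": rng.normal(1.0, 0.5, n),
    "Size": rng.normal(22, 1.5, n),        # e.g. log of total assets
})
df["FDR"] = 40 + 1.2 * df["CAR"] + 5.0 * df["ROA"] + 1.5 * df["Size"] + rng.normal(0, 5, n)

result = smf.ols("FDR ~ CAR + ROA + Size", data=df).fit()
print(result.summary())                    # coefficient t-tests and overall F-statistic
```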
Procedia PDF Downloads 337
2964 Identification of EEG Attention Level Using Empirical Mode Decompositions for BCI Applications
Authors: Chia-Ju Peng, Shih-Jui Chen
Abstract:
This paper proposes a method to discriminate electroencephalogram (EEG) signals between different concentration states using empirical mode decomposition (EMD). A brain-computer interface (BCI), also called a brain-machine interface, is a direct communication pathway between the brain and an external device that bypasses the inherent pathways such as the peripheral nervous system or skeletal muscles. Attention level is a common index used as a control signal in BCI systems. The EEG signals acquired from people paying attention or in relaxation, respectively, are decomposed into a set of intrinsic mode functions (IMFs) by EMD. Fast Fourier transform (FFT) analysis is then applied to each IMF to obtain its frequency spectrum. By observing the power spectra of the IMFs, the proposed method identifies the EEG attention level between different concentration states better than the original EEG signals do. The band power of IMF3 is the most distinctive, especially in the β band, which corresponds to being fully awake and generally alert. The signal processing method and the results of this experiment pave a new way for BCI robotic systems using an attention-level control strategy. The integrated signal processing method reveals appropriate information for discriminating attention from relaxation, contributing to enhanced BCI performance.
Keywords: biomedical engineering, brain computer interface, electroencephalography, rehabilitation
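A minimal sketch of the pipeline described above, assuming the EMD-signal (PyEMD) package is available; a synthetic trace with assumed sampling rate stands in for a recorded EEG channel. The signal is decomposed into IMFs, and the beta-band (13-30 Hz) power fraction of the third IMF is reported.

```python
import numpy as np
from PyEMD import EMD

fs = 256.0                                     # assumed sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(5)
eeg = (np.sin(2 * np.pi * 10 * t)              # alpha-like component
       + 0.5 * np.sin(2 * np.pi * 20 * t)      # beta-like component
       + 0.3 * rng.standard_normal(t.size))    # noise

imfs = EMD().emd(eeg)                          # intrinsic mode functions
imf3 = imfs[2] if imfs.shape[0] >= 3 else imfs[-1]   # guard for short decompositions
freqs = np.fft.rfftfreq(imf3.size, d=1 / fs)
power = np.abs(np.fft.rfft(imf3)) ** 2
beta = (freqs >= 13) & (freqs <= 30)
print(f"IMF3 beta-band power fraction: {power[beta].sum() / power.sum():.2f}")
```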
Procedia PDF Downloads 389
2963 Micromechanism of Ionization Effects on Metal/Gas Mixing Instability at Extreme Shock Compressing Conditions
Authors: Shenghong Huang, Weirong Wang, Xisheng Luo, Xinzhu Li, Xinwen Zhao
Abstract:
Understanding material mixing induced by the Richtmyer-Meshkov instability (RMI) at extreme shock compression conditions (high-energy-density environments: P >> 100 GPa, T >> 10000 K) is of great significance in engineering and science, such as inertial confinement fusion (ICF), supersonic combustion, etc. Turbulent mixing induced by RMI is a kind of complex fluid dynamics that is closely related to the hydrodynamic conditions, thermodynamic states, material physical properties such as compressibility, strength, surface tension, and viscosity, as well as the initial perturbation on the interface. For phenomena at ordinary thermodynamic conditions (low-energy-density environments), many investigations have been conducted and much progress has been reported, while for mixing at extreme thermodynamic conditions, the evolution may be very different owing to ionization as well as large differences in material physical properties, which raises many scientific problems of academic interest. In this investigation, a first-principles-based molecular dynamics method is applied to study metal lithium and gaseous hydrogen (Li-H2) interface mixing in the micro/meso scale regime at different shock compression loading speeds ranging from 3 km/s to 30 km/s. It is found that: 1) unlike the low-speed shock compression cases, in the high-speed shock compression (>9 km/s) cases a strong acceleration of the metal/gas interface after strong shock compression is observed numerically, leading to a strong phase inversion and spike growth at a relatively larger linear rate; more specifically, the spike growth rate is observed to increase with shock loading speed, presenting a large discrepancy with available empirical RMI models; 2) ionization occurs in the shock front zone in the high-speed loading cases (>9 km/s), and an additional local electric field, induced by the inhomogeneous diffusion of electrons and nuclei behind the shock front, is observed near the metal/gas interface, leading to a large acceleration of nuclei in this zone; 3) in conclusion, the work done by the additional electric field contributes a mechanism to RMI in the micro/meso scale regime at extreme shock compression conditions, i.e., a Rayleigh-Taylor instability (RTI) is induced by the additional electric field during the RMI mixing process, and thus a larger linear growth rate of the interface spike.
Keywords: ionization, micro/meso scale, material mixing, shock
Procedia PDF Downloads 224
2962 Study of Thermal and Mechanical Properties of Ethylene/1-Octene Copolymer Based Nanocomposites
Authors: Sharmila Pradhan, Ralf Lach, George Michler, Jean Mark Saiter, Rameshwar Adhikari
Abstract:
Ethylene/1-octene copolymer was modified by incorporating three types of nanofillers that differed in their dimensionality in order to investigate the effect of filler dimensionality on mechanical properties, for instance, tensile strength and microhardness. The samples were prepared by melt mixing followed by compression molding. The microstructure of the novel material was characterized by Fourier transform infrared spectroscopy (FTIR), the X-ray diffraction (XRD) method, and transmission electron microscopy (TEM). Other important properties, such as melting, crystallization, and thermal stability, were also investigated via differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). The FTIR and XRD results showed that the composites were formed by physical mixing. The TEM results supported the homogeneous dispersion of nanofillers in the matrix. The mechanical characterization performed by tensile testing showed that the composites with the 1D nanofiller effectively reinforced the polymer. TGA results revealed that the thermal stability of pure EOC is marginally improved by the addition of nanofillers. Likewise, the melting and crystallization properties of the composites are not much different from those of the pure copolymer.
Keywords: copolymer, differential scanning calorimetry, nanofiller, tensile strength
Procedia PDF Downloads 246
2961 Analysis of Detection Concealed Objects Based on Multispectral and Hyperspectral Signatures
Authors: M. Kastek, M. Kowalski, M. Szustakowski, H. Polakowski, T. Sosnowski
Abstract:
Development of highly efficient security systems is one of the most urgent topics for science and engineering. There are many kinds of threats and many methods of prevention. It is very important to detect a threat as early as possible in order to neutralize it. One of the most challenging problems is the detection of dangerous objects hidden under a person's clothing. This problem is particularly important for the safety of airport passengers. In order to develop methods and algorithms to detect hidden objects, it is necessary to determine the thermal signatures of such objects of interest. Laboratory measurements were conducted to determine the thermal signatures of dangerous tools hidden under various clothes in different ambient conditions. The cameras used for the measurements were working in the spectral range 0.6-12.5 μm. An infrared imaging Fourier transform spectroradiometer was also used, working in the spectral range 7.7-11.7 μm. Analysis of the registered thermograms and hyperspectral datacubes has yielded the thermal signatures of two types of guns, two types of knives, and home-made explosive bombs. The determined thermal signatures will be used in the development of the image analysis methods and algorithms implemented in the proposed monitoring systems.
Keywords: hyperspectral detection, multispectral detection, image processing, monitoring systems
Procedia PDF Downloads 347
2960 Clustering Categorical Data Using the K-Means Algorithm and the Attribute’s Relative Frequency
Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami
Abstract:
Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data. Each type of data has its own specific clustering algorithm; in this context, two algorithms are used: the k-means for clustering numeric datasets and the k-modes for categorical datasets. A main problem encountered in data mining applications is clustering the categorical data that are so prevalent in real datasets. One way to achieve the clustering of categorical values is to transform the categorical attributes into numeric measures and directly apply the k-means algorithm instead of the k-modes. In this paper, it is proposed to experiment with an approach based on this idea, transforming the categorical values into numeric ones using the relative frequency of each modality in the attributes. The proposed approach is compared with a previous method based on transforming the categorical datasets into binary values. The scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that our proposed method outperforms the binary method in all cases.
Keywords: clustering, unsupervised learning, pattern recognition, categorical datasets, knowledge discovery, k-means
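A minimal sketch of the proposed encoding, using a tiny hypothetical categorical dataset: each categorical value is replaced by its relative frequency within its attribute, after which the ordinary k-means algorithm can be applied directly.

```python
import pandas as pd
from sklearn.cluster import KMeans

df = pd.DataFrame({
    "colour": ["red", "red", "blue", "green", "blue", "red"],
    "shape": ["circle", "square", "square", "circle", "circle", "square"],
})

# relative-frequency encoding: value -> (count of value) / (number of rows)
encoded = df.apply(lambda col: col.map(col.value_counts(normalize=True)))

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(encoded)
print(encoded.assign(cluster=labels))
```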
Procedia PDF Downloads 258
2959 Inconsistent Effects of Landscape Heterogeneity on Animal Diversity in an Agricultural Mosaic: A Multi-Scale and Multi-Taxon Investigation
Authors: Chevonne Reynolds, Robert J. Fletcher, Jr, Celine M. Carneiro, Nicole Jennings, Alison Ke, Michael C. LaScaleia, Mbhekeni B. Lukhele, Mnqobi L. Mamba, Muzi D. Sibiya, James D. Austin, Cebisile N. Magagula, Themba’alilahlwa Mahlaba, Ara Monadjem, Samantha M. Wisely, Robert A. McCleery
Abstract:
A key challenge for the developing world is reconciling biodiversity conservation with the growing demand for food. In these regions, agriculture is typically interspersed among other land uses, creating heterogeneous landscapes. A primary hypothesis for promoting biodiversity in agricultural landscapes is the habitat heterogeneity hypothesis. While there is evidence that landscape heterogeneity positively influences biodiversity, the application of this hypothesis is hindered by the need to determine which components of landscape heterogeneity drive these effects and at what spatial scale(s). Additionally, whether diverse taxonomic groups are similarly affected is central to determining the applicability of this hypothesis as a general conservation strategy in agricultural mosaics. Two major components of landscape heterogeneity are compositional and configurational heterogeneity. Disentangling the roles of each component is important for biodiversity conservation because each represents different mechanisms underpinning variation in biodiversity. We identified a priori independent gradients of compositional and configurational landscape heterogeneity within an extensive agricultural mosaic in north-eastern Swaziland. We then tested how bird, dung beetle, ant, and meso-carnivore diversity responded to compositional and configurational heterogeneity across six different spatial scales. To determine whether a general trend could be observed across multiple taxa, we also tested which component and spatial scale were most influential across all taxonomic groups combined. Compositional, not configurational, heterogeneity explained diversity in each taxonomic group, with the exception of meso-carnivores. Bird and ant diversity was positively correlated with compositional heterogeneity at fine spatial scales < 1000 m, whilst dung beetle diversity was negatively correlated with compositional heterogeneity at broader spatial scales > 1500 m. Importantly, because of these contrasting effects across taxa, there was no effect of either component of heterogeneity on the combined taxonomic diversity at any spatial scale. The contrasting responses across taxonomic groups exemplify the difficulty in implementing effective conservation strategies that meet the requirements of diverse taxa. To promote diverse communities across a range of taxa, conservation strategies must be multi-scaled and may involve different strategies at varying scales to offset the contrasting influences of compositional heterogeneity. A diversity of strategies is likely key to conserving biodiversity in agricultural mosaics, and we have demonstrated that a landscape management strategy that only manages for heterogeneity at one particular scale will likely fall short of management objectives.
Keywords: agriculture, biodiversity, composition, configuration, heterogeneity
Procedia PDF Downloads 260