Search results for: matrix minimization algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5768


1598 Evaluation of Gene Expression after in Vitro Differentiation of Human Bone Marrow-Derived Stem Cells to Insulin-Producing Cells

Authors: Mahmoud M. Zakaria, Omnia F. Elmoursi, Mahmoud M. Gabr, Camelia A. AbdelMalak, Mohamed A. Ghoneim

Abstract:

Many protocols have been published for differentiating human mesenchymal stem cells (MSCs) into insulin-producing cells (IPCs) that secrete insulin hormone, with the aim of treating diabetes. Our aim is to evaluate relative gene expression for each independent protocol. Human bone marrow cells were derived from three volunteers with diabetes. After expansion of the mesenchymal stem cells, the cells were differentiated by three different protocols (the one-step protocol used conophylline protein, the two-step protocol depended on trichostatin-A, and the three-step protocol started with beta-mercaptoethanol). Gene expression was evaluated by real-time PCR for pancreatic endocrine genes, transcription factors, glucose transporter, precursor markers, pancreatic enzymes, proteolytic cleavage, extracellular matrix, and cell surface protein. Insulin secretion was quantified by an immunofluorescence technique in 24-well plates. Most of the genes studied were up-regulated in the in vitro differentiated cells, and insulin production was observed in all three independent protocols. The two-step protocol showed slightly higher expression of endocrine mRNA and insulin production, making it more efficient at expressing pancreatic endocrine genes and producing insulin than the other two protocols.
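For context, relative gene expression in real-time PCR studies of this kind is commonly computed with the 2^-ΔΔCt (Livak) method. The abstract does not state its quantification procedure, so the sketch below assumes that method, and the Ct values are invented for illustration:

```python
def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Relative expression by the 2^-delta-delta-Ct (Livak) method.

    Each delta-Ct normalises the target gene's cycle threshold against a
    reference (housekeeping) gene; the fold change compares treated vs control.
    """
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2 ** -(d_ct_treated - d_ct_control)

# Hypothetical Ct values: insulin gene vs a housekeeping gene,
# in differentiated (treated) vs undifferentiated (control) cells.
fc = fold_change(24.0, 18.0, 27.0, 18.0)
```

With these invented values the differentiated cells show an 8-fold up-regulation, mirroring the kind of comparison the study reports across its three protocols.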

Keywords: mesenchymal stem cells, insulin producing cells, conophylline protein, trichostatin-A, beta-mercaptoethanol, gene expression, immunofluorescence technique

Procedia PDF Downloads 195
1597 Point-of-Interest Recommender Systems for Location-Based Social Network Services

Authors: Hoyeon Park, Yunhwan Keon, Kyoung-Jae Kim

Abstract:

Location-based social network services (LBSNs) combine location-based services with social network services (SNS). Unlike traditional SNS, LBSNs emphasize empirical elements tied to the user's actual physical location. Points of interest (POIs), the most popular spots in an area, are the most important factor in implementing an LBSN recommendation system. In this study, we recommend POIs to users in a specific area through a recommendation system using collaborative filtering. The process is as follows: first, we use different data sets based on Seoul and New York to find interesting results on human behavior. Second, based on the location-based activity information obtained from the personalized LBSNs, we devise a new rating that defines the user's preference for the area. Finally, we develop an automated rating algorithm from massive raw data using distributed systems to reduce the advertising costs of LBSNs.
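A minimal sketch of the collaborative-filtering step: user-based filtering with cosine similarity over check-in counts. The users, POIs, and counts below are invented for illustration, not drawn from the study's Seoul/New York data:

```python
import math

# Hypothetical check-in counts: user -> {POI: visits}.
checkins = {
    "u1": {"cafe": 5, "museum": 2, "park": 1},
    "u2": {"cafe": 4, "museum": 1, "gym": 3},
    "u3": {"park": 4, "gym": 2},
}

def cosine(a, b):
    """Cosine similarity between two sparse check-in vectors (dicts)."""
    common = set(a) & set(b)
    num = sum(a[k] * b[k] for k in common)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def recommend(user, k=2):
    """Score POIs the user has not visited by similarity-weighted
    check-ins of the other users; return the top-k POI names."""
    scores = {}
    for other, items in checkins.items():
        if other == user:
            continue
        sim = cosine(checkins[user], items)
        for poi, visits in items.items():
            if poi not in checkins[user]:
                scores[poi] = scores.get(poi, 0.0) + sim * visits
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

For `u1`, the only unvisited POI is the gym, so `recommend("u1")` surfaces it via the weighted check-ins of `u2` and `u3`.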

Keywords: location-based social network services, point-of-interest, recommender systems, business analytics

Procedia PDF Downloads 215
1596 SA-SPKC: Secure and Efficient Aggregation Scheme for Wireless Sensor Networks Using Stateful Public Key Cryptography

Authors: Merad Boudia Omar Rafik, Feham Mohammed

Abstract:

Data aggregation in wireless sensor networks (WSNs) greatly reduces energy consumption. The limited resources of sensor nodes make the choice of an encryption algorithm very important for securing data aggregation. Asymmetric cryptography involves large ciphertexts and heavy computations but, on the other hand, solves the key distribution problem of symmetric cryptography, which provides smaller ciphertexts and faster computations. Recent research has also shown that achieving end-to-end confidentiality and end-to-end integrity at the same time is a challenging task. In this paper, we propose SA-SPKC, a novel security protocol which addresses both security services for WSNs, and in which only the base station can verify the individual data and identify malicious nodes. Our scheme is based on stateful public key encryption (StPKE), which combines the best features of both kinds of encryption along with state in order to reduce the computation overhead.

Keywords: secure data aggregation, wireless sensor networks, elliptic curve cryptography, homomorphic encryption

Procedia PDF Downloads 275
1595 Prioritizing the Most Important Information from Contractors’ BIM Handover for Firefighters’ Responsibilities

Authors: Akram Mahdaviparsa, Tamera McCuen, Vahideh Karimimansoob

Abstract:

Fire services are responsible for protecting life, assets, and natural resources from fire and other hazardous incidents. Search and rescue in unfamiliar buildings is a vital part of firefighters’ responsibilities. Providing firefighters with precise building information in an easy-to-understand format is a potential solution for mitigating the negative consequences of fire hazards. Insufficient knowledge about a building’s indoor environment impedes firefighters’ capabilities and leads to lost property. A data-rich building information model (BIM) is a potentially useful source of three-dimensional (3D) visualization and data/information storage for fire emergency response. Therefore, the purpose of this research is to prioritize the information firefighters require, from the most important to the least important. A survey was carried out with firefighters working in the Norman Fire Department to obtain the importance of each building information item. The results show that “the location of exit doors, windows, corridors, elevators, and stairs”, “material of building elements”, and “building data” are the three most important items specified by firefighters. The results also implied that 2D models of the architectural, structural, and wayfinding information are more understandable than the 3D model, while a 3D model of the MEP system conveys more information than the 2D model. Furthermore, color in visualization can help firefighters understand the building information more easily and quickly. Sufficient internal consistency of all responses was demonstrated by developing the Pearson correlation matrix and obtaining a Cronbach’s alpha of 0.916. Therefore, the results of this study are reliable and can be generalized to the population.
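The internal-consistency check reported above (Cronbach’s alpha of 0.916) can be sketched as follows; the survey scores are invented for illustration, not the study’s data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of totals).

    `items` is a list of k item-score lists, each of length n (one score
    per respondent). Population variance is used consistently throughout.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical 1-5 importance ratings: 3 survey items x 4 respondents.
responses = [
    [4, 5, 3, 4],
    [4, 4, 3, 5],
    [5, 5, 4, 4],
]
alpha = cronbach_alpha(responses)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is why the study’s 0.916 supports the reliability claim.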

Keywords: BIM, building fire response, ranking, visualization

Procedia PDF Downloads 119
1594 Design and Advancement of Hybrid Multilevel Inverter Interfaced with Photovoltaics

Authors: P. Kiruthika, K. Ramani

Abstract:

This paper presents the design and advancement of a single-phase 27-level hybrid multilevel DC-AC converter interfaced with photovoltaics. In this context, the multicarrier pulse width modulation method is implemented in the 27-level hybrid multilevel inverter to generate the switching pulses. The Perturb & Observe algorithm is used in the maximum power point tracking method for the photovoltaic system, with three separate solar panels as the input source to the 27-level hybrid multilevel inverter. The proposed method is simulated using MATLAB/Simulink. The results show that the proposed method achieves smooth output waveforms, more flexibility in voltage range, and reduced total harmonic distortion in medium-voltage drives.
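The Perturb & Observe MPPT loop used above can be sketched in a few lines: perturb the duty cycle, observe the power, keep going in the same direction while power rises, and reverse when it falls. The PV power curve below is a toy stand-in (peak at duty 0.5), not a real panel model:

```python
def po_step(duty, power, prev_power, step=0.01, direction=1):
    """One Perturb & Observe iteration. Returns (new_duty, new_direction):
    reverse the perturbation direction whenever power dropped."""
    if power < prev_power:
        direction = -direction
    return duty + direction * step, direction

def pv_power(duty):
    """Toy PV curve with its maximum power point at duty = 0.5."""
    return 100 - (duty - 0.5) ** 2 * 400

duty, direction = 0.3, 1
prev_p = pv_power(duty)
for _ in range(100):
    p = pv_power(duty)
    duty, direction = po_step(duty, p, prev_p, direction=direction)
    prev_p = p
```

After a few dozen iterations the duty cycle climbs to the maximum power point and then oscillates around it by one step size, which is the characteristic steady-state behavior of P&O.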

Keywords: Multi Carrier Pulse Width Modulation Technique (MCPWM), Multi Level Inverter (MLI), Maximum Power Point Tracking (MPPT), Perturb and Observer (P&O)

Procedia PDF Downloads 563
1593 Maximum Power Point Tracking for Small Scale Wind Turbine Using Multilayer Perceptron Neural Network Implementation without Mechanical Sensor

Authors: Piyangkun Kukutapan, Siridech Boonsang

Abstract:

This article proposes maximum power point tracking without a mechanical sensor using a multilayer perceptron neural network (MLPNN). The aim is to reduce cost and complexity while retaining efficiency. The duty cycle that generates maximum power is identified when suitable conditions hold. The measured data from the DC generator, voltage (V), current (I), power (P), rate of change of power (dP), and rate of change of voltage (dV), are used as inputs to the MLPNN model. The output of this model is the duty cycle for driving the converter. The experiment was implemented on an Arduino Uno board. The approach is compared with MPPT using P&O (perturbation and observation) control. The experimental results show that the proposed MLPNN-based approach is more efficient than the P&O algorithm for this application.

Keywords: maximum power point tracking, multilayer perceptron neural network, optimal duty cycle, DC generator

Procedia PDF Downloads 310
1592 A Review of Encryption Algorithms Used in Cloud Computing

Authors: Derick M. Rakgoale, Topside E. Mathonsi, Vusumuzi Malele

Abstract:

Cloud computing offers distributed, online, on-demand computational services from anywhere in the world. Cloud computing services have grown immensely over the past years, especially in the past year due to the Coronavirus pandemic. Cloud computing has changed the working environment and introduced the work-from-home phenomenon, which enabled the adoption of technologies to support the new ways of working, including cloud service offerings. The increased adoption of cloud computing has come with new challenges regarding data privacy and integrity in the cloud environment. Previous advanced encryption algorithms failed to reduce the memory space required for cloud computing performance, thus increasing the computational cost. This paper reviews the existing encryption algorithms used in cloud computing. In future work, an artificial neural network (ANN) algorithm design will be presented as a security solution to ensure data integrity, confidentiality, privacy, and availability of user data in cloud computing. Moreover, MATLAB will be used to evaluate the proposed solution, and simulation results will be presented.

Keywords: cloud computing, data integrity, confidentiality, privacy, availability

Procedia PDF Downloads 104
1591 Electrochemical Synthesis of Copper(II) Coordination Polymers Using Rigid and Flexible Ligands

Authors: P. Mirahmadpour, M. H. Banitaba, D. Nematollahi

Abstract:

The chemistry of coordination polymers has grown exponentially in recent years, not only because of their interesting architectures but also due to their various technical applications in many fields, including ion exchange, chemical catalysis, small-molecule separations, and drug release. The use of bridging ligands for the controlled self-assembly of one-, two- or three-dimensional metallo-supramolecular species has been the subject of serious study in the last decade. Numerous synthetic methods have been offered for the preparation of coordination polymers, such as (a) diffusion from the gas phase, (b) slow diffusion of the reactants into a polymeric matrix, (c) evaporation of the solvent at ambient or reduced temperatures, (d) temperature-controlled cooling, (e) precipitation or recrystallisation from a mixture of solvents, and (f) hydrothermal synthesis. The electrosynthetic process offers several advantages over conventional approaches: in general, electrochemical synthesis proceeds under milder conditions than typical solvothermal or microwave synthesis. In this work, we introduce a simple electrochemical method for growing copper-based coordination polymers with a flexible 2,2’-thiodiacetic acid (TDA) ligand and a rigid 1,2,4,5-benzenetetracarboxylate (BTC) ligand. The structures of the coordination polymers were characterized by scanning electron microscopy (SEM), X-ray powder diffraction (XRD), elemental analysis, thermogravimetric analysis (TG), and differential thermal analysis (DTA). The single-crystal X-ray diffraction analysis revealed different conformations of the ligands, different coordination modes of the carboxylate group, and different coordination geometries of the copper atoms. Electrochemical synthesis of coordination polymers has further advantages, such as faster synthesis at lower temperature compared with conventional chemical methods, and crystallization of the desired materials in a single synthetic step.

Keywords: 1,2,4,5-benzenetetracarboxylate, coordination polymer, copper, 2,2’-thiodiacetic acid

Procedia PDF Downloads 189
1590 Importance-Performance Analysis of Volunteer Tourism in Ethiopia: Host and Guest Case Study

Authors: Zita Fomukong Andam

Abstract:

The general objective of this study is to evaluate the importance and performance attributes of volunteer tourism in Ethiopia; specifically, it ranks the importance of the attributes, evaluates Ethiopia's competitive performance in hosting volunteer tourists, lays the attributes out in a four-quadrant grid, and conducts an IPA iso-priority line comparison of volunteer tourism in Ethiopia. From the hosts' and guests' points of view, a deeper research discourse was conducted with a randomly selected 384 guests and 165 hosts in Ethiopia. Findings of the discourse, through an exploratory research design on both the hosts and the guests, confirm that attributes of volunteer tourism generally and marginally fall in the south-east quadrant of the matrix, where their importance is relatively higher than their performance, also referred to as the 'Concentrate Here' quadrant. The fact that, in both the host and guest studies, more items sit in this quadrant, where importance is high but relative performance is low, signals that the country has more to do. Another focus of this study is mapping the attribute scores for guest and host importance and performance against the iso-priority line. Results of the iso-priority line analysis of the IPA of volunteer tourism in Ethiopia from the host's perspective showed no attributes whose importance is exactly the same as their performance. Given that this research design has many characteristics of an exploratory study, its output is not confirmatory. This paper refrains from prescribing anything to the applied world before further confirmatory research is conducted on the issue, and rather calls on the scientific community to augment this study through comprehensive, exhaustive, extensive, and extended works of inquiry in order to deliver a refined set of recommendations to the applied world.
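The four-quadrant IPA grid described above classifies each attribute by comparing its importance and performance scores against the grand means. A minimal sketch, with invented attribute names and ratings (the study's own items are not listed in the abstract):

```python
# Hypothetical attribute ratings: name -> (importance, performance) on a 1-5 scale.
ratings = {
    "safety": (4.8, 3.1),
    "guides": (4.5, 4.4),
    "transport": (2.9, 4.2),
    "souvenirs": (2.5, 2.2),
}

def ipa_quadrants(ratings):
    """Split attributes into the four classic IPA quadrants, using the grand
    means of importance and performance as the crosshairs of the grid."""
    imp_mean = sum(i for i, _ in ratings.values()) / len(ratings)
    perf_mean = sum(p for _, p in ratings.values()) / len(ratings)
    quads = {"concentrate_here": [], "keep_up": [],
             "low_priority": [], "possible_overkill": []}
    for name, (imp, perf) in ratings.items():
        if imp >= imp_mean and perf < perf_mean:
            quads["concentrate_here"].append(name)   # high importance, low performance
        elif imp >= imp_mean:
            quads["keep_up"].append(name)            # high importance, high performance
        elif perf < perf_mean:
            quads["low_priority"].append(name)       # low importance, low performance
        else:
            quads["possible_overkill"].append(name)  # low importance, high performance
    return quads
```

Attributes landing in `concentrate_here` correspond to the study's south-east 'Concentrate Here' quadrant, where importance exceeds performance.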

Keywords: volunteer tourism, competitive performance, importance-performance analysis, Ethiopian tourism

Procedia PDF Downloads 206
1589 A New Reliability-Based Channel Allocation Model in Mobile Networks

Authors: Anujendra, Parag Kumar Guha Thakurta

Abstract:

Data transmission between mobile hosts and base stations (BSs) in mobile networks is often vulnerable to failure. Thus, efficient link connectivity, in terms of the services of both base stations and communication channels of the network, is required in wireless mobile networks to achieve highly reliable data transmission. In addition, the number of blocked hosts increases due to an insufficient number of channels during heavy load in the network. Under such a scenario, channels must be allocated accordingly to offer reliable communication at any given time. Therefore, a reliability-based channel allocation model with acceptable system performance is proposed as a multi-objective optimization (MOO) problem in this paper. Two conflicting parameters, the resource reuse factor (RRF) and the number of blocked calls, are optimized under a reliability constraint. The solution to this MOO problem is obtained through NSGA-II (Non-dominated Sorting Genetic Algorithm II). The effectiveness of the proposed model is shown with a set of experimental results.
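The core of NSGA-II's non-dominated sorting is Pareto dominance between candidate solutions. A minimal sketch of extracting the first (Pareto-optimal) front, with invented channel allocations encoded as (blocked calls, negated RRF) so both objectives are minimized:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective (minimisation)
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Return the non-dominated set: the first front NSGA-II would rank."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical allocations as (blocked_calls, -resource_reuse_factor);
# RRF is negated so that both objectives are minimised.
solutions = [(10, -0.8), (12, -0.9), (8, -0.5), (15, -0.4)]
front = pareto_front(solutions)
```

The allocation `(15, -0.4)` is dominated by `(8, -0.5)` (fewer blocked calls and higher reuse), so it is excluded from the front; the remaining three represent the blocked-calls/RRF trade-off the paper optimizes.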

Keywords: base station, channel, GA, pareto-optimal, reliability

Procedia PDF Downloads 387
1588 Stock Market Integration of Emerging Markets around the Global Financial Crisis: Trends and Explanatory Factors

Authors: Najlae Bendou, Jean-Jacques Lilti, Khalid Elbadraoui

Abstract:

In this paper, we examine stock market integration of emerging markets around the global financial turmoil of 2007-2008. Following Pukthuanthong and Roll (2009), we measure the integration of 46 emerging countries using the adjusted R-square from the regression of each country's daily index returns on global factors extracted from the covariance matrix computed using dollar-denominated daily index returns of 17 developed countries. Our sample surrounds the global financial crisis and ranges between 2000 and 2018. We analyze results using four cohorts of emerging countries: East Asia & Pacific and South Asia, Europe & Central Asia, Latin America & Caribbean, and Middle East & Africa. We find that the level of integration of emerging countries increases at the commencement of the crisis and during the booming phase of the business cycles. It reaches a maximum point in the middle of the crisis and then tends to revert to its pre-crisis level. This pattern tends to be common among the four geographic zones investigated in this study. Finally, we investigate the determinants of stock market integration of the emerging countries in our sample using panel regressions. Our results suggest that the degree of stock market integration of these countries should be put into perspective by some macroeconomic factors, such as the size of the equity market, school enrollment rate, international liquidity level, stocks traded volume, tax revenue level, and import and export volumes.
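The integration measure described above is the adjusted R-square of a returns regression on global factors. A minimal single-factor sketch (the paper uses several factors from a PCA of developed-market returns; the return series below are invented):

```python
def adjusted_r2(y, x):
    """Simple OLS of country returns y on one global factor x, returning the
    adjusted R-square used as an integration measure (Pukthuanthong and
    Roll, 2009). Adjustment: 1 - (1 - R^2) * (n - 1) / (n - p - 1), p = 1."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    alpha = my - beta * mx
    ss_res = sum((b - (alpha + beta * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    r2 = 1 - ss_res / ss_tot
    p = 1  # one regressor
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)
```

A country whose index returns track the global factor closely gets an adjusted R-square near 1 (high integration); idiosyncratic returns push it toward 0.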

Keywords: correlations, determinants of integration, diversification, emerging markets, financial crisis, integration, markets co-movement, panel regressions, r-square, stock markets

Procedia PDF Downloads 164
1587 Application of Association Rule Using Apriori Algorithm for Analysis of Industrial Accidents in 2013-2014 in Indonesia

Authors: Triano Nurhikmat

Abstract:

Along with the progress of science and technology, the industrialization of Indonesia has taken place very rapidly, leading to the establishment of diverse companies and workplaces. The development of industry is tied to workers' activities, and these work activities cannot rule out the possibility of an accident befalling either the workers or a construction project. Causes of industrial accidents include electrical damage, faulty work procedures, and technical errors. The association rule method is one of the main techniques in data mining and is the most common form used in finding patterns in data collections. This research investigates the association relationships between occurrences of industrial accidents. Using association rule analysis, patterns were obtained from two-iteration itemsets (large 2-itemsets) combining each accident factor with the region: industrial accidents in West Jakarta caused by the occurrence of electrical damage have support = 0.2 and confidence = 1, and the reverse pattern has support = 0.2 and confidence = 0.75.
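The support and confidence values quoted above are the two standard association-rule measures. A minimal sketch on invented accident records (the study's actual transaction data are not given in the abstract, so the numbers below differ from its reported values):

```python
from itertools import combinations

# Hypothetical accident records: each transaction lists region and cause factors.
transactions = [
    {"west_jakarta", "electrical_damage"},
    {"west_jakarta", "work_procedure"},
    {"east_jakarta", "electrical_damage"},
    {"west_jakarta", "electrical_damage", "work_procedure"},
    {"east_jakarta", "technical_error"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """P(consequent | antecedent) = support(A u C) / support(A)."""
    return support(antecedent | consequent) / support(antecedent)

# The Apriori "large 2-itemset" step: keep pairs meeting a minimum support.
items = set().union(*transactions)
frequent_pairs = [set(p) for p in combinations(sorted(items), 2)
                  if support(set(p)) >= 0.4]
```

On this toy data, {west_jakarta, electrical_damage} has support 0.4 and the rule west_jakarta → electrical_damage has confidence 2/3, the same kind of region/cause pattern the study reports.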

Keywords: association rule, data mining, industrial accidents, rules

Procedia PDF Downloads 272
1586 The Influence of Beta Shape Parameters in Project Planning

Authors: Alexios Kotsakis, Stefanos Katsavounis, Dimitra Alexiou

Abstract:

Networks can be utilized to represent project planning problems, using nodes for activities and arcs to indicate precedence relationships between them. For fixed activity durations, a simple algorithm calculates the amount of time required to complete a project, along with the activities that comprise the critical path. The Program Evaluation and Review Technique (PERT) generalizes the above model by incorporating uncertainty, allowing activity durations to be random variables, nevertheless producing a relatively crude solution to planning problems. In this paper, based on the findings of the relevant literature, which strongly suggests that a Beta distribution can be employed to model earthmoving activities, we utilize Monte Carlo simulation to estimate the project completion time distribution and measure the influence of skewness, an element inherent in activities of modern technical projects. We also extract the activity criticality index, with the ultimate goal of producing more accurate planning estimations.
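The Monte Carlo approach described above can be sketched on a toy network: sample each activity's duration from a scaled Beta distribution and record the completion time of the longest path. The network, Beta shape parameters, and duration ranges below are illustrative, not the paper's earthmoving data:

```python
import random

random.seed(42)

# Toy network: A precedes both B and C, so completion = A + max(B, C).
# Each activity: (alpha, beta, lo, hi) -> Beta(alpha, beta) scaled to [lo, hi].
activities = {
    "A": (2.0, 3.0, 4.0, 10.0),
    "B": (2.0, 5.0, 6.0, 14.0),
    "C": (4.0, 2.0, 5.0, 12.0),
}

def sample_duration(a, b, lo, hi):
    """One Beta-distributed duration, rescaled from (0, 1) to (lo, hi)."""
    return lo + random.betavariate(a, b) * (hi - lo)

def sample_completion():
    """One Monte Carlo realisation of the project completion time."""
    d = {k: sample_duration(*v) for k, v in activities.items()}
    return d["A"] + max(d["B"], d["C"])

runs = [sample_completion() for _ in range(10000)]
mean_completion = sum(runs) / len(runs)
```

Varying the alpha/beta shape parameters skews the individual duration distributions, and the empirical distribution of `runs` then shows how that skewness propagates into the completion-time distribution, which is the effect the paper measures.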

Keywords: beta distribution, PERT, Monte Carlo simulation, skewness, project completion time distribution

Procedia PDF Downloads 131
1585 Challenges of Sustainable Development of Small and Medium-Sized Enterprises in Georgia

Authors: Kharaishvili Eteri

Abstract:

The article highlights the importance of small and medium-sized enterprises in achieving the goals of sustainable economic development and increasing the well-being of the population. The opinion is put forward that it is necessary to adapt the activities of small and medium-sized firms in Georgia to sustainable business models. Therefore, it is important to identify the challenges that will ensure compliance with the goals and requirements of sustainable development of small and medium-sized enterprises. Objectives: The goal of the study is to reveal the challenges of sustainable development in small and medium-sized enterprises in Georgia and to develop recommendations for strategic development opportunities. Methodology: The challenges of sustainable development of small and medium-sized enterprises are investigated as follows: bibliographic research of scientific works and reports of organizations is carried out; based on a grouping of the sustainable development goals, the performance indicators of these goals are studied; differences with respect to the corresponding indicators of European countries are determined by the comparison method; a matrix scheme establishes the conditions and tools for sustainable development; and the challenges of sustainable development are identified by factor analysis.
Contributions: Trends in the sustainable development of small and medium-sized enterprises are studied from the point of view of economic, social, and environmental factors. To ensure sustainability, the conditions and tools for sustainable development are established (certified supply chains and global markets, allocation of the financial resources necessary for sustainable development, proper public procurement, a highly qualified workforce, etc.). Several main challenges have been identified in the sustainable development of small and medium-sized enterprises, including limited internal resources; institutional factors, especially vague and imperfect regulations and bureaucracy; a low level of investment; and a low level of qualification of human capital.

Keywords: small and medium-sized enterprises, sustainable development, conditions of sustainable development, strategic directions of sustainable development

Procedia PDF Downloads 76
1584 Modified Silicates as Dissolved Oxygen Sensors in Water: Structural and Optical Properties

Authors: Andile Mkhohlakali, Tien-Chien Jen, James Tshilongo, Happy Mabowa

Abstract:

Among different parameters, oxygen is one of the most important analytes of interest; dissolved oxygen (DO) concentration is crucial for various areas of physical, chemical, and environmental monitoring. Herein we report oxygen-sensitive luminophores based on lanthanum(III) trifluoromethanesulfonate: [La]³⁺ was encapsulated into a SiO₂-based xerogel matrix. The nanosensor is composed of organically modified silica nanoparticles doped with the luminescent oxygen-sensitive lanthanum(III) trifluoromethanesulfonate complex. The precursor materials used for the sensing film were tetraethyl orthosilicate (TEOS) and 3-mercaptopropyltriethoxysilane (MPTMS) for the SiO₂-based matrices. Brunauer-Emmett-Teller (BET) and BJH analyses indicate that the SiO₂ transformed from microporous to mesoporous upon the addition of the La³⁺ luminophore, with increased surface area (SBET). The typically amorphous SiO₂-based xerogels were revealed by X-ray diffraction (XRD) and selected area electron diffraction (SAED) analysis. Scanning electron microscopy (SEM) and transmission electron microscopy (TEM) showed the porous morphology and reduced particle size of the SiO₂ and La-SiO₂ xerogels, respectively. The presence of the elements and siloxane networks and the thermal stability of the xerogel were confirmed by energy dispersive spectroscopy (EDS), Fourier-transform infrared spectroscopy (FTIR), and thermogravimetric analysis (TGA). UV-Vis spectroscopy and photoluminescence (PL) were used to characterize the optical properties of the xerogels. La-SiO₂ demonstrates promising characteristics as an active sensing film for dissolved oxygen in water.

Keywords: sol-gel, ORMOSILs, luminophores quenching, O₂-sensing

Procedia PDF Downloads 106
1583 'Low Electronic Noise' Detector Technology in Computed Tomography

Authors: A. Ikhlef

Abstract:

Image noise in computed tomography is mainly caused by statistical noise, system noise, and reconstruction algorithm filters. In the last few years, low-dose x-ray imaging has become more and more desirable and is seen as a technically differentiating technology among CT manufacturers. In order to achieve this goal, several technologies and techniques are being investigated, including both hardware (integrated electronics and photon counting) and software (artificial intelligence and machine learning) based solutions. From a hardware point of view, electronic noise could indeed be a potential driver for low and ultra-low dose imaging. We demonstrated that the reduction or elimination of this term could lead to a reduction of dose without affecting image quality. Also, in this study, we will show that we can achieve this goal using conventional electronics (a low-cost and affordable technology), designed carefully and optimized for maximum detective quantum efficiency. We conducted the tests using large imaging objects such as 30 cm water and 43 cm polyethylene phantoms. We compared the image quality with conventional imaging protocols at radiation doses as low as 10 mAs (<< 1 mGy). Clinical validation of these results has been performed as well.

Keywords: computed tomography, electronic noise, scintillation detector, x-ray detector

Procedia PDF Downloads 104
1582 Inverse Problem Method for Microwave Intrabody Medical Imaging

Authors: J. Chamorro-Servent, S. Tassani, M. A. Gonzalez-Ballester, L. J. Roca, J. Romeu, O. Camara

Abstract:

Electromagnetic and microwave imaging (MWI) have been used in medical imaging in recent years, the most common applications being breast cancer and stroke detection or monitoring. In those applications, the subject or zone to observe is surrounded by a number of antennas, and the Nyquist criterion can be satisfied. Additionally, the space between the antennas (transmitting and receiving the electromagnetic fields) and the zone to study can be prepared as a homogeneous scenario. However, this may differ in other cases, such as intracardiac catheters, stomach monitoring devices, pelvic organ systems, liver ablation monitoring devices, or uterine fibroid ablation systems. In this work, we analyzed different MWI algorithms to find the most suitable method for dealing with an intrabody scenario. Due to the space limitations usually confronted in those applications, the device would have a cylindrical configuration of at most eight transmitter and eight receiver antennas. This, together with the positioning of the supposed device inside a body tract, imposes additional constraints on the choice of a reconstruction method; for instance, it precludes the use of well-known algorithms such as filtered backpropagation for diffraction tomography (due to the unusual configuration with probes enclosed by the imaging region). Finally, the difficulty of simulating a realistic non-homogeneous background inside the body (due to incomplete knowledge of the dielectric properties of the tissues between the antennas' position and the zone to observe) also prevents the use of the Born and Rytov algorithms, due to their limitations with a heterogeneous background. Instead, we decided to use a time-reversed algorithm (mostly used in geophysics) because it ignores heterogeneities in the background medium and focuses its generated field onto the scatterers.
Therefore, a 2D time-reversed finite difference time domain solver was developed based on the time-reversed approach for microwave breast cancer detection. Simultaneously, an in-silico testbed was developed to compare ground-truth dielectric properties with the corresponding microwave imaging reconstruction. Forward and inverse problems were computed varying: the frequency used, related to a small zone to observe (7, 7.5, and 8 GHz); a small polyp diameter (5, 7, and 10 mm); two polyp positions with respect to the closest antenna (aligned or misaligned); and the (transmitters-to-receivers) antenna combination used for the reconstruction (1-1, 8-1, 8-8, or 8-3). Results indicate that when using the existing time-reversed method for breast cancer detection here, for the different combinations of transmitters and receivers, we found false positives due to the high degrees of freedom and unusual configuration (and the possible violation of the Nyquist criterion). The false positives found in the 8-1 and 8-8 combinations were greatly reduced with the 1-1 and 8-3 combinations, the 8-3 configuration (three neighboring receivers at each time) being the most suitable. The 8-3 configuration creates a reduced region-of-interest problem, decreasing the ill-posedness of the inverse problem. To conclude, the proposed algorithm solves the main limitations of the described intrabody application, successfully detecting the angular position of targets inside the body tract.

Keywords: FDTD, time-reversed, medical imaging, microwave imaging

Procedia PDF Downloads 106
1581 Machine Learning Approach for Yield Prediction in Semiconductor Production

Authors: Heramb Somthankar, Anujoy Chakraborty

Abstract:

This paper presents a classification study on yield prediction in semiconductor production using machine learning approaches. A complicated semiconductor production process is generally monitored continuously by signals acquired from sensors and measurement sites. A monitoring system contains a variety of signals, all of which contain useful information, irrelevant information, and noise. When each signal is considered a feature, feature selection is used to find the most relevant signals. The open-source UCI SECOM dataset provides 1567 such samples, of which 104 fail quality assurance. Feature extraction and selection were performed on the dataset, and useful signals were considered for further study. Afterward, common machine learning algorithms were employed to predict whether a signal yields a pass or fail. The most suitable algorithm is selected for prediction based on the accuracy and loss of the ML model.
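The pipeline above (feature selection, then pass/fail classification) can be sketched minimally. The variance filter is a standard first selection pass, and the nearest-centroid classifier is a stand-in for the "common machine learning algorithms" the abstract leaves unspecified; the sensor matrix is invented, not SECOM data:

```python
# Hypothetical sensor matrix: rows are production runs, columns are signals;
# labels: 1 = pass, 0 = fail. Column 0 is constant (an uninformative signal).
X = [
    [0.1, 5.0, 3.2],
    [0.1, 4.8, 7.9],
    [0.1, 5.1, 3.0],
    [0.1, 4.9, 8.1],
]
y = [1, 0, 1, 0]

def variance(col):
    m = sum(col) / len(col)
    return sum((v - m) ** 2 for v in col) / len(col)

def select_features(X, min_var=1e-6):
    """Drop constant/near-constant signals: a first feature-selection pass."""
    cols = list(zip(*X))
    return [i for i, col in enumerate(cols) if variance(col) > min_var]

def nearest_centroid(X, y, x_new, features):
    """Tiny pass/fail classifier: assign x_new to the class whose centroid
    (over the selected signals only) is nearest in squared distance."""
    dists = {}
    for label in set(y):
        rows = [r for r, l in zip(X, y) if l == label]
        centroid = [sum(r[i] for r in rows) / len(rows) for i in features]
        dists[label] = sum((x_new[i] - c) ** 2 for i, c in zip(features, centroid))
    return min(dists, key=dists.get)

kept = select_features(X)
```

The constant signal in column 0 is discarded, and classification then runs only on the informative columns, mirroring the paper's select-then-classify flow.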

Keywords: deep learning, feature extraction, feature selection, machine learning classification algorithms, semiconductor production monitoring, signal processing, time-series analysis

Procedia PDF Downloads 94
1580 Multi-Model Super Ensemble Based Advanced Approaches for Monsoon Rainfall Prediction

Authors: Swati Bhomia, C. M. Kishtawal, Neeru Jaiswal

Abstract:

Traditionally, monsoon forecasts have encountered many difficulties that stem from numerous issues such as lack of adequate upper air observations, mesoscale nature of convection, proper resolution, radiative interactions, planetary boundary layer physics, mesoscale air-sea fluxes, representation of orography, etc. Uncertainties in any of these areas lead to large systematic errors. Global circulation models (GCMs), which are developed independently at different institutes, each of which carries somewhat different representation of the above processes, can be combined to reduce the collective local biases in space, time, and for different variables from different models. This is the basic concept behind the multi-model superensemble and comprises of a training and a forecast phase. The training phase learns from the recent past performances of models and is used to determine statistical weights from a least square minimization via a simple multiple regression. These weights are then used in the forecast phase. The superensemble forecasts carry the highest skill compared to simple ensemble mean, bias corrected ensemble mean and the best model out of the participating member models. This approach is a powerful post-processing method for the estimation of weather forecast parameters reducing the direct model output errors. Although it can be applied successfully to the continuous parameters like temperature, humidity, wind speed, mean sea level pressure etc., in this paper, this approach is applied to rainfall, a parameter quite difficult to handle with standard post-processing methods, due to its high temporal and spatial variability. 
The present study aims at the development of advanced superensemble schemes comprising 1-5 day daily precipitation forecasts from five state-of-the-art global circulation models (GCMs), i.e., the European Centre for Medium-Range Weather Forecasts (Europe), the National Center for Environmental Prediction (USA), the China Meteorological Administration (China), the Canadian Meteorological Centre (Canada), and the U.K. Meteorological Office (U.K.), obtained from the THORPEX Interactive Grand Global Ensemble (TIGGE), one of the most complete data sets available. The novel approaches include a dynamical model selection approach, in which the superior models are selected from the participating member models at each grid point and for each forecast step in the training period. A multi-model superensemble trained on similar conditions is also discussed in the present study; it is based on the assumption that training on similar types of conditions may provide better forecasts than the sequential training used in conventional multi-model ensemble (MME) approaches. Further, a variety of methods from the literature that incorporate a 'neighborhood' around each grid point, to allow for spatial error or uncertainty, have also been tested in combination with the above approaches. Comparison of these schemes against observations verifies that the newly developed approaches provide a more unified and skillful prediction of the summer monsoon (viz. June to September) rainfall than the conventional multi-model approach and the member models.
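The training-phase weight estimation described above can be sketched in a few lines. The observations, member forecasts, and resulting weights below are toy values for illustration only, not TIGGE data, and the simple regression is fit without the anomaly-removal step a full superensemble would use:

```python
import numpy as np

# Toy training data: observed rainfall and forecasts from 3 member models
obs = np.array([5.0, 12.0, 0.0, 8.0, 3.0])            # observations
fcst = np.array([[4.0, 11.0, 1.0, 7.0, 2.5],          # model 1
                 [6.0, 13.0, 0.5, 9.0, 4.0],          # model 2
                 [5.5, 10.0, 2.0, 6.5, 3.5]]).T       # model 3 -> shape (5, 3)

# Training phase: statistical weights from a least-squares minimization
# via multiple regression of observations on member forecasts
weights, *_ = np.linalg.lstsq(fcst, obs, rcond=None)

# Forecast phase: apply the trained weights to new member forecasts
new_members = np.array([7.0, 8.5, 7.5])
superensemble_forecast = new_members @ weights
```

The same weights would then be applied at every grid point in the forecast phase; in the dynamical model selection variant, only the superior members at each grid point enter the regression.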

Keywords: multi-model superensemble, dynamical model selection, similarity criteria, neighborhood technique, rainfall prediction

Procedia PDF Downloads 119
1579 Local Texture and Global Color Descriptors for Content Based Image Retrieval

Authors: Tajinder Kaur, Anu Bala

Abstract:

An image retrieval system is a computer system for browsing, searching, and retrieving images from a large database of digital images. A new algorithm for content-based image retrieval (CBIR) is presented in this paper. The proposed method combines color and texture features, which capture the global and local information of the image. The local texture feature is extracted using local binary patterns (LBP), which are evaluated by considering the local differences between the center pixel and its neighbors. For the global color feature, the color histogram (CH) is used, calculated for the R, G, and B (red, green, and blue) channels separately. The performance of the proposed method is tested on the Corel 1000 database of natural images. The results show a significant improvement in the evaluation measures compared to LBP and CH used alone.
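A minimal sketch of the basic LBP operator described above: each of the eight neighbors is thresholded against the center pixel and the resulting bits form an 8-bit code per pixel. The function name and the ≥ comparison convention are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def lbp_image(img):
    """Basic 8-neighbor LBP: threshold each neighbor against the center
    pixel and pack the results into an 8-bit code per interior pixel."""
    img = np.asarray(img, dtype=np.int32)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Neighbor offsets in clockwise order; index doubles as the bit weight
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    center = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        out |= (neighbor >= center).astype(np.uint8) << bit
    return out
```

A histogram of these codes over the image (or over local blocks) then serves as the texture descriptor, which the paper concatenates with per-channel color histograms.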

Keywords: color, texture, feature extraction, local binary patterns, image retrieval

Procedia PDF Downloads 340
1578 Alternative of Lead-Based Ionization Radiation Shielding Property: Epoxy-Based Composite Design

Authors: Md. Belal Uudin Rabbi, Sakib Al Montasir, Saifur Rahman, Niger Nahid, Esmail Hossain Emon

Abstract:

The practice of radiation shielding protects against the detrimental effects of ionizing radiation by inserting a shield of absorbing material between the radioactive source and whatever is to be protected. It is a primary concern in industrial fields that use potent (high-activity) radioisotopes, such as food preservation, cancer treatment, and particle accelerator facilities, and it is essential for users of radiation-emitting equipment to reduce or mitigate radiation damage. Polymer composites (especially epoxy-based ones) with high-atomic-number fillers can replace toxic lead in ionizing radiation shielding applications because of their excellent mechanical properties, superior solvent and chemical resistance, good dimensional stability, good adhesion, and lower toxicity. Being lightweight, with a neutron shielding ability of almost the same order as concrete, epoxy-based radiation shielding can be the next big thing. Micro- and nano-particle fillers in the epoxy resin increase the radiation shielding property of the epoxy matrix, and considerable attention has recently been paid to polymeric composites as radiation shielding materials. This research will examine the radiation shielding performance of epoxy-based nano-WO3 reinforced composites. The samples will be prepared using the direct pouring method.
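The depletion of radiation by an absorbing shield that the abstract describes follows, for a narrow photon beam, the standard exponential attenuation law I = I0·exp(-μx). The sketch below uses a hypothetical attenuation coefficient for illustration, not a measured value for the WO3-epoxy composites under study:

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Fraction of a narrow photon beam transmitted through a shield with
    linear attenuation coefficient mu (1/cm) and the given thickness."""
    return math.exp(-mu_per_cm * thickness_cm)

def thickness_for_attenuation(mu_per_cm, target_fraction):
    """Shield thickness needed to reduce the beam to target_fraction."""
    return -math.log(target_fraction) / mu_per_cm

# Hypothetical coefficient, chosen only to illustrate the calculation:
# the half-value layer is the thickness that halves the beam intensity
half_value_layer = thickness_for_attenuation(1.2, 0.5)
```

Comparing attenuation coefficients (or half-value layers) at fixed photon energy is one common way shielding composites such as these are benchmarked against lead.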

Keywords: radiation shielding materials, ionizing radiation, epoxy resin, tungsten oxide, polymer composites

Procedia PDF Downloads 90
1577 Clustering Based Level Set Evaluation for Low Contrast Images

Authors: Bikshalu Kalagadda, Srikanth Rangu

Abstract:

The main objective of image segmentation is to extract objects with respect to some input features. The level set method is one of the important methods for image segmentation. Medical images and synthetic images generally have low-contrast pixel profiles, which makes it difficult to locate the features of interest. Moreover, a conventional level set function develops irregularities during the evolution of the object contour, which destroys the stability of the evolution process. As a remedy for this problem, a new hybrid algorithm, Clustering Level Set Evolution, is proposed. Kernel fuzzy particle swarm optimization clustering is combined with the Distance Regularized Level Set (DRLS) and the Selective Binary and Gaussian Filtering Regularized Level Set (SBGFRLS) methods. The ability to identify different regions becomes easier, with improved speed. The efficiency of the modified method is evaluated by comparison with the previous methods under similar specifications, considering both medical and synthetic images.
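A minimal sketch of the contour evolution at the heart of level set methods (not the DRLS or SBGFRLS formulations used in the paper): the level set function is advanced by an explicit scheme for the motion equation phi_t = F * |grad(phi)|, which moves the zero-level contour along its normal at speed F. All names and parameter values here are illustrative:

```python
import numpy as np

def evolve_level_set(phi, speed, dt=0.1, steps=50):
    """Explicit update for phi_t = speed * |grad(phi)|; the zero level
    set of phi is the evolving contour."""
    for _ in range(steps):
        gy, gx = np.gradient(phi)
        phi = phi + dt * speed * np.sqrt(gx**2 + gy**2)
    return phi

# Signed-distance-like initialization: circle of radius 8 in a 32x32 grid
y, x = np.mgrid[0:32, 0:32]
phi0 = np.sqrt((x - 16.0)**2 + (y - 16.0)**2) - 8.0
phi = evolve_level_set(phi0, speed=-1.0)
```

Without a regularization term, phi drifts away from a signed distance function during evolution, which is exactly the irregularity problem that DRLS-style re-initialization-free schemes address.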

Keywords: segmentation, clustering, level set function, re-initialization, Kernel fuzzy, swarm optimization

Procedia PDF Downloads 334
1576 Rule-Of-Mixtures: Predicting the Bending Modulus of Unidirectional Fiber Reinforced Dental Composites

Authors: Niloofar Bahramian, Mohammad Atai, Mohammad Reza Naimi-Jamal

Abstract:

The rule of mixtures is a simple analytical model used to predict various properties of composites before design. The aim of this study was to demonstrate the benefits and limitations of the rule of mixtures (ROM) for predicting the bending modulus of continuous, unidirectional fiber reinforced composites used in dental applications. The composites were fabricated from a light-curing resin (with and without silica nanoparticles) and modified and non-modified fibers. Composite samples were divided into eight groups with ten specimens in each group. The bending modulus (flexural modulus) of the samples was determined from the slope of the initial linear region of the stress-strain curve on 2 mm × 2 mm × 25 mm specimens with different designs: fiber corona treatment time (0 s, 5 s, 7 s), fiber silane treatment (0 wt%, 2 wt%), fiber volume fraction (41%, 33%, 25%), and nanoparticle incorporation in the resin (0 wt%, 10 wt%, 15 wt%). To study the fiber-matrix interface after fracture, the single edge notch beam (SENB) method and scanning electron microscopy (SEM) were used. SEM was also used to show the nanoparticle dispersion in the resin. Experimental bending moduli for composites made with both physically (corona) and chemically (silane) treated fibers were in reasonable agreement with linear ROM estimates, but untreated or non-optimally treated fibers and poor nanoparticle dispersion did not correlate as well with the ROM results. This study shows that the ROM is useful for predicting the mechanical behavior of unidirectional dental composites, but the fiber-resin interface and the quality of nanoparticle dispersion play important roles in the accuracy of ROM predictions.
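The linear ROM estimate referred to above is the Voigt rule of mixtures for the longitudinal direction, E_c = Vf·Ef + (1 - Vf)·Em. The fiber and matrix moduli below are illustrative values only, not those of the study's materials:

```python
def rom_longitudinal_modulus(e_fiber, e_matrix, v_fiber):
    """Voigt rule of mixtures for the longitudinal modulus of a
    unidirectional fiber composite: E_c = Vf*Ef + (1 - Vf)*Em."""
    return v_fiber * e_fiber + (1.0 - v_fiber) * e_matrix

# Illustrative moduli in GPa, at the study's highest fiber fraction (41%)
e_composite = rom_longitudinal_modulus(e_fiber=70.0, e_matrix=3.5, v_fiber=0.41)
```

The model assumes perfect fiber-matrix bonding and equal strain in both phases, which is why poor interfaces or poor nanoparticle dispersion cause the measured modulus to fall short of the ROM line.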

Keywords: bending modulus, fiber reinforced composite, fiber treatment, rule-of-mixtures

Procedia PDF Downloads 259
1575 Investigating Potential Pest Management Strategies for Citrus Gall Wasp in Australia

Authors: M. Yazdani, J. F. Carragher

Abstract:

Citrus gall wasp (CGW), Bruchophagus fellis (Hym.: Eurytomidae), is an Australian native insect pest. CGW has now become a problem of national concern, threatening the viability of the entire Australian citrus industry. However, CGW appears to exhibit a preference for certain citrus species; growers report that grapefruit and lemons are most severely infested, with oranges and mandarins affected to a lesser extent. Given the specificity of the host plant-insect interactions, it is speculated that plant volatiles may play a significant role in host recognition. To address whether plant volatiles are involved in host plant preference by CGW, we tested the behavioral response of CGW to plants in a wind tunnel. The results showed that CGW had a significantly higher preference for grapefruit and lemon than for the other cultivars, and the least preference was recorded for mandarin (chi-square test, P<0.001). Because CGW exhibited a detectable choice, further studies were undertaken to identify the components of the volatiles from each species. We trapped the volatile chemicals emitted by a 30 cm tip of each plant onto a solid Porapak matrix. Eluted extracts were then analysed by gas chromatography-mass spectrometry (GC-MS), and the presumptive identities of the major compounds from each species were inferred from the MS library. Although the same major compounds existed in all of the cultivars, their relative ratios differed between species. Next, we will validate the identities of the key volatiles using authentic standards and establish their ability to elicit olfactory responses in CGW in wind tunnel and field experiments. Identification of the semiochemicals involved in host location by CGW is of interest not only from an ecological perspective but also for the development of novel pest control strategies.

Keywords: Citrus gall wasp, Bruchophagus fellis, volatiles, semiochemicals, IPM

Procedia PDF Downloads 211
1574 Spatial-Temporal Awareness Approach for Extensive Re-Identification

Authors: Tyng-Rong Roan, Fuji Foo, Wenwey Hseush

Abstract:

Recent developments in AI and edge computing play a critical role in capturing meaningful events such as the detection of an unattended bag. One of the core problems is re-identification across multiple CCTVs. The task immediately following the detection of a meaningful event is to track and trace the objects related to the event. In an extensive environment, the challenge becomes severe when the number of CCTVs increases substantially, imposing difficulties in achieving high accuracy while maintaining real-time performance. The algorithm that re-identifies cross-boundary objects for extensive tracking is referred to as Extensive Re-Identification, which emphasizes the issues arising from the complexity of a great number of CCTVs. The Spatial-Temporal Awareness approach challenges the conventional thinking and concept of operations, which is labor-intensive and time-consuming. The ability to perform Extensive Re-Identification through a multi-sensory network provides next-level insights, creating value beyond traditional risk management.

Keywords: long-short-term memory, re-identification, security critical application, spatial-temporal awareness

Procedia PDF Downloads 97
1573 Optimal Bayesian Chart for Controlling Expected Number of Defects in Production Processes

Authors: V. Makis, L. Jafari

Abstract:

In this paper, we develop an optimal Bayesian chart to control the expected number of defects per inspection unit in production processes with long production runs. We formulate this control problem in the optimal stopping framework. The objective is to determine the optimal stopping rule minimizing the long-run expected average cost per unit time considering partial information obtained from the process sampling at regular epochs. We prove the optimality of the control limit policy, i.e., the process is stopped and the search for assignable causes is initiated when the posterior probability that the process is out of control exceeds a control limit. An algorithm in the semi-Markov decision process framework is developed to calculate the optimal control limit and the corresponding average cost. Numerical examples are presented to illustrate the developed optimal control chart and to compare it with the traditional u-chart.
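The control-limit policy proved optimal above can be sketched as a posterior update plus a stopping rule. The shift probability, sampling likelihoods, and control limit below are illustrative assumptions, not the optimal values that the paper's semi-Markov decision process algorithm would compute:

```python
def posterior_out_of_control(prior, p_shift, lik_in, lik_out):
    """Bayes update of the probability that the process is out of control,
    allowing a shift to occur between sampling epochs with prob. p_shift."""
    prior = prior + (1.0 - prior) * p_shift          # chance of a new shift
    num = prior * lik_out
    return num / (num + (1.0 - prior) * lik_in)

def run_chart(likelihood_pairs, p_shift=0.02, limit=0.9):
    """Monitor until the posterior exceeds the control limit, then signal
    that the search for assignable causes should be initiated."""
    post = 0.0
    for t, (lik_in, lik_out) in enumerate(likelihood_pairs, start=1):
        post = posterior_out_of_control(post, p_shift, lik_in, lik_out)
        if post > limit:
            return t, post                            # stopping epoch, posterior
    return None, post
```

Here `lik_in` and `lik_out` are the likelihoods of each sample's defect count under the in-control and out-of-control defect rates; the paper's contribution is choosing `limit` to minimize the long-run expected average cost.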

Keywords: Bayesian u-chart, economic design, optimal stopping, semi-Markov decision process, statistical process control

Procedia PDF Downloads 555
1572 Green Design Study of Prefabricated Community Control Measures in Response to Public Health Emergencies

Authors: Enjia Zhang

Abstract:

During the prevention and control of the COVID-19 pandemic, all communities in China were gated and under strict management, which was highly effective in preventing the epidemic from spreading. Based on TRIZ theory, this paper proposes green design strategies for community control in response to public health emergencies and optimizes community control facilities according to the principle of minimum transformation. Through a questionnaire survey, this paper investigates and summarizes the situation and problems of community control during the COVID-19 pandemic. Based on these problems, TRIZ theory is introduced to analyze them and associate them with prefabricated facilities. Afterward, the innovation points and solutions of prefabricated community control measures are proposed by using the contradiction matrix. This paper summarizes the current situation of community control under public health emergencies and identifies problems such as the simple forms of temporary roadblocks, the sudden increase in community traffic pressure, and difficulties in accessing public spaces. The importance of entrance and exit control in community control is emphasized. Therefore, community control measures should focus on traffic control: external access control measures, covering motor vehicles, non-motor vehicles, residents, and non-residents, and internal public space access control measures, covering public spaces shared with society or with adjacent communities, are proposed so that communities can keep their open character while retaining the flexibility to deal with sudden public health emergencies in the future.

Keywords: green design, community control, prefabricated structure, public health emergency

Procedia PDF Downloads 108
1571 Oil Reservoir Asphaltene Precipitation Estimation during CO2 Injection

Authors: I. Alhajri, G. Zahedi, R. Alazmi, A. Akbari

Abstract:

In this paper, an artificial neural network (ANN) was developed to predict asphaltene precipitation (AP) during the injection of carbon dioxide into crude oil reservoirs. In this study, experimental data from six different oil fields were collected. Seventy percent of the data was used to develop the ANN model, and different ANN architectures were examined. A network with the trainlm training algorithm was found to be the best network for estimating the AP. To check the validity of the proposed model, the model was used to predict the AP for the remaining thirty percent of the data, which was withheld from training. The mean square error (MSE) of the prediction was 0.0018, which confirms the excellent prediction capability of the proposed model. In the second part of this study, the ANN model predictions were compared with modified Hirschberg model predictions. The ANN was found to provide more accurate estimates than the modified Hirschberg model. Finally, the proposed model was employed to examine the effect of different operating parameters on the AP during gas injection. It was found that the AP is most sensitive to the reservoir temperature. Furthermore, increasing the carbon dioxide concentration in the liquid phase increases the AP.
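A minimal sketch of the 70/30 workflow on synthetic stand-in data (the field data are not public). Plain gradient descent on a one-hidden-layer network stands in for MATLAB's Levenberg-Marquardt (trainlm) training; the input columns and target are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: (temperature, pressure, CO2 concentration) -> AP
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = 0.6 * X[:, 0] + 0.3 * X[:, 2]

# 70/30 split, mirroring the study's development/validation protocol
n_train = 140
X_tr, y_tr = X[:n_train], y[:n_train]
X_te, y_te = X[n_train:], y[n_train:]

# One hidden layer of 8 tanh units, trained by full-batch gradient descent
W1 = rng.normal(0.0, 0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    H = np.tanh(X_tr @ W1 + b1)                 # hidden activations
    pred = (H @ W2 + b2).ravel()
    err = pred - y_tr                           # gradient of MSE w.r.t. pred
    gW2 = H.T @ err[:, None] / n_train
    gb2 = err.mean(keepdims=True)
    dH = (err[:, None] @ W2.T) * (1 - H**2)     # backprop through tanh
    gW1 = X_tr.T @ dH / n_train
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = np.mean(((np.tanh(X_te @ W1 + b1) @ W2 + b2).ravel() - y_te) ** 2)
```

The held-out MSE plays the role of the paper's 0.0018 validation figure; architecture comparison would repeat this loop over different hidden-layer sizes.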

Keywords: artificial neural network, asphaltene, CO2 injection, Hirschberg model, oil reservoirs

Procedia PDF Downloads 352
1570 Research on Knowledge Graph Inference Technology Based on Proximal Policy Optimization

Authors: Yihao Kuang, Bowen Ding

Abstract:

With the increasing scale and complexity of knowledge graphs, modern knowledge graphs contain more and more types of entity, relationship, and attribute information. Therefore, in recent years, it has been a trend for knowledge graph inference to use reinforcement learning to deal with large-scale, incomplete, and noisy knowledge graphs and to improve the inference effect and interpretability. The Proximal Policy Optimization (PPO) algorithm takes a proximal, near-policy optimization approach: it allows extensive updates of the policy parameters while constraining the size of each update to maintain training stability. This characteristic enables PPO to converge to improved strategies more rapidly, often demonstrating enhanced performance early in the training process. Furthermore, PPO reuses each batch of sampled experience over several optimization epochs, making effective use of historical experience data during training and enhancing sample utilization. This means that even with limited resources, PPO can train efficiently on reinforcement learning tasks. Based on these characteristics, this paper aims to obtain a better and more efficient inference effect by introducing PPO into knowledge inference technology.
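The mechanism that keeps each update "proximal" is PPO's clipped surrogate objective, sketched below as a generic illustration (not the paper's knowledge-graph agent): the probability ratio between the new and old policies is clipped to [1 - eps, 1 + eps], which bounds how far one update can move the policy from the one that sampled the data:

```python
import numpy as np

def ppo_clip_objective(logp_new, logp_old, advantages, eps=0.2):
    """PPO clipped surrogate objective (to be maximized). Clipping the
    probability ratio removes the incentive to move the new policy more
    than a factor of (1 +/- eps) away from the sampling policy."""
    ratio = np.exp(logp_new - logp_old)
    unclipped = ratio * advantages
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantages
    return np.mean(np.minimum(unclipped, clipped))
```

Because the objective is flat once the ratio leaves the clip range, the same sampled batch can safely be reused for several gradient epochs, which is the source of PPO's sample-efficiency advantage noted above.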

Keywords: reinforcement learning, PPO, knowledge inference

Procedia PDF Downloads 216
1569 Development of Computational Approach for Calculation of Hydrogen Solubility in Hydrocarbons for Treatment of Petroleum

Authors: Abdulrahman Sumayli, Saad M. AlShahrani

Abstract:

For the hydrogenation process, knowing the solubility of hydrogen (H2) in hydrocarbons is critical to improving the efficiency of the process. We investigated the computation of H2 solubility in four heavy crude oil feedstocks using machine learning techniques. Temperature, pressure, and feedstock type were considered as the inputs to the models, while the hydrogen solubility was the sole response. Specifically, we employed three different models: support vector regression (SVR), Gaussian process regression (GPR), and Bayesian ridge regression (BRR). To achieve the best performance, the hyper-parameters of these models were optimized using the whale optimization algorithm (WOA). We evaluated the models on a dataset of solubility measurements in various feedstocks and compared their performance based on several metrics. Our results show that the SVR model tuned with WOA achieves the best performance overall, with an RMSE of 1.38 × 10⁻² and an R-squared of 0.991. These findings suggest that machine learning techniques can provide accurate predictions of hydrogen solubility in different feedstocks, which could be useful in the development of hydrogen-related technologies. In addition, the solubility of hydrogen in the four heavy oil fractions is estimated over temperature and pressure ranges of 150 °C–350 °C and 1.2 MPa–10.8 MPa, respectively.
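A simplified sketch of the whale optimization algorithm used for the hyper-parameter search, minimizing a toy 2-D objective that stands in for the cross-validated SVR error over (log C, log gamma). The encircling and bubble-net spiral updates follow the standard WOA equations, with the exploration test applied element-wise as a simplification:

```python
import numpy as np

def woa_minimize(f, lo, hi, n_whales=20, n_iter=100, seed=0):
    """Simplified whale optimization algorithm: encircling the best
    solution, bubble-net spiral, and random-whale exploration."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(n_whales, len(lo)))
    fitness = np.array([f(x) for x in X])
    best = X[fitness.argmin()].copy()
    for t in range(n_iter):
        a = 2.0 - 2.0 * t / n_iter                 # decreases linearly 2 -> 0
        for i in range(n_whales):
            A = 2 * a * rng.random(len(lo)) - a
            C = 2 * rng.random(len(lo))
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):          # exploit: encircle the best
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                              # explore: follow a random whale
                    x_rand = X[rng.integers(n_whales)]
                    X[i] = x_rand - A * np.abs(C * x_rand - X[i])
            else:                                  # bubble-net spiral
                l = rng.uniform(-1.0, 1.0)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
        fitness = np.array([f(x) for x in X])
        if fitness.min() < f(best):
            best = X[fitness.argmin()].copy()
    return best, f(best)

# Toy stand-in for the CV error surface over two hyper-parameters
objective = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2
best, val = woa_minimize(objective, np.array([-3.0, -3.0]), np.array([3.0, 3.0]))
```

In the actual workflow, `f` would train an SVR (or GPR/BRR) with the candidate hyper-parameters and return its cross-validated RMSE.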

Keywords: temperature, pressure variations, machine learning, oil treatment

Procedia PDF Downloads 53