Search results for: modified simplex algorithm
4125 Design of Non-uniform Circular Antenna Arrays Using Firefly Algorithm for Side Lobe Level Reduction
Authors: Gopi Ram, Durbadal Mandal, Rajib Kar, Sakti Prasad Ghoshal
Abstract:
A design problem of non-uniform circular antenna arrays for maximum reduction of both the side lobe level (SLL) and the first null beam width (FNBW) is dealt with. The problem is modeled as a simple optimization problem. The Firefly algorithm (FFA) is used to determine an optimal set of current excitation weights and antenna inter-element separations that provide a radiation pattern with maximum SLL reduction and a marked improvement in FNBW as well. A circular antenna array lying in the x-y plane is assumed. FFA is applied to circular arrays of 8, 10, and 12 elements. Various simulation results are presented, and the SLL and FNBW performances are analyzed. Experimental results show considerable reductions of both SLL and FNBW with respect to the uniform case and to standard algorithms (GA, PSO, and SA) applied to the same problem.
Keywords: circular arrays, first null beam width, side lobe level, FFA
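The firefly update rule the abstract relies on can be sketched as follows. This is a minimal illustration with a toy objective standing in for the actual SLL/FNBW array-factor cost; the function name, population size, and all parameter values are assumptions, not taken from the paper.

```python
import math
import random

def firefly_minimize(cost, dim, n=15, iters=60, beta0=1.0, gamma=1.0, alpha=0.2, seed=1):
    """Minimal firefly algorithm: brighter (lower-cost) fireflies attract dimmer ones."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        costs = [cost(x) for x in pop]
        for i in range(n):
            for j in range(n):
                if costs[j] < costs[i]:  # firefly j is brighter: move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
                    pop[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    costs[i] = cost(pop[i])
    best = min(pop, key=cost)
    return best, cost(best)

# Toy stand-in for the SLL+FNBW objective: a simple sphere function.
best, val = firefly_minimize(lambda x: sum(v * v for v in x), dim=4)
```

In the paper's setting, the decision vector would hold the excitation weights and inter-element separations, and the cost would evaluate the array factor's SLL and FNBW.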
Procedia PDF Downloads 260
4124 A Novel Algorithm for Parsing IFC Models
Authors: Raninder Kaur Dhillon, Mayur Jethwa, Hardeep Singh Rai
Abstract:
Information technology has made pivotal progress across disparate disciplines, one of which is the AEC (Architecture, Engineering and Construction) industry. CAD is a form of computer-aided building modeling that architects, engineers and contractors use to create and view two- and three-dimensional models. The AEC industry also uses building information modeling (BIM), a newer computerized modeling system that can create four-dimensional models; this software can greatly increase productivity in the AEC industry. BIM models generate open IFC (Industry Foundation Classes) files, which aim at interoperability for exchanging information throughout the project lifecycle among various disciplines. Methods developed in previous studies require either an IFC schema or an MVD together with software applications, such as an IFC model server or a Building Information Modeling (BIM) authoring tool, to extract a partial or complete IFC instance model. This paper proposes an efficient algorithm for extracting a partial or total model from an Industry Foundation Classes (IFC) instance model without an IFC schema or a complete IFC model view definition (MVD).
Procedia PDF Downloads 300
4123 Text Analysis to Support Structuring and Modelling a Public Policy Problem: Outline of an Algorithm to Extract Inferences from Textual Data
Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis
Abstract:
Policy-making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics is therefore key to finding holistic solutions. Analysis of text-based information on the policy problem, using Natural Language Processing (NLP) and text analysis techniques, can support modelling of public policy problem situations in a more objective way, based on domain experts' knowledge and scientific evidence. The objective of this study is to support modelling of public policy problem situations using text analysis of verbal descriptions of the problem. We propose a formal methodology for analysing qualitative data from multiple information sources on a policy problem to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study describes the outline of an algorithm used to automate the initial step of a larger methodological approach, which has so far been done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support better problem structuring. A small prototype for this step is also presented.
Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction
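The initial inference-extraction step could be illustrated, in its crudest form, by matching causal cue phrases in sentences. This pattern-based sketch is a hypothetical simplification; the paper's actual NLP pipeline is not specified here, and the cue list is an assumption.

```python
import re

# Toy cue patterns for cause-effect statements; a real pipeline would use
# NLP parsing rather than surface patterns.
PATTERNS = [
    re.compile(r"(?P<cause>[\w\s]+?)\s+(?:leads to|causes|increases|reduces)\s+(?P<effect>[\w\s]+)",
               re.IGNORECASE),
]

def extract_inferences(text):
    """Return (cause, effect) variable pairs found in the text."""
    pairs = []
    for sentence in re.split(r"[.!?]", text):
        for pat in PATTERNS:
            m = pat.search(sentence.strip())
            if m:
                pairs.append((m.group("cause").strip(), m.group("effect").strip()))
    return pairs

edges = extract_inferences("Unemployment increases poverty. Poverty leads to crime.")
```

The extracted pairs would form the edges of the causal diagram described in the abstract.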
Procedia PDF Downloads 590
4122 Fluorescence Effect of Carbon Dots Modified with Silver Nanoparticles
Authors: Anna Piasek, Anna Szymkiewicz, Gabriela Wiktor, Jolanta Pulit-Prociak, Marcin Banach
Abstract:
Carbon dots (CDs) have great potential for application in many fields of science. They are characterized by fluorescent properties that can be manipulated. Besides its unique properties, the nanomaterial has many advantages: CDs are easy to obtain, and they undergo surface functionalization in a simple way. In addition, a wide range of raw materials can be used for their synthesis; an interesting possibility is the use of numerous waste materials of natural origin. In the research presented here, the synthesis of CDs was carried out according to the principles of green chemistry. Beet molasses was used as the natural raw material; its high sugar content makes it an excellent high-carbon precursor for obtaining CDs. To increase the fluorescence effect, we modified the surface of the CDs with silver nanoparticles (Ag-CDs). The CDs were obtained by the hydrothermal method using microwave radiation, and the silver nanoparticles were formed via chemical reduction. The synthesis plans were developed using the Design of Experiments (DoE) method. Variable process parameters, such as the concentration of beet molasses, the temperature and the concentration of nanosilver, were used in these syntheses; they affected the properties and parameters of the obtained particles. The Ag-CDs were analyzed by UV-Vis spectroscopy. The fluorescence properties and the selection of the appropriate excitation wavelength were determined by spectrofluorimetry. Particle sizes were checked using the DLS method. The influence of the input parameters on the obtained results was also studied.
Keywords: fluorescence, modification, nanosilver, molasses, green chemistry, carbon dots
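The DoE-based synthesis planning mentioned above can be sketched as a full-factorial enumeration of the three varied parameters. The factor names and levels below are hypothetical illustrations, not the study's actual values.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate a full-factorial design: every combination of factor levels."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

# Illustrative two-level settings for the three varied process parameters.
runs = full_factorial({
    "molasses_pct": [5, 10],
    "temp_C": [160, 200],
    "nanosilver_mM": [0.5, 1.0],
})
```

Each entry of `runs` is one synthesis experiment; fractional or response-surface designs would reduce the run count for more factors.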
Procedia PDF Downloads 85
4121 Public Participation in Science: The Case of Genetic Modified Organisms in Brazil
Authors: Maria Luisa Nozawa Ribeiro, Maria Teresa Miceli Kerbauy
Abstract:
This paper presents theories of public participation in order to understand the context of public GMO (Genetically Modified Organisms) policies in Brazil, highlighting the characteristics of their configuration and the dialogue with experts. As a controversial subject, the commercialization of GMOs provoked manifestations by popular and environmental representative groups questioning the decisions of policy makers and experts on the matter. Many aspects and consequences of planting and consuming these crops emerged, and the safety of the technology was questioned. Environmentalists, civil rights movements, representatives of rural workers, farmers, organic producers and others demonstrated their points of view, also sustained by experts from the medical, genetic, environmental and agronomic sciences, among other fields. Despite this movement, the precautionary principle (risk management), implemented in 1987, suggested precaution in the face of new technologies and innovations in a society pursuing sustainable development. This principle influenced much legislation and regulation on GMOs around the world, including in Brazil, which became a reference among the world's GMO regulatory systems. Brazilian legislation ensures citizen participation in the GMO discussion, a characteristic that was important for establishing the connection between the subject and participation theory. These deliberation spaces materialized in Brazil through 'Public Audiences', which are managed by the National Biosafety Technical Commission (CTNBio), the body responsible for controlling the research, production and commercialization of GMOs in Brazil.
Keywords: public engagement, public participation, science and technology studies, transgenic politics
Procedia PDF Downloads 306
4120 Structural Invertibility and Optimal Sensor Node Placement for Error and Input Reconstruction in Dynamic Systems
Authors: Maik Kschischo, Dominik Kahl, Philipp Wendland, Andreas Weber
Abstract:
Understanding and modelling of real-world complex dynamic systems in biology, engineering and other fields is often made difficult by incomplete knowledge about the interactions between system states and by unknown disturbances to the system. In fact, most real-world dynamic networks are open systems receiving unknown inputs from their environment. To understand a system and to estimate the state dynamics, these inputs need to be reconstructed from output measurements. Reconstructing the input of a dynamic system from its measured outputs is an ill-posed problem if only a limited number of states is directly measurable. A first requirement for solving this problem is the invertibility of the input-output map. In our work, we exploit the fact that invertibility of a dynamic system is a structural property, which depends only on the network topology. Therefore, it is possible to check for invertibility using a structural invertibility algorithm which counts the number of node-disjoint paths linking inputs and outputs. The algorithm is efficient even for large networks of up to a million nodes. To understand the structural features influencing the invertibility of a complex dynamic network, we analyze synthetic and real networks using the structural invertibility algorithm. We find that invertibility largely depends on the degree distribution and that dense random networks are easier to invert than sparse inhomogeneous networks. We show that real networks are often very difficult to invert unless the sensor nodes are carefully chosen. To overcome this problem, we present a sensor node placement algorithm that achieves invertibility with a minimum set of measured states. This greedy algorithm is very fast and is also guaranteed to find an optimal sensor node set if one exists. Our results provide a practical approach to experimental design for open dynamic systems.
Since invertibility is a necessary condition for unknown input observers and data assimilation filters to work, it can be used as a preprocessing step to check whether these input reconstruction algorithms can be successful. If not, we can suggest additional measurements providing sufficient information for input reconstruction. Invertibility is also important for systems design and model building. Dynamic models are always incomplete, and synthetic systems act in an environment where they receive inputs or even attack signals from their exterior. Being able to monitor these inputs is an important design requirement, which can be achieved by our algorithms for invertibility analysis and sensor node placement.
Keywords: data-driven dynamic systems, inversion of dynamic systems, observability, experimental design, sensor node placement
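The node-disjoint path count at the heart of the structural invertibility check can be sketched via the standard reduction to unit-capacity maximum flow with node splitting. This is a generic textbook construction, not the paper's specific implementation.

```python
from collections import defaultdict, deque

def vertex_disjoint_paths(edges, sources, sinks):
    """Count node-disjoint paths from inputs to outputs via unit-capacity max-flow.
    Each node v is split into (v,'in')->(v,'out') with capacity 1 so that
    no intermediate node is shared between paths."""
    cap = defaultdict(int)
    adj = defaultdict(set)

    def add(u, v, c):
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)  # residual direction

    for v in {x for e in edges for x in e} | set(sources) | set(sinks):
        add((v, 'in'), (v, 'out'), 1)
    for u, v in edges:
        add((u, 'out'), (v, 'in'), 1)
    S, T = ('S', 'out'), ('T', 'in')
    for s in sources:
        add(S, (s, 'in'), 1)
    for t in sinks:
        add((t, 'out'), T, 1)

    flow = 0
    while True:
        # BFS for an augmenting path (Edmonds-Karp style)
        prev = {S: None}
        q = deque([S])
        while q and T not in prev:
            u = q.popleft()
            for v in adj[u]:
                if v not in prev and cap[(u, v)] > 0:
                    prev[v] = u
                    q.append(v)
        if T not in prev:
            return flow
        v = T
        while prev[v] is not None:  # push one unit of flow along the path
            u = prev[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1

# Two inputs (1, 2) and two outputs (3, 6): paths 1->3 and 2->5->6 are disjoint.
n = vertex_disjoint_paths([(1, 3), (2, 3), (1, 4), (2, 5), (4, 6), (5, 6)],
                          sources=[1, 2], sinks=[3, 6])
```

Invertibility then reduces to checking whether the count equals the number of inputs.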
Procedia PDF Downloads 152
4119 Flashover Detection Algorithm Based on Mother Function
Authors: John A. Morales, Guillermo Guidi, B. M. Keune
Abstract:
Electric power supply is a crucial topic for economic and social development. Power outage statistics show that atmospheric discharges are a major cause of those outages. In this context, it is necessary to correctly detect when overhead line insulators are faulted. In this paper, an algorithm to detect whether or not a lightning stroke generates a permanent fault on an insulator string is proposed. Lightning stroke simulations developed using the Alternative Transients Program are used. Based on these insights, a novel approach is designed that depends on the analysis of mother functions corresponding to the given variance-covariance matrix. Signals registered at the insulator string are projected onto the corresponding axes by means of Principal Component Analysis. By exploiting these new axes, it is possible to determine a flashover characteristic zone useful for good insulation design. The proposed methodology for flashover detection extends the existing approaches for the analysis and study of lightning performance on transmission lines.
Keywords: mother function, outages, lightning, sensitivity analysis
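The PCA projection step described above can be illustrated in two dimensions, where the principal axes of the variance-covariance matrix have a closed form. The toy data below stands in for the simulated insulator-string signals; the actual signal processing in the paper is not reproduced here.

```python
import math

def principal_axes_2d(samples):
    """Return the eigenvalues of the 2x2 variance-covariance matrix and the
    orientation of the principal axis onto which signals would be projected."""
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    sxx = sum((x - mx) ** 2 for x, _ in samples) / n
    syy = sum((y - my) ** 2 for _, y in samples) / n
    sxy = sum((x - mx) * (y - my) for x, y in samples) / n
    # Closed-form eigenvalues of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # principal-axis orientation
    return (l1, l2), theta

# Perfectly correlated toy data lying on the line y = x.
(lam1, lam2), angle = principal_axes_2d([(0, 0), (1, 1), (2, 2), (3, 3)])
```

A flashover characteristic zone would then be delimited in the coordinate system defined by these axes.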
Procedia PDF Downloads 588
4118 Modified Acetamidobenzoxazolone Based Biomarker for Translocator Protein Mapping during Neuroinflammation
Authors: Anjani Kumar Tiwari, Neelam Kumari, Anil Mishra
Abstract:
The 18-kDa translocator protein (TSPO), previously known as the peripheral benzodiazepine receptor, is a proven biomarker for a variety of neuroinflammatory conditions. TSPO is a tryptophan-rich, five-transmembrane protein found on the outer mitochondrial membrane of steroid-synthesising and immunomodulatory cells. In case of neuronal damage or inflammation, the expression level of TSPO is upregulated as an immunomodulatory response. Using benzoxazolone as a basic scaffold, a series of TSPO ligands was designed and screened through in silico studies; screening was performed on the basis of the GScore of the CADD-based Schrodinger software. Synthesis was planned by employing a convergent methodology in six high-yielding steps. All the modified, promising compounds were successfully synthesised and characterised by spectroscopic techniques (FTIR, NMR and HRMS). For the synthesised ligands, an in vitro assay was performed to determine the binding affinity in terms of Ki; the best binding affinity was Ki = 6.1 ± 0.3 for TSPO versus Ki > 200 for the central benzodiazepine receptor (CBR). Autoradiography (ARG) studies on ischemic rat brain were also carried out to check the specificity and affinity of the designed radiolabelled ligand for TSPO; they indicated higher uptake of two analogues on the lesion side compared with the non-lesion side. Displacement experiments with the unlabelled ligand minimised the difference in uptake between the two sides, which indicates the specificity of the ligand towards the TSPO receptor.
Keywords: TSPO, PET, imaging, acetamidobenzoxazolone
Procedia PDF Downloads 143
4117 Modified Model-Based Systems Engineering Driven Approach for Defining Complex Energy Systems
Authors: Akshay S. Dalvi, Hazim El-Mounayri
Abstract:
The internal and external interactions between the complex structural and behavioral characteristics of a complex energy system result in unpredictable emergent behaviors. These emergent behaviors are not well understood, especially when modeled using the traditional top-down systems engineering approach. The intrinsic nature of current complex energy systems calls for an elegant solution that provides an integrated framework in Model-Based Systems Engineering (MBSE). This paper presents an MBSE-driven approach to define and handle the complexity that arises due to emergent behaviors. The approach provides guidelines for developing a system architecture that helps in predicting the complexity index of the system at different levels of abstraction. A framework that integrates indefinite and definite modeling aspects is developed to determine the complexity that arises during the development phase of the system. This framework provides a workflow for modeling complex systems using the Systems Modeling Language (SysML) that captures the system's requirements, behavior, structure, and analytical aspects at both the problem definition and solution levels. A system architecture for a district cooling plant is presented, which demonstrates the ability to predict the complexity index. The results suggest that complex energy systems like a district cooling plant can be defined in an elegant manner using the unconventional modified MBSE-driven approach, which helps in estimating development time and cost.
Keywords: district cooling plant, energy systems, framework, MBSE
Procedia PDF Downloads 131
4116 Reactive Oxygen Species-Mediated Photoaging Pathways of Ultrafine Plastic Particles under UV Irradiation
Authors: Jiajun Duan, Yang Li, Jianan Gao, Runzi Cao, Enxiang Shang, Wen Zhang
Abstract:
Reactive oxygen species (ROS) generation is considered an important photoaging mechanism of microplastics (MPs) and nanoplastics (NPs). To elucidate the ROS-induced MP/NP aging processes in water under UV365 irradiation, we examined the effects of surface coatings, polymer types, and grain sizes on ROS generation and photoaging intermediates. Bare polystyrene (PS) NPs generated hydroxyl radicals (•OH) and singlet oxygen (¹O₂), while coated PS NPs (carboxyl-modified PS (PS-COOH), amino-modified PS (PS-NH₂)) and PS MPs generated fewer ROS due to coating scavenging or size effects. Polypropylene, polyethylene, polyvinyl chloride, polyethylene terephthalate, and polycarbonate MPs only generated •OH. For aromatic polymers, •OH addition preferentially occurred at benzene rings to form monohydroxy polymers. Excess •OH resulted in H abstraction, C-C scission, and phenyl ring opening to generate aliphatic ketones, esters, aldehydes, and aromatic ketones. For coated PS NPs, •OH preferentially attacked the surface coatings, resulting in decarboxylation and deamination reactions. For aliphatic polymers, •OH attack resulted in the formation of carbonyl groups from peracids, aldehydes, or ketones via H abstraction and C-C scission. Moreover, ¹O₂ might participate in phenyl ring opening for PS NPs and coating degradation for coated PS NPs. This study facilitates understanding of the ROS-induced weathering process of NPs/MPs in water under UV irradiation.
Keywords: microplastics, nanoplastics, photoaging, reactive oxygen species, surface coating
Procedia PDF Downloads 160
4115 Detecting Cyberbullying, Spam and Bot Behavior and Fake News in Social Media Accounts Using Machine Learning
Authors: M. D. D. Chathurangi, M. G. K. Nayanathara, K. M. H. M. M. Gunapala, G. M. R. G. Dayananda, Kavinga Yapa Abeywardena, Deemantha Siriwardana
Abstract:
Due to the growing popularity of social media platforms, there are various concerns, mainly cyberbullying, spam, bot accounts, and the spread of incorrect information. To develop a risk score calculation system as a thorough method for deciphering and exposing unethical social media profiles, this research explores the most suitable algorithms, to the best of our knowledge, for detecting the mentioned concerns. Multiple models, such as Naïve Bayes, CNN, KNN, Stochastic Gradient Descent, Gradient Boosting Classifier, etc., were examined, and the best-performing ones were taken into the development of the risk score system. For cyberbullying, the Logistic Regression algorithm achieved an accuracy of 84.9%, while the spam-detecting MLP model reached 98.02% accuracy. For bot-account identification, the Random Forest algorithm obtained 91.06% accuracy, and 84% accuracy was achieved for fake news detection using SVM.
Keywords: cyberbullying, spam behavior, bot accounts, fake news, machine learning
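One way the four detector outputs could be combined into a single risk score is to weight each binary flag by its reported accuracy. The weighting scheme below is illustrative only; the paper's exact scoring formula is not given in the abstract, though the accuracy figures are taken from it.

```python
# Per-detector accuracies reported in the abstract, used as confidence weights.
ACCURACY = {"cyberbullying": 0.849, "spam": 0.9802, "bot": 0.9106, "fake_news": 0.84}

def risk_score(flags):
    """Combine binary detector outputs (1 = flagged) into a 0-100 risk score,
    weighting each detector by its reported accuracy. Hypothetical scheme."""
    total = sum(ACCURACY.values())
    return round(100 * sum(ACCURACY[k] * v for k, v in flags.items()) / total, 1)

# A profile flagged for cyberbullying and bot behavior, but not spam or fake news.
score = risk_score({"cyberbullying": 1, "spam": 0, "bot": 1, "fake_news": 0})
```

A profile with no flags scores 0 and one flagged by all detectors scores 100, with partial flags falling in between.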
Procedia PDF Downloads 40
4114 Modified 'Perturb and Observe' with 'Incremental Conductance' Algorithm for Maximum Power Point Tracking
Authors: H. Fuad Usman, M. Rafay Khan Sial, Shahzaib Hamid
Abstract:
The shift towards renewable energy resources has been amplified by global warming and other environmental complications in the 21st century. Recent research has strongly emphasized the generation of electrical power through renewable resources like solar, wind, hydro, geothermal, etc. The use of photovoltaic cells has become widespread for domestic and commercial purposes all over the world. A single cell gives a low voltage output, but connecting a number of cells in series forms a complete photovoltaic module; growing use has made these modules a worthwhile financial investment and has also reduced photovoltaic cell prices, giving customers confidence in using this source for their electrical needs. A photovoltaic cell delivers its maximum power at a single specific operating point for a given temperature and level of solar intensity received at a given surface, whereas this point shifts over a large range depending on manufacturing factors, temperature conditions, insolation intensity, instantaneous shading conditions, and the aging of the cells. Two improved algorithms are proposed in this article for MPPT; the widely used algorithms are 'Incremental Conductance' and 'Perturb and Observe'. To extract the maximum power from the source to the load, the duty cycle of the converter is effectively controlled. After assessing the previous techniques, this paper presents an improved and reformed idea for harvesting the maximum power point of photovoltaic cells. Previous ideas were thoroughly reviewed before constructing the improvement on the traditional MPPT technique; each technique has its own strengths and limitations under various weather conditions.
An improved technique combining both 'Perturb and Observe' and 'Incremental Conductance' is introduced.
Keywords: duty cycle, MPPT (Maximum Power Point Tracking), perturb and observe (P&O), photovoltaic module
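The classic Perturb and Observe loop that the paper improves upon can be sketched as follows: perturb the converter duty cycle, observe the resulting power, and keep moving in the direction that increases it. The PV curve, step size, and starting point below are toy assumptions, not values from the paper; incremental conductance would instead test the condition dI/dV = -I/V at each step.

```python
def perturb_and_observe(pv_power, d0=0.5, step=0.01, iters=100):
    """Classic P&O MPPT: perturb the duty cycle and follow the power gradient."""
    d = d0
    direction = 1
    p_prev = pv_power(d)
    for _ in range(iters):
        d = min(max(d + direction * step, 0.0), 1.0)
        p = pv_power(d)
        if p < p_prev:        # power dropped: reverse the perturbation direction
            direction = -direction
        p_prev = p
    return d

# Toy PV power curve with its maximum power point at d = 0.62 (hypothetical).
mpp = perturb_and_observe(lambda d: 100 - 400 * (d - 0.62) ** 2)
```

Note the characteristic P&O behavior: the duty cycle oscillates within one step of the true maximum power point, which is exactly the steady-state ripple that hybrid P&O/incremental-conductance schemes aim to reduce.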
Procedia PDF Downloads 178
4113 Amine Hardeners with Carbon Nanotubes Dispersing Ability for Epoxy Coating Systems
Authors: Szymon Kugler, Krzysztof Kowalczyk, Tadeusz Spychaj
Abstract:
An addition of carbon nanotubes (CNT) can simultaneously improve many features of epoxy coatings: electrical, mechanical, functional and thermal. Unfortunately, this nanofiller negatively affects visual properties of the coatings, such as transparency and gloss. The main reason for the low visual performance of CNT-modified epoxy coatings is the lack of compatibility between CNT and popular amine curing agents, although epoxy resins based on bisphenol A are indisputably good CNT dispersants. This is a serious obstacle to utilization of the coatings in advanced applications demanding both high transparency and electrical conductivity. The aim of the performed investigations was to find amine curing agents exhibiting affinity for CNT and ensuring good performance of epoxy coatings containing them. Commercially available CNT was dispersed in epoxy resin, as well as in different aliphatic, cycloaliphatic and aromatic amines, using one of two dispersion methods: ultrasonic or mechanical. The CNT dispersions were subsequently used in the preparation of epoxy coating compositions and coatings on a transparent substrate. It was found that an amine derivative of bio-based cardanol, as well as modified o-tolylbiguanide, exhibits significant CNT-dispersing properties, resulting in improved transparent/electroconductive performance of the epoxy coatings. In one of the prepared coating systems, just 0.025 wt.% (250 ppm) of CNT was enough to obtain coatings with semiconductive properties, 83% transparency, perfect chemical resistance to methyl ethyl ketone and improved thermal stability. Additionally, a theory of the influence of amine chemical structure on CNT-dispersing properties was proposed.
Keywords: bio-based cardanol, carbon nanotubes, epoxy coatings, tolylbiguanide
Procedia PDF Downloads 212
4112 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest
Authors: Bharatendra Rai
Abstract:
Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. The presence of too many features in machine learning is known not only to slow algorithms down but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios, and their impact on model accuracy is captured using the coefficient of determination (r-square) and root mean square error (RMSE).
Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error
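The two evaluation metrics the study reports can be computed directly from actual and predicted values, as in this minimal sketch (the four sample prices are made-up illustration data, not from the housing dataset):

```python
def r_square(actual, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

def rmse(actual, predicted):
    """Root mean square error of the predictions."""
    n = len(actual)
    return (sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n) ** 0.5

# Hypothetical sale prices (in $1000s) and model predictions.
prices = [200, 250, 300, 350]
preds = [210, 240, 310, 340]
r2, err = r_square(prices, preds), rmse(prices, preds)
```

In practice the study's workflow would compute these on held-out test partitions for each of the five partitioning ratios, with Boruta-confirmed features feeding the random forest.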
Procedia PDF Downloads 324
4111 Effect of Temperature and CuO Nanoparticle Concentration on Thermal Conductivity and Viscosity of a Phase Change Material
Authors: V. Bastian Aguila, C. Diego Vasco, P. Paula Galvez, R. Paula Zapata
Abstract:
The main results of an experimental study of the effect of temperature and nanoparticle concentration on the thermal conductivity and viscosity of a nanofluid are shown. The nanofluid was made using octadecane as the base fluid and CuO spherical nanoparticles of 75 nm (MkNano). Since the base fluid is a phase change material (PCM) to be used in thermal storage applications, the engineered nanofluid is referred to as a nanoPCM. Three nanoPCMs were prepared through the two-step method (2.5, 5.0 and 10.0% w/v). In order to increase the stability of the nanoPCM, the surface of the CuO nanoparticles was modified with sodium oleate, which was verified by IR analysis. The modified CuO nanoparticles were dispersed using an ultrasonic horn (Hielscher UP50H) for one hour (amplitude of 180 μm at 50 W). The thermal conductivity was measured using a thermal properties analyzer (KD2-Pro) in the temperature range of 30ºC to 40ºC. The viscosity was measured using a Brookfield DV2T-LV viscometer at 30 RPM in the temperature range of 30ºC to 55ºC. The obtained results showed that the thermal conductivity of the nanoPCM is almost constant in the analyzed temperature range, and the viscosity decreases non-linearly with temperature. Regarding the effect of nanoparticle concentration, both thermal conductivity and viscosity increased with concentration: the thermal conductivity rose by up to 9% with respect to the base fluid, and the viscosity by up to 60%, in both cases for the highest concentration. Finally, the viscosity measurements at different rotation speeds (30 RPM - 80 RPM) showed that the addition of nanoparticles modifies the rheological behavior of the base fluid, from Newtonian to viscoplastic (Bingham) or shear-thinning (power-law) non-Newtonian behavior.
Keywords: NanoPCM, thermal conductivity, viscosity, non-Newtonian fluid
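The power-law (shear-thinning) behavior mentioned above is typically identified by fitting apparent viscosity versus shear rate in log-log space. A minimal sketch with synthetic data (the K and n values are hypothetical, not the study's measurements):

```python
import math

def fit_power_law(shear_rates, viscosities):
    """Fit apparent viscosity mu = K * gamma**(n-1) (power-law fluid) by
    least squares in log-log space; n < 1 indicates shear thinning."""
    xs = [math.log(g) for g in shear_rates]
    ys = [math.log(m) for m in viscosities]
    N = len(xs)
    xbar, ybar = sum(xs) / N, sum(ys) / N
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    n = slope + 1                        # mu ~ gamma**(n-1), so slope = n - 1
    K = math.exp(ybar - slope * xbar)    # consistency index
    return K, n

# Synthetic shear-thinning data generated with K = 2.0, n = 0.8 (hypothetical).
rates = [1.0, 2.0, 4.0, 8.0]
visc = [2.0 * g ** (0.8 - 1.0) for g in rates]
K, n = fit_power_law(rates, visc)
```

Applied to the spindle-speed sweep (30-80 RPM), a fitted n significantly below 1 would confirm the shear-thinning classification, while a non-zero stress intercept would point to the Bingham model instead.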
Procedia PDF Downloads 420
4110 Systems Engineering Management Using Transdisciplinary Quality System Development Lifecycle Model
Authors: Mohamed Asaad Abdelrazek, Amir Taher El-Sheikh, M. Zayan, A.M. Elhady
Abstract:
The successful realization of complex systems depends not only on the technology issues and the process for implementing them, but on the management issues as well. Managing the systems development lifecycle requires technical management, and systems engineering management is that technical management. It is accomplished by incorporating many activities, of which the three major ones are development phasing, the systems engineering process and lifecycle integration, performed across the system development lifecycle. Due to the ever-increasing complexity of systems, as well as the difficulty of managing and tracking development activities, new ways to achieve systems engineering management activities are required. This paper presents a systematic approach used as a design management tool applied across systems engineering management roles. In this approach, the Transdisciplinary System Development Lifecycle (TSDL) Model has been modified and integrated with Quality Function Deployment (QFD); hereinafter, the systematic approach is named the Transdisciplinary Quality System Development Lifecycle (TQSDL) Model. QFD translates the voice of the customer (VOC) into measurable technical characteristics. The modified TSDL model is based on Axiomatic Design developed by Suh, which is applicable to all designs: products, processes, systems and organizations. The TQSDL model aims to provide a robust structure and systematic thinking to support the implementation of systems engineering management roles. This approach ensures that the customer requirements are fulfilled and that all the systems engineering manager's roles and activities are satisfied.
Keywords: axiomatic design, quality function deployment, systems engineering management, system development lifecycle
Procedia PDF Downloads 364
4109 Low-Cost Parking Lot Mapping and Localization for Home Zone Parking Pilot
Authors: Hongbo Zhang, Xinlu Tang, Jiangwei Li, Chi Yan
Abstract:
Home zone parking pilot (HPP) is a fast-growing segment of low-speed autonomous driving applications. It requires the car to automatically cruise around a parking lot and park itself within a range of up to 100 meters inside a recurrent home/office parking lot, which requires a precise parking lot mapping and localization solution. Although Lidar is ideal for SLAM, car OEMs favor a low-cost fish-eye camera based visual SLAM approach. Recent approaches have employed segmentation models to extract semantic features and improve mapping accuracy, but these AI models are memory-unfriendly and computationally expensive, making deployment on embedded ADAS systems difficult. To address this issue, we propose a new method that utilizes object detection models to extract robust and accurate parking lot features. The proposed method reduces computational costs while maintaining high accuracy. Once combined with the vehicle's wheel-pulse information, the system can construct maps and locate the vehicle in real time. This article discusses in detail (1) the fish-eye based Around View Monitoring (AVM) with transparent chassis images as the inputs, (2) an Object Detection (OD) based feature point extraction algorithm to generate a point cloud, (3) a low-computation parking lot mapping algorithm and (4) the real-time localization algorithm. Finally, we demonstrate experimental results with an embedded ADAS system installed on a real car in an underground parking lot.
Keywords: ADAS, home zone parking pilot, object detection, visual SLAM
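The wheel-pulse information mentioned above feeds the localization through odometry. A minimal dead-reckoning sketch is shown below; the pulse-to-distance factor and track width are illustrative assumptions, and a real system would fuse this with the OD-based visual features to correct drift.

```python
import math

def dead_reckon(pulses, pulse_dist=0.02, track=1.6, pose=(0.0, 0.0, 0.0)):
    """Integrate (left, right) wheel-pulse counts into a 2-D pose (x, y, heading).
    pulse_dist is meters per pulse and track is the axle width; both hypothetical."""
    x, y, th = pose
    for left, right in pulses:
        dl, dr = left * pulse_dist, right * pulse_dist
        d, dth = (dl + dr) / 2, (dr - dl) / track  # arc length and heading change
        x += d * math.cos(th + dth / 2)            # midpoint integration
        y += d * math.sin(th + dth / 2)
        th += dth
    return x, y, th

# Straight run: both wheels emit 50 pulses per interval for 4 intervals.
x, y, th = dead_reckon([(50, 50)] * 4)
```

Pure dead reckoning accumulates error with distance, which is why the map-matching step against OD features is needed for the 100-meter home-zone range.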
Procedia PDF Downloads 69
4108 Mobile Crowdsensing Scheme by Predicting Vehicle Mobility Using Deep Learning Algorithm
Authors: Monojit Manna, Arpan Adhikary
Abstract:
Mobile crowdsensing is an emerging paradigm in which users' devices are recruited to perform sensing tasks. In today's urban cities, mobile vehicles are well suited to data sensing and data collection because of their ubiquity and mobility. In this work, we focus on optimally selecting mobile nodes so as to collect the maximum amount of data from urban areas and fulfill the required data for a future period within a couple of minutes. We formulate vehicle recruitment as a budget-constrained data-maximization problem. The implementation models a realistic online platform in which vehicles move continuously in real time, and the data center has the authority to select a set of vehicles immediately. A deep learning-based scheme with the help of mobile vehicles (DLMV) is proposed to collect sensing data from the urban environment. For the future time period, an offline deep learning-based algorithm predicts vehicle mobility. Since the recruitment problem is NP-complete under a limited budget, we propose a greedy approach that applies an online algorithm to select a subset of vehicles. Extensive experimental evaluations are conducted on a real mobility dataset from Rome. The experimental results not only confirm the efficiency of our proposed solution but also prove the validity of DLMV, showing an improved quantity of collected sensing data compared with other algorithms.
Keywords: mobile crowdsensing, deep learning, vehicle recruitment, sensing coverage, data collection
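The greedy budget-constrained recruitment step can be sketched as repeatedly picking the vehicle with the best predicted-data-per-cost ratio that still fits the remaining budget. The data/cost numbers below are made-up illustrations; in the paper's scheme the per-vehicle data estimates would come from the deep learning mobility predictor.

```python
def greedy_recruit(vehicles, budget):
    """Greedy vehicle recruitment for a budget-constrained data-maximization
    problem: best predicted-data-per-cost ratio first."""
    chosen, data, remaining = [], 0.0, budget
    pool = sorted(vehicles, key=lambda v: v["data"] / v["cost"], reverse=True)
    for v in pool:
        if v["cost"] <= remaining:
            chosen.append(v["id"])
            data += v["data"]
            remaining -= v["cost"]
    return chosen, data

# Hypothetical fleet: predicted data yield and recruitment cost per vehicle.
fleet = [{"id": "a", "data": 90, "cost": 30},
         {"id": "b", "data": 40, "cost": 10},
         {"id": "c", "data": 50, "cost": 25}]
ids, total = greedy_recruit(fleet, budget=40)
```

Greedy ratio selection is a standard heuristic for such NP-complete knapsack-like problems; it is fast enough for online use but only approximately optimal.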
Procedia PDF Downloads 78
4107 Thermal and Mechanical Properties of Polycaprolactone-Soy Lecithin Modified Bentonite Nanocomposites
Authors: Danila Merino, Leandro N. Ludueña, Vera A. Alvarez
Abstract:
Clays are commonly used to reinforce polymeric materials. To modify them, long-chain quaternary alkylammonium salts have been widely employed; however, the application of these clays in biological fields is limited by the toxicity and poor biocompatibility of these modifiers. Soy lecithin, by contrast, acts as a natural, environment-friendly biosurfactant and biomodifier. In this report, we analyse the effect of the content of soy lecithin-modified bentonite on the properties of polycaprolactone (PCL) nanocomposites. Commercial-grade PCL (CAPA FB 100) was supplied by Perstorp, with Mw = 100000 g/mol. Minarmco S.A. and Melar S.A. supplied the bentonite and soy lecithin, respectively. Clays with 18, 30 and 45 wt% organic content were prepared by exchanging 4 g of Na-Bent with 1, 2 and 4 g of soy lecithin in acidified aqueous solution (pH = 1, with HCl) at 75 ºC for 2 h. They were then washed and lyophilized for 72 h. The samples were labeled A, B and C. Nanocomposites with 1 and 2 wt% of each clay were prepared by melt intercalation followed by compression moulding. An intensive Brabender-type mixer with two counter-rotating roller rotors was used; the mixing temperature was 100 ºC, the speed of rotation was 100 rpm and the mixing time was 10 min. Compression moulding was carried out in a hydraulic press under 75 kg/mm2 for 10 minutes at 100 ºC. The thickness of the samples was about 1 mm. Thermal and mechanical properties were analysed. The PCL nanocomposites with 1 and 2% of clay B presented the best mechanical properties. An excessive organic content increased the rigidity of PCL but had a detrimental effect on the tensile strength and elongation at break of the nanocomposites. Thermogravimetric analyses suggest that all reinforced samples have higher resistance to degradation than neat PCL.
Keywords: chemical modification, clay, nanocomposite, characterization
Procedia PDF Downloads 201
4106 Chemical, Physical and Microbiological Characteristics of a Texture-Modified Beef- Based 3D Printed Functional Product
Authors: Elvan G. Bulut, Betul Goksun, Tugba G. Gun, Ozge Sakiyan Demirkol, Kamuran Ayhan, Kezban Candogan
Abstract:
Dysphagia, difficulty in swallowing solid foods and thin liquids, is one of the common health threats among the elderly, who require foods with modified texture in their diet. Although there are some commercial food formulations and hydrocolloids to thicken liquid foods for dysphagic individuals, there is still a need to develop new food products with enriched nutritional, textural and sensory characteristics to safely nourish these patients. 3D food printing is an appealing alternative for creating personalized foods for this purpose, with an attractive shape and a soft, homogeneous texture. Hydrocolloids are generally used to modify texture and prevent phase separation. In our laboratory, an optimized 3D printed beef-based formulation specifically for people with swallowing difficulties was developed in a research project supported by the Scientific and Technological Research Council of Turkey (TÜBİTAK Project # 218O017). The formulation optimized by response surface methodology was 60% beef powder, 5.88% gelatin, and 0.74% kappa-carrageenan (all on a dry basis). This product was enriched with freeze-dried beet, celery and red capia pepper powders, butter, and whole milk. Proximate composition (moisture, fat, protein, and ash contents), pH value, CIE lightness (L*), redness (a*), yellowness (b*), and color difference (ΔE*) values were determined. Counts of total mesophilic aerobic bacteria (TMAB), lactic acid bacteria (LAB), mold and yeast, and total coliforms were conducted, and detection of coagulase-positive S. aureus, E. coli, and Salmonella spp. was performed. The 3D printed products had 60.11% moisture, 16.51% fat, 13.68% protein, and 1.65% ash; the pH value was 6.19, and the ΔE* value was 3.04. Counts of TMAB, LAB, mold and yeast, and total coliforms before and after 3D printing were 5.23-5.41 log cfu/g, < 1 log cfu/g, < 1 log cfu/g, and 2.39-2.15 log MPN/g, respectively. Coagulase-positive S. aureus, E. coli, and Salmonella spp. were not detected in the products. The data obtained from this study on important product characteristics of a functional beef-based formulation provide an encouraging basis for future research and should be useful in designing mass production of 3D printed products of similar composition.
Keywords: beef, dysphagia, product characteristics, texture-modified foods, 3D food printing
Procedia PDF Downloads 111
4105 Performance Evaluation of Various Segmentation Techniques on MRI of Brain Tissue
Authors: U.V. Suryawanshi, S.S. Chowhan, U.V Kulkarni
Abstract:
The accuracy of segmentation methods is of great importance in brain image analysis, and tissue classification in magnetic resonance imaging (MRI) of the brain is an important issue in the analysis of several brain dementias. This paper portrays the performance of segmentation techniques used on brain MRI, for which a large variety of algorithms has been developed. The objective of this paper is to perform a segmentation process on MR images of the human brain using Fuzzy c-means (FCM), Kernel-based Fuzzy c-means (KFCM), Spatial Fuzzy c-means (SFCM) and Improved Fuzzy c-means (IFCM) clustering. The review covers imaging modalities, MRI, and methods for noise reduction and segmentation. All methods are applied to MRI brain images degraded by salt-and-pepper noise, and the results demonstrate that the IFCM algorithm is more robust to noise than the standard FCM algorithm. We conclude with a discussion of the trend of future research in brain segmentation and of changes to IFCM for better results.
Keywords: image segmentation, preprocessing, MRI, FCM, KFCM, SFCM, IFCM
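For readers unfamiliar with the baseline method, the standard FCM iteration (alternating membership and center updates) can be sketched on a 1-D array of intensities; this is the textbook algorithm, not the IFCM variant evaluated in the paper, and all parameter values are illustrative.

```python
import numpy as np

def fcm(x, n_clusters=2, m=2.0, iters=100, seed=0):
    """Standard fuzzy c-means on a 1-D array of intensities x.

    m is the fuzzifier (m=2 is the common choice). Returns cluster
    centers and the membership matrix u of shape (n_clusters, len(x)).
    """
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, x.size))
    u /= u.sum(axis=0)                       # memberships sum to 1 per pixel
    for _ in range(iters):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)  # membership-weighted means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / d ** (2.0 / (m - 1.0))     # inverse-distance memberships
        u /= u.sum(axis=0)                   # renormalize
    return centers, u

# two well-separated intensity groups
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
centers, u = fcm(x, n_clusters=2)
print(np.sort(centers))  # centers near 0.1 and 5.1
```

KFCM, SFCM and IFCM each modify the distance term or add spatial/noise penalties to this same alternating scheme, which is what buys the robustness to salt-and-pepper noise reported above.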
Procedia PDF Downloads 334
4104 Location Management in Wireless Sensor Networks with Mobility
Authors: Amrita Anil Agashe, Sumant Tapas, Ajay Verma Yogesh Sonavane, Sourabh Yeravar
Abstract:
Due to advancements in MEMS technology, wireless sensor networks have gained a lot of importance today. Their wide range of applications includes environmental and habitat monitoring, object localization, target tracking, security surveillance, etc. Wireless sensor networks consist of tiny sensor devices called motes. The constrained computation power, battery power, storage capacity and communication bandwidth of these tiny motes pose challenging problems in the design and deployment of such systems. In this paper, we propose a ubiquitous framework for a real-time tracking, sensing and management system using IITH motes. We also explain the algorithm that we have developed for location management in wireless sensor networks under mobility. Our framework and algorithm can be used to detect emergency events and safety threats and to provide warning signals for handling the emergency.
Keywords: mobility management, motes, multihop, wireless sensor networks
Procedia PDF Downloads 421
4103 Study on Sharp V-Notch Problem under Dynamic Loading Condition Using Symplectic Analytical Singular Element
Authors: Xiaofei Hu, Zhiyu Cai, Weian Yao
Abstract:
The V-notch problem under dynamic loading is considered in this paper. In the time domain, the precise time domain expanding algorithm is employed, in which a self-adaptive technique is carried out to improve computing accuracy. By expanding the variables in each time interval, recursive finite element formulas are derived. In the space domain, a Symplectic Analytical Singular Element (SASE) for the V-notch problem is constructed to address the stress singularity at the notch tip. Combined with conventional finite elements, the proposed SASE can be used to solve the dynamic stress intensity factors (DSIFs) in a simple way. Numerical results show that the proposed SASE for the V-notch problem under dynamic loading is effective and efficient.
Keywords: V-notch, dynamic stress intensity factor, finite element method, precise time domain expanding algorithm
Procedia PDF Downloads 174
4102 Kou Jump Diffusion Model: An Application to the SP 500; Nasdaq 100 and Russell 2000 Index Options
Authors: Wajih Abbassi, Zouhaier Ben Khelifa
Abstract:
This research presents an empirical validation of three option valuation models: the ad hoc Black-Scholes model as proposed by Berkowitz (2001), the constant elasticity of variance model of Cox and Ross (1976) and the Kou jump-diffusion model (2002). Our empirical analysis was conducted on a sample of 26,974 options written on three indexes, the S&P 500, Nasdaq 100 and Russell 2000, that were negotiated during the year 2007, just before the sub-prime crisis. We start by presenting the theoretical foundations of the models of interest. We then use a trust-region-reflective algorithm to estimate the structural parameters of these models from a cross-section of option prices. The empirical analysis shows the superiority of the Kou jump-diffusion model, which arises from its ability to portray the behavior of market participants and to come closest to the true distribution that characterizes the evolution of these indexes. Indeed, the double-exponential jump distribution captures three interesting properties: the leptokurtic feature, the memoryless property and the psychological behavior of market participants, as numerous empirical studies have shown that markets tend to overreact to good news and underreact to bad news. Despite these advantages, there are not many empirical studies based on this model, partly because its probability distribution and option valuation formula are rather complicated. This paper is the first to use nonlinear curve fitting through the trust-region-reflective algorithm and cross-section option prices to estimate the structural parameters of the Kou jump-diffusion model.
Keywords: jump-diffusion process, Kou model, leptokurtic feature, trust-region-reflective algorithm, US index options
Procedia PDF Downloads 429
4101 Intelligent Algorithm-Based Tool-Path Planning and Optimization for Additive Manufacturing
Authors: Efrain Rodriguez, Sergio Pertuz, Cristhian Riano
Abstract:
Tool-path generation is an essential step in process planning for Fused Filament Fabrication (FFF)-based Additive Manufacturing (AM). In the manufacture of a mechanical part by additive processes, high resource consumption and prolonged production times are inherent drawbacks, mainly due to non-optimized tool-path generation. In this work, we propose a heuristic-search intelligent algorithm for optimized tool-path generation in FFF-based AM. The main benefit of this approach is a significant reduction of travel moves without material deposition, i.e., moves the AM machine performs without extruding. The optimization method reduces the number of travels without extrusion in comparison with commercial software such as Slic3r or Cura Engine, which means a reduction in production time.
Keywords: additive manufacturing, tool-path optimization, fused filament fabrication, process planning
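The paper's heuristic-search algorithm is not detailed in the abstract, but the travel-reduction idea can be illustrated with the simplest such heuristic: reorder the extrusion segments so the nozzle always moves next to the segment whose start point is nearest. The segment coordinates below are toy data.

```python
import math

def travel_length(order, segments):
    """Total non-extrusion (travel) distance for printing segments in order.
    Each segment is ((x0, y0), (x1, y1)); travel links an end to the next start."""
    total, pos = 0.0, (0.0, 0.0)
    for i in order:
        start, end = segments[i]
        total += math.dist(pos, start)  # dry move to the next segment
        pos = end                       # extrusion ends here
    return total

def nearest_neighbor_order(segments):
    """Greedy reordering: print next the segment whose start point is
    closest to the current nozzle position."""
    remaining, order, pos = set(range(len(segments))), [], (0.0, 0.0)
    while remaining:
        nxt = min(remaining, key=lambda i: math.dist(pos, segments[i][0]))
        order.append(nxt)
        pos = segments[nxt][1]
        remaining.remove(nxt)
    return order

segments = [((5, 5), (6, 5)), ((0, 0), (1, 0)), ((1, 1), (2, 1)), ((6, 6), (7, 6))]
naive = travel_length(range(len(segments)), segments)
opt = travel_length(nearest_neighbor_order(segments), segments)
print(naive, opt)  # the greedy order travels strictly less
```

More sophisticated heuristic searches (the abstract's "intelligent algorithm") improve on this greedy baseline, but the objective being minimized is the same travel-length quantity computed here.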
Procedia PDF Downloads 443
4100 Smooth Second Order Nonsingular Terminal Sliding Mode Control for a 6 DOF Quadrotor UAV
Authors: V. Tabrizi, A. Vali, R. GHasemi, V. Behnamgol
Abstract:
In this article, a nonlinear model of an underactuated six degrees of freedom (6 DOF) quadrotor UAV is derived on the basis of the Newton-Euler formalism. The derivation comprises determining the equations of motion of the quadrotor in three dimensions and approximating the actuation forces through the modeling of aerodynamic coefficients and electric motor dynamics. The robust nonlinear control strategy is a smooth second order nonsingular terminal sliding mode control, which is applied to stabilize this model. The control method is based on the super-twisting algorithm, which removes chattering and produces a smooth control signal, while the nonsingular terminal sliding mode idea is used to introduce a nonlinear sliding variable that guarantees finite time convergence in the sliding phase. Simulation results show that the proposed algorithm is robust against uncertainty and disturbance and guarantees a fast and precise control signal.
Keywords: quadrotor UAV, nonsingular terminal sliding mode, second order sliding mode, electronics, control, signal processing
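As a minimal sketch of the super-twisting idea only (not the paper's full 6 DOF controller, and with invented gains and disturbance), the control law combines a continuous square-root term with an integral of the sign of the sliding variable, so the discontinuity is hidden inside an integrator and the applied signal stays smooth.

```python
import math

def simulate_super_twisting(k1=3.0, k2=1.1, dt=1e-3, t_end=5.0):
    """Super-twisting control of a scalar sliding variable s.

    Toy plant: s_dot = u + d(t), with bounded disturbance d.
    Gains k1, k2 and the disturbance are illustrative assumptions.
    """
    s, v = 1.0, 0.0  # sliding variable and integral term
    for step in range(int(t_end / dt)):
        t = step * dt
        d = 0.3 * math.sin(2.0 * t)                      # unknown bounded disturbance
        sign_s = math.copysign(1.0, s)
        u = -k1 * math.sqrt(abs(s)) * sign_s + v         # continuous control signal
        v += -k2 * sign_s * dt                           # integral of the sign term
        s += (u + d) * dt                                # Euler step of the plant
    return s

s_final = simulate_super_twisting()
print(abs(s_final))  # the sliding variable is driven to (nearly) zero
```

Because only the integrated term switches sign, the control signal u itself is continuous, which is the chattering-removal property the abstract relies on.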
Procedia PDF Downloads 442
4099 Privacy-Preserving Model for Social Network Sites to Prevent Unwanted Information Diffusion
Authors: Sanaz Kavianpour, Zuraini Ismail, Bharanidharan Shanmugam
Abstract:
Social Network Sites (SNSs) serve as an invaluable platform for transferring information across large numbers of individuals. A substantial component of communicating and managing information is identifying which individuals will influence others in propagating information, and whether information will be disseminated in the absence of social signals about that information. Classifying the final audience of social data is difficult, as it is not completely possible to control the social contexts in which data is transferred among individuals. Hence, undesirable diffusion of information to an unauthorized individual on SNSs can threaten individuals' privacy. This paper highlights information diffusion in SNSs and emphasizes the most significant privacy issues for individuals on SNSs. The goal of this paper is to propose a privacy-preserving model that safeguards individuals' data by controlling the availability of data and improving privacy, providing access to the data for appropriate third parties without compromising the advantages of information sharing through SNSs.
Keywords: anonymization algorithm, classification algorithm, information diffusion, privacy, social network sites
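As a toy illustration of unwanted diffusion (not the paper's model; the graph and user names are invented), the final audience of a post can be computed by traversing the sharing graph, and restricting which users may re-share immediately shrinks that audience.

```python
from collections import deque

def reachable(graph, source, blocked):
    """Users a post from `source` can eventually reach when users in
    `blocked` may receive the post but not re-share it. Toy model only."""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        if node != source and node in blocked:
            continue  # blocked users do not propagate further
        for nbr in graph.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen - {source}

graph = {"alice": ["bob", "carol"], "bob": ["dave"],
         "carol": ["eve"], "eve": ["mallory"]}
full = reachable(graph, "alice", blocked=set())
restricted = reachable(graph, "alice", blocked={"carol"})
print(sorted(full))        # everyone downstream of alice
print(sorted(restricted))  # eve and mallory are cut off
```

A privacy-preserving model of the kind proposed above effectively chooses the `blocked` set automatically, classifying which contacts are appropriate audiences before the information spreads.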
Procedia PDF Downloads 321
4098 Hardware Implementation and Real-time Experimental Validation of a Direction of Arrival Estimation Algorithm
Authors: Nizar Tayem, AbuMuhammad Moinuddeen, Ahmed A. Hussain, Redha M. Radaydeh
Abstract:
This research paper introduces an approach for estimating the direction of arrival (DOA) of multiple noncoherent RF sources impinging on a uniform linear array (ULA). The proposed method utilizes a Capon-like estimation algorithm and incorporates LU decomposition to enhance the accuracy of DOA estimation while significantly reducing computational complexity compared to existing methods such as the Capon method. Notably, the proposed method does not require prior knowledge of the number of sources. Its effectiveness is validated through both software simulations and practical experimentation on a prototype testbed constructed using a software-defined radio (SDR) platform and GNU Radio software. The results obtained from MATLAB simulations and real-time experiments provide compelling evidence of the proposed method's efficacy.
Keywords: DOA estimation, real-time validation, software-defined radio, computational complexity, Capon's method, GNU Radio
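For reference, the classical Capon (MVDR) baseline that the proposed method improves upon scans a steering-vector grid against the inverse sample covariance; the sketch below is that textbook baseline with plain matrix inversion, not the authors' LU-based variant, and the array/source parameters are illustrative.

```python
import numpy as np

def capon_spectrum(snapshots, n_ant, d_lambda=0.5,
                   grid=np.linspace(-90, 90, 361)):
    """Capon (MVDR) pseudo-spectrum for a ULA. snapshots: (n_ant, n_snap)."""
    r = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
    r_inv = np.linalg.inv(r + 1e-6 * np.eye(n_ant))          # diagonal loading
    n = np.arange(n_ant)
    p = []
    for theta in grid:
        a = np.exp(2j * np.pi * d_lambda * n * np.sin(np.radians(theta)))
        p.append(1.0 / np.real(a.conj() @ r_inv @ a))        # 1 / a^H R^-1 a
    return grid, np.array(p)

# two uncorrelated sources at -20 and +30 degrees on an 8-element ULA
rng = np.random.default_rng(0)
n_ant, n_snap = 8, 500
idx = np.arange(n_ant)
a1 = np.exp(2j * np.pi * 0.5 * idx * np.sin(np.radians(-20)))
a2 = np.exp(2j * np.pi * 0.5 * idx * np.sin(np.radians(30)))
s = rng.normal(size=(2, n_snap)) + 1j * rng.normal(size=(2, n_snap))
x = np.outer(a1, s[0]) + np.outer(a2, s[1])
x += 0.1 * (rng.normal(size=x.shape) + 1j * rng.normal(size=x.shape))

grid, p = capon_spectrum(x, n_ant)
locs = [i for i in range(1, len(p) - 1) if p[i] > p[i - 1] and p[i] > p[i + 1]]
top2 = sorted(grid[i] for i in sorted(locs, key=lambda i: p[i])[-2:])
print(top2)  # peaks close to [-20.0, 30.0]
```

The cost of this baseline is dominated by forming and inverting R for every estimate, which is exactly the step the paper replaces with an LU-decomposition-based computation.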
Procedia PDF Downloads 75
4097 Serum Levels of Carnitine in Multiple Sclerosis Patients in Comparison with Healthy People and its Association with Fatigue Severity
Authors: Mohammad Hossein Harirchian, Siavash Babaie, Nika keshtkaran, Sama Bitarafan
Abstract:
Background: Fatigue is a common complaint of multiple sclerosis (MS) patients, adversely affecting their quality of life. There is considerable evidence that carnitine deficiency is linked to the development and severity of fatigue in some conditions. This study aimed to compare levels of free L-carnitine (FLC) between MS patients and healthy people and to evaluate its association with the severity of fatigue. Methods: This case-control study included 30 patients with relapsing-remitting MS (RRMS), divided into two sex-matched groups of equal size according to the presence or absence of fatigue, and 30 sex-matched healthy people in the control group. Serum levels of FLC were compared between the two patient groups and between the patients and the healthy group. Fatigue was scored using two validated questionnaires, the Fatigue Severity Scale (FSS) and the Modified Fatigue Impact Scale (MFIS), and the association between serum FLC level and fatigue severity was evaluated in the MS patients. Results: There was no significant difference in serum levels of FLC between MS patients and healthy people. The patients with fatigue had a significantly lower FLC (mg/dl) value than patients without fatigue (22.53 ± 15.84 vs. 75.36 ± 51.98, P < 0.001). The mean values of FSS and MFIS in patients with fatigue were 48.80 ± 8.55 and 62.87 ± 13.63, respectively, nearly two-fold higher than in the group without fatigue (P < 0.001). There was a negative correlation between the serum level of FLC and the fatigue severity scales (Spearman rank correlation = 0.76, P < 0.001). Conclusion: We showed that healthy people and MS patients did not differ in levels of FLC, but patients with lower serum levels of FLC might experience more severe fatigue. Therefore, supplementation with L-carnitine might be considered as a complementary treatment for MS-related fatigue.
Keywords: fatigue, multiple sclerosis, L-carnitine, modified fatigue impact scale
Procedia PDF Downloads 140
4096 Expert Supporting System for Diagnosing Lymphoid Neoplasms Using Probabilistic Decision Tree Algorithm and Immunohistochemistry Profile Database
Authors: Yosep Chong, Yejin Kim, Jingyun Choi, Hwanjo Yu, Eun Jung Lee, Chang Suk Kang
Abstract:
For the past decades, immunohistochemistry (IHC) has played an important role in the diagnosis of human neoplasms, helping pathologists to reach clearer decisions on differential diagnosis, subtyping, personalized treatment planning and, finally, prognosis prediction. However, the IHC performed on the various tumors of daily practice often yields conflicting results that are very challenging to interpret, and even a comprehensive diagnosis synthesizing clinical, histologic and immunohistochemical findings can be helpless in some twisted cases. Another important issue is that IHC data is increasing exponentially, so more and more information has to be taken into account. For these reasons, we set out to develop an expert supporting system to help pathologists make better decisions when diagnosing human neoplasms with IHC results. We devised a probabilistic decision tree algorithm and tested it with real case data of lymphoid neoplasms, for which the IHC profile is more important for making a proper diagnosis than in other human neoplasms. We designed the probabilistic decision tree based on Bayes' theorem, programmed the computational process using MATLAB (The MathWorks, Inc., USA) and prepared an IHC profile database (about 104 disease categories and 88 IHC antibodies) based on the WHO classification by reviewing the literature. The initial probability of each neoplasm was set with epidemiologic data on lymphoid neoplasms in Korea. With the IHC results of 131 sequentially selected patients, the top three presumptive diagnoses for each case were made and compared with the original diagnoses. After review of the data, 124 out of 131 cases were used for the final analysis. As a result, the presumptive diagnoses were concordant with the original diagnoses in 118 cases (93.7%). The major reason for the discordant cases was the similarity of the IHC profiles of two or three different neoplasms.
The expert supporting system algorithm presented in this study is in its elementary stage and needs more optimization using more advanced technology, such as deep learning with real case data, especially for differentiating T-cell lymphomas. Although it needs more refinement, it may be used to aid pathological decision making in the future. A further application to determine IHC antibodies for a certain subset of differential diagnoses might be possible in the near future.
Keywords: database, expert supporting system, immunohistochemistry, probabilistic decision tree
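As a toy illustration of the Bayes-theorem update underlying such a system (the diagnoses, markers, priors and likelihoods below are invented for illustration, not taken from the paper's database), each observed stain multiplies the prior probability of every candidate diagnosis by the likelihood of that result, and normalization yields the ranked presumptive diagnoses.

```python
def posterior(priors, likelihoods, observed):
    """Bayes update of diagnosis probabilities from an IHC panel.

    priors: dict diagnosis -> prior probability
    likelihoods: dict diagnosis -> dict marker -> P(positive | diagnosis)
    observed: dict marker -> True (positive stain) / False (negative)
    All numbers in the example are invented, not from the paper's database.
    """
    post = {}
    for dx, prior in priors.items():
        p = prior
        for marker, result in observed.items():
            p_pos = likelihoods[dx].get(marker, 0.5)  # unknown marker: uninformative
            p *= p_pos if result else (1.0 - p_pos)
        post[dx] = p
    total = sum(post.values())
    return {dx: p / total for dx, p in post.items()}  # normalize to sum to 1

priors = {"DLBCL": 0.4, "follicular": 0.3, "T_cell": 0.3}
likelihoods = {
    "DLBCL":      {"CD20": 0.95, "CD3": 0.05, "BCL2": 0.6},
    "follicular": {"CD20": 0.95, "CD3": 0.05, "BCL2": 0.9},
    "T_cell":     {"CD20": 0.05, "CD3": 0.95, "BCL2": 0.5},
}
post = posterior(priors, likelihoods, {"CD20": True, "CD3": False, "BCL2": True})
print(max(post, key=post.get))  # "follicular" ranks first for this toy panel
```

Sorting the resulting dictionary by probability and taking the top three entries gives exactly the kind of ranked presumptive-diagnosis list evaluated in the study above.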
Procedia PDF Downloads 225