Search results for: algorithm techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9585

6855 Development of Self Emulsifying Drug Delivery Systems (SEDDS) of Anticancer Agents Used in AYUSH System of Medicine for Improved Oral Bioavailability Followed by Their Pharmacological Evaluation Using Biotechnological Techniques

Authors: Meenu Mehta, Munish Garg

Abstract:

The use of oral anticancer drugs from the AYUSH system of medicine has increased widely in society due to their low cost, enhanced efficacy, greater patient preference, freedom from the inconveniences of infusion, and the opportunity they provide to develop chronic treatment regimens. However, oral delivery of these drugs is usually limited by poor bioavailability, which is also subject to wide variation. As most cytotoxic agents have a narrow therapeutic window and are dosed at or near the maximum tolerated dose, wide variability in bioavailability can negatively affect treatment outcome. It is estimated that 40% of active substances are poorly soluble in water; improving the bioavailability of drugs with such properties presents one of the greatest challenges in drug formulation. Several techniques are reported in the literature. Among these, the Self Emulsifying Drug Delivery System (SEDDS) has gained the most attention because its enhanced oral bioavailability enables a reduction in dose. SEDDS formulations of anticancer drugs will therefore have increased bioavailability and efficacy, and these dosage forms will provide societal benefit in a cost-effective manner compared with other oral dosage forms. The present study reflects on formulation strategies for SEDDS of oral anticancer agents of the AYUSH system for enhanced bioavailability, with efficacy demonstrated on cancer cell lines.

Keywords: anticancer agents, AYUSH system, bioavailability, SEDDS

Procedia PDF Downloads 286
6854 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images

Authors: Amit Kumar Happy

Abstract:

This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visual image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal infrared camera acquires the thermal source image. In this paper, image fusion algorithms based on the multi-scale transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of MST levels and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the validity of the suggested method. Experiments show that the proposed approach is capable of producing good fusion results. While deploying our image fusion approaches, we observed several challenges with popular image fusion methods: although their high computational cost and complex processing steps provide accurate fused results, they are hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational capability. The methods presented in this paper therefore offer good results with minimum time complexity.
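The multi-scale decomposition and coefficient selection described above can be sketched in a few lines. This is a deliberately minimal stand-in, not the paper's MATLAB implementation: it uses 2x2 average pooling in place of a proper multi-scale transform, and a pixel-wise max-absolute rule instead of the region-based selection with consistency verification.

```python
import numpy as np

def downsample(img):
    # 2x2 average pooling (a crude stand-in for a Gaussian pyramid step)
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def upsample(img):
    # nearest-neighbour expansion back to double size
    return img.repeat(2, axis=0).repeat(2, axis=1)

def fuse_mst(a, b, levels=2):
    """Fuse two registered images with a toy multi-scale transform:
    detail layers by max-absolute selection, base layer by averaging."""
    details = []
    for _ in range(levels):
        a_low, b_low = downsample(a), downsample(b)
        da = a - upsample(a_low)          # detail (Laplacian-like) layer of a
        db = b - upsample(b_low)
        # pixel-wise max-absolute coefficient selection rule
        details.append(np.where(np.abs(da) >= np.abs(db), da, db))
        a, b = a_low, b_low
    fused = 0.5 * (a + b)                 # average the coarsest base layer
    for d in reversed(details):
        fused = upsample(fused) + d       # rebuild from coarse to fine
    return fused

vis = np.random.rand(8, 8)   # stand-ins for registered VI and IR frames
ir = np.random.rand(8, 8)
out = fuse_mst(vis, ir)
print(out.shape)  # (8, 8)
```

Fusing an image with itself reconstructs it exactly, which is a quick sanity check that the decomposition and reconstruction are consistent.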

Keywords: image fusion, IR thermal imager, multi-sensor, multi-scale transform

Procedia PDF Downloads 97
6853 Procedure to Optimize the Performance of a Chemical Laser Using Genetic Algorithm Optimization

Authors: Mohammedi Ferhate

Abstract:

This work presents details of a study of the entire flow inside the facility, in which the exothermic chemical reaction process in the chemical laser cavity is analyzed. We describe the principles of chemical lasers, in which flow reversal is produced by chemical reactions, and explain the device for converting chemical potential energy into laser energy. We observe that the phenomenon has an explosive trend. Finally, the feasibility and effectiveness of the proposed method are demonstrated by computer simulation.
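The abstract does not give the objective function, so the sketch below only illustrates the genetic-algorithm machinery itself on a hypothetical two-parameter "laser performance" surrogate; the encoding, operators, and the stand-in objective are all assumptions, not the authors' formulation.

```python
import random

random.seed(0)  # reproducible run

def genetic_optimize(fitness, bounds, pop_size=30, gens=60, mut_rate=0.1, elite=2):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, elitism. `fitness` is maximized."""
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        nxt = pop[:elite]                                   # carry the best forward
        while len(nxt) < pop_size:
            p1 = max(random.sample(pop, 3), key=fitness)    # tournament selection
            p2 = max(random.sample(pop, 3), key=fitness)
            child = [(a + b) / 2 for a, b in zip(p1, p2)]   # blend crossover
            for i, (lo, hi) in enumerate(bounds):
                if random.random() < mut_rate:              # Gaussian mutation, clipped
                    child[i] = min(hi, max(lo, child[i] + random.gauss(0, 0.1 * (hi - lo))))
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# hypothetical surrogate: performance peaks at nozzle ratio 0.3, pressure 0.7
best = genetic_optimize(lambda x: -(x[0] - 0.3) ** 2 - (x[1] - 0.7) ** 2,
                        bounds=[(0, 1), (0, 1)])
print(best)
```

In a real application the lambda would be replaced by the flow/reaction simulation evaluated at each candidate design.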

Keywords: genetic, lasers, nozzle, programming

Procedia PDF Downloads 79
6852 Ab-initio Calculations on the Mechanism of Action of Platinum and Ruthenium Complexes in Phototherapy

Authors: Eslam Dabbish, Fortuna Ponte, Stefano Scoditti, Emilia Sicilia, Gloria Mazzone

Abstract:

Medical techniques that use light to activate a drug occupy a prominent place in cancer treatment, owing to a selectivity that helps reduce the undesirable side effects of conventional chemotherapy. Among these therapeutic treatments, photodynamic therapy (PDT) and photoactivated chemotherapy (PACT) are emerging as complementary approaches for the selective destruction of neoplastic tissue through direct cellular damage. Both techniques rely on the use of a molecule, the photosensitizer (PS), able to absorb within the so-called therapeutic window. Exposure to light of otherwise inert molecules thus populates excited states of the drug, which in PDT produce cytotoxic species, such as 1O2 and other ROS, and in PACT are responsible for the release or formation of the active species. Following the success of cisplatin in conventional treatments, many other transition metal complexes have been explored as anticancer agents for different medical approaches, including PDT and PACT, in order to improve their chemical, biological and photophysical properties. In this field, several crucial characteristics of candidate PSs can be accurately predicted from first-principles calculations, especially within density functional theory and its time-dependent formulation, contributing to an understanding of the entire photochemical pathways involved, which can ultimately help improve the efficiency of a drug. A brief overview of the outcomes for some platinum- and ruthenium-based PSs proposed for application in the two phototherapies will be provided.

Keywords: TDDFT, metal complexes, PACT, PDT

Procedia PDF Downloads 84
6851 Impact of Combined Heat and Power (CHP) Generation Technology on Distribution Network Development

Authors: Sreto Boljevic

Abstract:

In the absence of considerable investment in electricity generation, transmission and distribution network (DN) capacity, the demand for electrical energy will quickly strain the existing electrical power network. The anticipated growth and proliferation of electric vehicles (EVs) and heat pumps (HPs) make it likely that the additional load from EV charging and HP operation will require capital investment in the DN. While area-wide implementation of EVs and HPs will contribute to the decarbonization of the energy system, they represent new challenges for the existing low-voltage (LV) network. Distributed energy resources (DER), operating both as part of the DN and in off-network mode, have been offered as a means to meet growing electricity demand while maintaining and improving DN reliability, resiliency and power quality. DN planning has traditionally been done by forecasting future growth in demand and estimating the peak load that the network should meet. However, new problems are arising, associated with the load that a high degree of proliferation of EVs and HPs imposes on the DN, together with the promotion of electricity generation from renewable energy sources (RES). High distributed generation (DG) penetration and a large increase in load at low-voltage DNs may have numerous impacts that create issues including energy losses, voltage control, fault levels, reliability, resiliency and power quality. To mitigate negative impacts and at the same time enhance positive ones in the new operational state of the DN, CHP system integration can be seen as the best action to postpone or reduce the capital investment needed to facilitate, and to maximize the benefits of, EV, HP and RES integration in the low-voltage DN. The aim of this paper is to generate an algorithm using an analytical approach. Implementation of the algorithm will provide a way to place CHP systems optimally in the DN so as to maximize the integration of RES and the growing proliferation of EVs and HPs.
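The paper's analytical placement algorithm is not spelled out in the abstract, so the following is only a toy illustration of the underlying idea: exhaustively test each candidate bus on a radial feeder and keep the one that minimizes a quadratic (I^2 R-style) loss proxy. The feeder model, the bus demands, and the loss formula are all simplifying assumptions.

```python
def feeder_losses(loads, chp_bus, chp_output, r=1.0):
    """Quadratic loss proxy on a radial feeder. Bus 0 is nearest the
    substation; the segment feeding bus i carries the net demand of
    buses i..end, and each segment contributes r * flow**2."""
    net = list(loads)
    net[chp_bus] -= chp_output          # local generation offsets local demand
    return sum(r * sum(net[i:]) ** 2 for i in range(len(net)))

def best_chp_location(loads, chp_output):
    """Exhaustively test every candidate bus and keep the cheapest."""
    return min(range(len(loads)), key=lambda b: feeder_losses(loads, b, chp_output))

loads = [0.2, 0.3, 0.5, 0.8]            # hypothetical bus demands (MW)
print(best_chp_location(loads, chp_output=0.8))  # → 3 (the heavily loaded feeder end)
```

As expected, the search places the CHP unit at the heavily loaded end of the feeder, where it cancels the longest power flows; a real study would replace the loss proxy with a proper power-flow calculation.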

Keywords: combined heat & power (CHP), distribution networks, EVs, HPs, RES

Procedia PDF Downloads 186
6850 Controlling Cocoa Pod Borer, Conopomorpha cramerella (Snell.) and Cost Analysis Production at Cacao Plantation

Authors: Alam Anshary, Flora Pasaru, Shahabuddin

Abstract:

The cocoa pod borer (CPB), Conopomorpha cramerella (Snell.), is present on most of the larger cocoa-producing islands in Indonesia. Various CPB control measures have been carried out by farmers, but the results have not been effective. This study aims to determine the effect of Beauveria bassiana treatments and of a pruning technique on the control of CPB in smallholder cocoa plantations. The research used a completely randomized design with 4 treatments and 3 replications; the treatments consisted of B. bassiana, pruning, B. bassiana + pruning (Bb + Pr), and the control. The results showed that the percentage of CPB attack on cocoa pods in the (Bb + Pr) treatment, at 3.50%, was the lowest of all treatments; the CPB attack percentage was 6.15% with B. bassiana, 8.75% with pruning, and 15.20% in the control. Analysis of production estimates showed that the (Bb + Pr) treatment had the highest production (1.95 tonnes/ha). The estimated production model is Y = 0.20999 + 0.53968X1 + 0.34298X2 + 0.31410X3 + 0.35629X4 + 0.08345X5 + 0.29732X6. Farm production costs consist of fixed costs and variable costs: fixed costs are costs incurred by the farmer whose size does not affect output, such as taxes and depreciation of production equipment, while variable costs are costs used up within one year of cocoa farming activities. The cost of production in cocoa farming without integrated CPB control techniques is Rp. 9.205.550/ha, while the cost of production with integrated control techniques is Rp. 6.666.050/ha.
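Reading the estimated model as Y = 0.20999 + 0.53968*X1 + ... + 0.29732*X6, a prediction for one observation is a simple weighted sum. The meaning and units of X1 through X6 are not given in the abstract, so the inputs below are placeholders.

```python
# coefficients of the abstract's estimated production model
coef = [0.20999, 0.53968, 0.34298, 0.31410, 0.35629, 0.08345, 0.29732]

def estimated_production(x):
    """Evaluate Y = b0 + b1*X1 + ... + b6*X6 for one observation.
    X1..X6 are the study's production factors (not named in the abstract)."""
    assert len(x) == len(coef) - 1
    return coef[0] + sum(b * xi for b, xi in zip(coef[1:], x))

# with every factor set to 1, Y is simply the sum of all coefficients
print(round(estimated_production([1, 1, 1, 1, 1, 1]), 5))  # 2.14381
```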

Keywords: cacao, cocoa pod borer, pruning, Beauveria bassiana, production costs

Procedia PDF Downloads 265
6849 Cooperative Agents to Prevent and Mitigate Distributed Denial of Service Attacks of Internet of Things Devices in Transportation Systems

Authors: Borhan Marzougui

Abstract:

The Road and Transport Authority (RTA) is moving ahead with the implementation of the leader’s vision, exploring all avenues that may bring better security and safety services to the community. Smart transport means using smart technologies such as the Internet of Things (IoT). This technology continues to affirm its important role in the context of information and transportation systems. In fact, IoT is a network of Internet-connected objects able to collect and exchange data using embedded sensors. With the growth of IoT, Distributed Denial of Service (DDoS) attacks are also growing exponentially, and they are a major and real threat to various transportation services. Current defense mechanisms are mainly passive in nature, and there is a need to develop a smart technique to handle them; in fact, new IoT devices are being recruited into botnets that DDoS attackers accumulate for their purposes. The aim of this paper is to provide a relevant understanding of the dangerous types of DDoS attack related to IoT and to provide valuable guidance for future IoT security methods. Our methodology is based on the development of a distributed algorithm that coordinates dedicated intelligent and cooperative agents to prevent and mitigate DDoS attacks. The proposed technique ensures preventive action when malicious packets start to be distributed through the connected nodes (the network of IoT devices). In addition, devices such as cameras and radio frequency identification (RFID) readers are connected within the secured network, and the data they generate are analyzed in real time by the intelligent and cooperative agents. The proposed security system is based on a multi-agent system. The results obtained show a significant reduction in the number of infected devices and enhanced capabilities of the different security devices.
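As a rough illustration of the preventive idea (not the authors' algorithm, which is not detailed in the abstract), an agent can flag a source whose packet rate in a sliding window exceeds a threshold and share its verdict with peer agents. The window length and threshold below are arbitrary placeholders.

```python
from collections import defaultdict, deque

class NodeAgent:
    """Toy cooperative agent: blocks a source when its packet count in a
    sliding time window exceeds a threshold, and shares verdicts with peers."""
    def __init__(self, window=10.0, threshold=50):
        self.window, self.threshold = window, threshold
        self.history = defaultdict(deque)   # per-source packet timestamps
        self.blocklist = set()

    def observe(self, src, timestamp):
        q = self.history[src]
        q.append(timestamp)
        while q and q[0] < timestamp - self.window:   # drop stale timestamps
            q.popleft()
        if len(q) > self.threshold:
            self.blocklist.add(src)
        return src not in self.blocklist    # False means: drop this packet

    def sync(self, peer):
        """Cooperation step: both agents adopt the merged blocklist."""
        merged = self.blocklist | peer.blocklist
        self.blocklist = merged
        peer.blocklist = set(merged)

a, b = NodeAgent(), NodeAgent()
for t in range(100):
    a.observe("bot-1", t * 0.05)            # ~20 pkt/s flood hits agent a
a.sync(b)
print("bot-1" in b.blocklist)  # True: agent b blocks the bot it never saw
```

The cooperation step is what lets an agent drop traffic from a source it has never observed directly, which is the multi-agent benefit the abstract points to.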

Keywords: IoT, DDoS, attacks, botnet, security, agents

Procedia PDF Downloads 127
6848 An Investigation into Enhancing E-Voting Application Performance

Authors: Aditya Verma

Abstract:

E-voting using blockchain provides a distributed system in which the data is present on every node in the network, and which is reliable and secure owing to the immutability of the ledger. This work compares various blockchain consensus algorithms used for e-voting applications in the past, on the basis of performance and node scalability, and selects the optimal one. It then improves on a previous implementation by proposing solutions for the loopholes of that optimally performing consensus algorithm in our chosen application, e-voting.

Keywords: blockchain, parallel bft, consensus algorithms, performance

Procedia PDF Downloads 155
6847 Cyberfraud Schemes: Modus Operandi, Tools and Techniques and the Role of European Legislation as a Defense Strategy

Authors: Papathanasiou Anastasios, Liontos George, Liagkou Vasiliki, Glavas Euripides

Abstract:

The purpose of this paper is to describe the growing problem of various cyber fraud schemes that exist on the internet and are currently among the most prevalent. The main focus of this paper is to provide a detailed description of the modus operandi, tools, and techniques utilized in four basic typologies of cyber frauds: Business Email Compromise (BEC) attacks, investment fraud, romance scams, and online sales fraud. The paper aims to shed light on the methods employed by cybercriminals in perpetrating these types of fraud, as well as the strategies they use to deceive and victimize individuals and businesses on the internet. Furthermore, this study outlines defense strategies intended to tackle the issue head-on, with a particular emphasis on the crucial role played by European Legislation. European legislation has proactively adapted to the evolving landscape of cyber fraud, striving to enhance cybersecurity awareness, bolster user education, and implement advanced technical controls to mitigate associated risks. The paper evaluates the advantages and innovations brought about by the European Legislation while also acknowledging potential flaws that cybercriminals might exploit. As a result, recommendations for refining the legislation are offered in this study in order to better address this pressing issue.

Keywords: business email compromise, cybercrime, European legislation, investment fraud, NIS, online sales fraud, romance scams

Procedia PDF Downloads 80
6846 Dynamic Conformal Arc versus Intensity Modulated Radiotherapy for Image Guided Stereotactic Radiotherapy of Cranial Lesion

Authors: Chor Yi Ng, Christine Kong, Loretta Teo, Stephen Yau, FC Cheung, TL Poon, Francis Lee

Abstract:

Purpose: Dynamic conformal arc (DCA) and intensity modulated radiotherapy (IMRT) are two treatment techniques commonly used for stereotactic radiosurgery/radiotherapy of cranial lesions. IMRT plans usually give better dose conformity, while DCA plans have better dose fall-off. Rapid dose fall-off is preferred for radiotherapy of cranial lesions, but dose conformity is also important. For certain lesions DCA plans have good conformity, while for others the conformity of DCA plans is simply unacceptable and IMRT has to be used. The choice between the two may not be apparent until each plan is prepared and the dose indices compared. We describe a deviation index (DI), a measure of the deviation of the target shape from a sphere, and test its usefulness for choosing between the two techniques. Method and Materials: From May 2015 to May 2017, our institute performed stereotactic radiotherapy for 105 patients, treating a total of 115 lesions (64 DCA plans and 51 IMRT plans). Patients were treated on the Varian Clinac iX with HDMLC. The Brainlab ExacTrac system was used for patient setup, and treatment planning was done with Brainlab iPlan RT Dose (version 4.5.4). DCA plans were found to give better dose fall-off in terms of R50% (R50%(DCA) = 4.75 vs R50%(IMRT) = 5.242), while IMRT plans had better conformity in terms of treatment volume ratio (TVR) (TVR(DCA) = 1.273 vs TVR(IMRT) = 1.222). The deviation index (DI) is proposed to better facilitate the choice between the two techniques. DI is the ratio of the volume of a 1 mm shell of the PTV to the volume of a 1 mm shell of a sphere of identical volume; DI will be close to 1 for a near-spherical PTV, while a large DI implies a more irregular PTV. To study the functionality of DI, 23 cases were chosen with PTV volumes ranging from 1.149 cc to 29.83 cc and DI ranging from 1.059 to 3.202. For each case, we prepared a nine-field IMRT plan with one-pass optimization and a five-arc DCA plan, and the TVR and R50% of each case were then compared and correlated with the DI. Results: For the 23 cases, the TVRs and R50% of the DCA and IMRT plans were examined. Conformity was better for the IMRT plans, with the majority of TVR(DCA)/TVR(IMRT) ratios > 1 (values ranging from 0.877 to 1.538), while dose fall-off was better for the DCA plans, with the majority of R50%(DCA)/R50%(IMRT) ratios < 1. Their correlations with DI were also studied: a strong positive correlation was found between the ratio of TVRs and DI (correlation coefficient = 0.839), while the correlation between the ratio of R50% values and DI was insignificant (correlation coefficient = -0.190). Conclusion: The results suggest that DI can be used as a guide for choosing the planning technique. For DI greater than a certain value, the conformity of DCA plans can be expected to become unacceptably poor, and IMRT will be the technique of choice.
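The deviation index defined above can be computed directly on a voxelized PTV mask: dilate the mask by one voxel (about 1 mm on a 1 mm grid), take the shell volume, and divide by the analytic 1 mm shell volume of the equal-volume sphere. The 6-connected dilation is a simplifying assumption; the paper does not state how the shell was constructed.

```python
import numpy as np

def shell_volume(mask, voxel=1.0):
    """Volume of a one-voxel shell around a binary mask, via 6-connected
    dilation built from array shifts (the mask must not touch the grid
    border, so np.roll wrap-around is harmless)."""
    dil = mask.copy()
    for axis in range(3):
        for shift in (1, -1):
            dil |= np.roll(mask, shift, axis=axis)
    return float((dil & ~mask).sum()) * voxel ** 3

def deviation_index(mask, voxel=1.0):
    v = mask.sum() * voxel ** 3
    r = (3 * v / (4 * np.pi)) ** (1 / 3)        # radius of the equal-volume sphere
    sphere_shell = 4 / 3 * np.pi * ((r + voxel) ** 3 - r ** 3)
    return shell_volume(mask, voxel) / sphere_shell

# hypothetical PTV: a 10 mm cube on a 1 mm voxel grid (a sphere would give DI ≈ 1)
ptv = np.zeros((20, 20, 20), dtype=bool)
ptv[5:15, 5:15, 5:15] = True
print(round(deviation_index(ptv), 3))
```

A cube scores just above 1, while an elongated target scores noticeably higher, matching the intended behaviour: DI grows as the PTV departs from a sphere.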

Keywords: cranial lesions, dynamic conformal arc, IMRT, image guided radiotherapy, stereotactic radiotherapy

Procedia PDF Downloads 229
6845 Late Bronze Age Pigments: Characterization of Mycenaean Pottery with Multi-Analytical Approach

Authors: Elif Doğru, Bülent Kızılduman, Huriye İcil

Abstract:

Throughout history, Cyprus has been involved in various commercial and cultural relationships with different civilizations, owing to its strategic location. Particularly during the Late Bronze Age, Cyprus emerged as a significant region engaged in interactions with the Mycenaeans and other Mediterranean civilizations. Presently, findings from archaeological excavations provide valuable insights into Cyprus' cultural history and its connections with other civilizations. Painted Mycenaean ceramics discovered during the excavations at Kaleburnu-Kral Tepesi (Galinaporni-Vasili), dated to the Late Bronze Age in Cyprus, are considered significant archaeological findings that carry traces of the art and culture of that era, reflecting the island's commercial and cultural connections. Considering these findings, there is a need for archaeometric studies to aid in the understanding of the commercial and cultural ties at Kaleburnu-Kral Tepesi. In line with this need, analytical studies have been initiated concerning the provenance and production techniques of the Mycenaean ceramics discovered in the excavations at Kaleburnu-Kral Tepesi, dated to the Late Bronze Age. In the context of origin analysis studies, it is advocated that understanding the techniques and materials used for the figures and designs applied on Mycenaean ceramics would significantly contribute to a better comprehension of historical contexts. Hence, the adopted approach involves not only the analysis of the ceramic raw material but also the characterization of the pigments on the ceramics as a whole. In light of this, in addition to the studies aimed at determining the provenance and production techniques of the Mycenaean ceramic bodies, the characterization of the pigments used in the decorations of the relevant ceramics has been included in the research scope. 
Accordingly, this study aims to characterize the pigments used in the decorations of Mycenaean ceramics discovered at Kaleburnu-Kral Tepesi, dated to the Late Bronze Age. The X-Ray diffraction (XRD), Fourier Transform Infrared Spectroscopy (FTIR), and Scanning Electron Microscopy with Energy Dispersive X-ray Spectroscopy (SEM-EDX) methods have been employed to determine the surface morphology and chemical properties of the Mycenaean pigments. The characterization has been conducted through the combination of multiple analytical methods. The characterization of the pigments of Mycenaean ceramics aims to enhance the scientific perspective adopted for understanding the contributions of Mycenaean ceramics found in Cyprus to the island's culture, by providing scientific data on the types and origins of pigments used during the Late Bronze Age.

Keywords: mycenaean, ceramic, provenance, pigment

Procedia PDF Downloads 57
6844 A Small Graphic Lie: The Photographic Quality of Pierre Bourdieu’s Correspondence Analysis

Authors: Lene Granzau Juel-Jacobsen

Abstract:

The problem of beautification is an obvious concern of photography, with its claim to reference reality, but it also lies at the very heart of social theory. As we become accustomed to sophisticated visualizations of statistical data, in pace with the development of software programs, we should not only be inclined to ask new types of research questions; we also need to confront social theories based on such visualization techniques with new types of questions. Correspondence analysis, GIS analysis, social network analysis, and perceptual maps are current examples of visualization techniques popular within the social sciences and neighboring disciplines. This article discusses correspondence analysis, arguing that the graphic plot of a correspondence analysis is to be interpreted much as a photograph is: it refers no more evidently or univocally to reality than a photograph, and represents social life no more truthfully than a photograph documents it. Pierre Bourdieu’s theoretical corpus, especially his theory of fields, relies heavily on correspondence analysis. While much attention has been directed towards critiquing the somewhat vague conceptualization of habitus, limited focus has been placed on the equally problematic concepts of social space and field. Based on a re-reading of Distinction, the article argues that these concepts rely on ‘a small graphic lie’ very similar to a photograph. Like any other piece of art, as Bourdieu himself recognized, the graphic display is a politically and morally loaded representation technique. However, correspondence analysis does not necessarily serve the purpose he intended; in fact, it tends towards the very pitfalls he strove to overcome.

Keywords: data visualization, correspondence analysis, Bourdieu, field, visual representation

Procedia PDF Downloads 51
6843 Hydrofracturing for Low Temperature Waxy Reservoirs: Problems and Solutions

Authors: Megh Patel, Arjun Chauhan, Jay Thakkar

Abstract:

Hydrofracturing is the most prominent, but at the same time expensive, highly skilled and time-consuming, well stimulation technique. Due to the high cost and skilled labor involved, it is generally carried out as the ultimate solution among well stimulation techniques. Considering today’s global petroleum market, no gaffe or complications can be entertained during fracturing, as they would further hamper the current dwindling economy. This paper deals with the challenges encountered while fracturing low-temperature waxy reservoirs and the prominent solutions for overcoming such teething troubles. During fracturing treatment of shallow, high-freezing-point waxy oil reservoirs, the first-line problems to overcome are incomplete breakdown, incomplete cleanup of fracturing fluids, and cold damage to the formations caused by injecting cold fluid (fluid at ambient conditions). Injecting fracturing fluids at ambient conditions tends to decrease the near-wellbore reservoir temperature below the freezing point of the reservoir oil, leading to wax deposition around the wellbore and thereby hampering both fluid production and fracture propagation. To overcome such problems, solutions such as hot fracturing fluid injection, encapsulated heat-generating hydraulic fracturing fluid systems, and injection of wax inhibitors are discussed. The paper also throws light on the changes in rheological properties that occur when fracturing fluids are heated, and on solutions to deal with them while taking economic considerations into account.

Keywords: hydrofracturing, waxy reservoirs, low temperature, viscosity, crosslinkers

Procedia PDF Downloads 236
6842 Panel Application for Determining Impact of Real Exchange Rate and Security on Tourism Revenues: Countries with Middle and High Level Tourism Income

Authors: M. Koray Cetin, Mehmet Mert

Abstract:

The purpose of this study is to examine the impacts of the exchange rate and of a country's overall security level on tourism revenues. Numerous studies examine the bidirectional relationship between macroeconomic factors and tourism revenues or tourism demand; most support an impact of tourism revenues on the growth rate, but not vice versa, and few examine the impact of factors such as the real exchange rate or purchasing power parity on tourism revenues. In this context, the first aim is to examine the impact of the real exchange rate on tourism revenues, because the exchange rate is one of the main determinants of the price of international tourism services in the guest's currency. Another determinant of tourism demand for a country is its overall security level; this issue can be handled in the context of the relationship between tourism revenues and overall security, including turmoil, terrorism, border problems and political violence. In this study, these factors are examined for several countries whose tourism revenues exceed a certain level. With this structure the data form a panel, with at least two dimensions, one of them time, and they are evaluated with panel data analysis techniques applied to data gathered from the World Bank data web page. The study is expected to find impacts of the real exchange rate and security factors on tourism revenues for countries with noteworthy tourism revenues.
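A common panel technique for this kind of question is the fixed-effects (within) estimator, which removes each country's time-invariant level before regressing. The sketch below is only illustrative and uses synthetic data, since the paper's World Bank series and its exact panel specification are not reproduced in the abstract.

```python
import numpy as np

def fixed_effects(y, X, entity):
    """Within (entity-demeaned) estimator: subtract each country's own
    mean from y and X, removing time-invariant country effects, then
    run pooled OLS on the demeaned data."""
    yd, Xd = np.array(y, dtype=float), np.array(X, dtype=float)
    for e in np.unique(entity):
        idx = entity == e
        yd[idx] -= yd[idx].mean()
        Xd[idx] -= Xd[idx].mean(axis=0)
    beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
    return beta

# synthetic panel: revenue rises with the exchange rate, falls with insecurity
rng = np.random.default_rng(0)
n_countries, n_years = 10, 8
entity = np.repeat(np.arange(n_countries), n_years)
X = rng.normal(size=(n_countries * n_years, 2))           # [real exchange rate, insecurity]
alpha = np.repeat(rng.normal(size=n_countries), n_years)  # country fixed effects
y = alpha + 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=len(entity))
print(fixed_effects(y, X, entity))   # ≈ [0.8, -0.5]
```

Despite the country-specific intercepts, the within estimator recovers the two slope coefficients, which is why fixed-effects models are a standard starting point for country panels.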

Keywords: exchange rate, panel data analysis, security, tourism revenues

Procedia PDF Downloads 330
6841 Impact Location from Instrumented Mouthguard Kinematic Data in Rugby

Authors: Jazim Sohail, Filipe Teixeira-Dias

Abstract:

Mild traumatic brain injury (mTBI) within non-helmeted contact sports is a growing concern due to the serious risk of potential injury. Extensive research is being conducted looking into head kinematics in non-helmeted contact sports utilizing instrumented mouthguards that allow researchers to record accelerations and velocities of the head during and after an impact. This does not, however, allow the location of the impact on the head, and its magnitude and orientation, to be determined. This research proposes and validates two methods to quantify impact locations from instrumented mouthguard kinematic data, one using rigid body dynamics, the other utilizing machine learning. The rigid body dynamics technique focuses on establishing and matching moments from Euler’s and torque equations in order to find the impact location on the head. The methodology is validated with impact data collected from a lab test with the dummy head fitted with an instrumented mouthguard. Additionally, a Hybrid III Dummy head finite element model was utilized to create synthetic kinematic data sets for impacts from varying locations to validate the impact location algorithm. The algorithm calculates accurate impact locations; however, it will require preprocessing of live data, which is currently being done by cross-referencing data timestamps to video footage. The machine learning technique focuses on eliminating the preprocessing aspect by establishing trends within time-series signals from instrumented mouthguards to determine the impact location on the head. An unsupervised learning technique is used to cluster together impacts within similar regions from an entire time-series signal. The kinematic signals established from mouthguards are converted to the frequency domain before using a clustering algorithm to cluster together similar signals within a time series that may span the length of a game. Impacts are clustered within predetermined location bins. 
The same Hybrid III Dummy finite element model is used to create impacts that closely replicate on-field impacts in order to create synthetic time-series datasets consisting of impacts in varying locations. These time-series data sets are used to validate the machine learning technique. The rigid body dynamics technique provides a good method to establish accurate impact location of impact signals that have already been labeled as true impacts and filtered out of the entire time series. However, the machine learning technique provides a method that can be implemented with long time series signal data but will provide impact location within predetermined regions on the head. Additionally, the machine learning technique can be used to eliminate false impacts captured by sensors saving additional time for data scientists using instrumented mouthguard kinematic data as validating true impacts with video footage would not be required.
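The frequency-domain clustering step can be sketched as follows: FFT magnitude spectra as features, fed to a tiny 2-means clusterer. This is an illustrative stand-in; the paper's actual feature set, number of location bins, and clustering algorithm are not specified in the abstract, and the synthetic "impacts" below are invented.

```python
import numpy as np

def two_means(X, iters=20):
    """Tiny 2-means with deterministic farthest-point initialization."""
    c0 = X[0]
    c1 = X[np.argmax(((X - c0) ** 2).sum(axis=1))]
    centers = np.stack([c0, c1])
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=-1), axis=1)
        centers = np.stack([X[labels == j].mean(axis=0) for j in range(2)])
    return labels

def cluster_impacts(signals):
    """Group impact windows by their FFT magnitude spectra."""
    feats = np.abs(np.fft.rfft(signals, axis=1))
    feats /= feats.max(axis=1, keepdims=True)   # scale-invariant spectra
    return two_means(feats)

# synthetic impact windows: low-frequency vs high-frequency oscillation bursts
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 64)
low = [np.sin(2 * np.pi * 3 * t) + 0.05 * rng.normal(size=64) for _ in range(5)]
high = [np.sin(2 * np.pi * 12 * t) + 0.05 * rng.normal(size=64) for _ in range(5)]
labels = cluster_impacts(np.array(low + high))
print(labels)
```

Working on normalized spectra makes the grouping insensitive to impact magnitude, so windows cluster by signal shape, which is the property an unsupervised location-binning step relies on.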

Keywords: head impacts, impact location, instrumented mouthguard, machine learning, mTBI

Procedia PDF Downloads 202
6840 Synthesis and Characterisation of Bio-Based Acetals Derived from Eucalyptus Oil

Authors: Kirstin Burger, Paul Watts, Nicole Vorster

Abstract:

Green chemistry focuses on synthesis that has a low negative impact on the environment. This research focuses on synthesizing novel compounds from all-natural Eucalyptus citriodora oil. Eight novel plasticizer compounds are synthesized and optimized using flow chemistry technology; a precursor to one novel compound can be synthesized from the lauric acid present in coconut oil. Key parameters such as catalyst screening and loading, reaction time, temperature, and residence time under flow chemistry conditions are investigated. The compounds are characterised using GC-MS, FT-IR, 1H and 13C NMR techniques, and X-ray crystallography, and their efficiency is compared with that of two commercial plasticizers, dibutyl phthalate and Eastman 168. Several plasticized PVC film formulations are produced using the novel bio-based compounds, and tensile strength, stress at fracture and percentage elongation are tested. The effect of increasing the plasticizer percentage in the film formulations, over the range 3, 6, 9 and 12%, is investigated. The diastereoisomers of each compound are separated and formulated into PVC films, and differences in tensile strength are measured. Leaching tests, flexibility, and changes in glass transition temperature for the plasticized PVC films are recorded. A research objective is to use these novel compounds as a green bio-plasticizer alternative in plastic products for infants; the inhibitory effect of the compounds on six pathogens affecting infants is therefore studied, namely Escherichia coli, Staphylococcus aureus, Shigella sonnei, Pseudomonas putida, Salmonella choleraesuis and Klebsiella oxytoca.

Keywords: bio-based compounds, plasticizer, tensile strength, microbiological inhibition, synthesis

Procedia PDF Downloads 169
6839 Preprocessing and Fusion of Multiple Representations of Finger Vein Patterns Using Conventional and Machine Learning Techniques

Authors: Tomas Trainys, Algimantas Venckauskas

Abstract:

The application of biometric features to cryptography for human identification and authentication is a widely studied and promising area for the development of high-reliability cryptosystems. Biometric cryptosystems are typically designed for pattern recognition: biometric data is acquired from an individual, feature sets are extracted and compared against the set stored in the vault, and a result of the comparison is given. Preprocessing and fusion of biometric data are the most important phases in generating a feature vector for key generation or authentication. Fusion of biometric features is critical for achieving a higher level of security and helps prevent possible spoofing attacks. The paper focuses on the tasks of initial processing and fusion of multiple representations of finger vein modality patterns. These tasks are solved by applying conventional image preprocessing methods and machine learning techniques, using a Convolutional Neural Network (CNN) for image segmentation and feature extraction. The article presents a method for generating sets of biometric features from a finger vein network using several instances of the same modality. The extracted feature sets were fused at the feature level. The proposed method was tested and compared with the performance and accuracy results of other authors.
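The feature-level fusion across several instances of the same modality described above can be sketched minimally as follows. The feature vectors here are plain Python lists and the element-wise averaging strategy is an illustrative assumption; in practice the vectors would come from the CNN-based extractor.

```python
def fuse_feature_level(feature_sets):
    """Fuse N equal-length feature vectors by element-wise averaging."""
    if not feature_sets:
        raise ValueError("need at least one feature set")
    n = len(feature_sets)
    return [sum(column) / n for column in zip(*feature_sets)]

# Two hypothetical feature vectors extracted from two captures of the same finger:
fused = fuse_feature_level([[0.2, 0.8, 0.4], [0.4, 0.6, 0.4]])
```

Other fusion operators (concatenation, max-pooling) drop in at the same point; averaging is shown only because it keeps the fused vector the same length as its inputs.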

Keywords: bio-cryptography, biometrics, cryptographic key generation, data fusion, information security, SVM, pattern recognition, finger vein method

Procedia PDF Downloads 131
6838 Optimization of Structures with Mixed Integer Non-linear Programming (MINLP)

Authors: Stojan Kravanja, Andrej Ivanič, Tomaž Žula

Abstract:

This contribution focuses on structural optimization in civil engineering using mixed-integer non-linear programming (MINLP). MINLP is a versatile method that can handle both continuous and discrete optimization variables simultaneously. Continuous variables are used to optimize parameters such as dimensions, stresses, masses, or costs, while discrete variables represent binary decisions determining the presence or absence of structural elements within a structure, as well as choices of discrete materials and standard sections. The optimization process is divided into three main steps. First, a mechanical superstructure is generated with a variety of topology, material, and dimensional alternatives. Next, a MINLP model is formulated to encapsulate the optimization problem. Finally, an optimal solution is searched for in the direction of the defined objective function while respecting the structural constraints. The economic (material and labor costs) or mass objective function of a structure is subjected to the constraints known from structural analysis. These constraints include equations for the calculation of internal forces and deflections, as well as equations for the dimensioning of structural components (in accordance with the Eurocode standards). Given the complex, non-convex and highly non-linear nature of optimization problems in civil engineering, the Modified Outer-Approximation/Equality-Relaxation (OA/ER) algorithm is applied. This algorithm alternately solves subproblems of non-linear programming (NLP) and main problems of mixed-integer linear programming (MILP), and in this way gradually refines the solution space towards the optimal solution.
The NLP corresponds to the continuous optimization of parameters (with fixed topology, discrete materials and standard dimensions, all determined in the previous MILP), while the MILP involves a global approximation to the superstructure of alternatives, in which a new topology, materials, and standard dimensions are determined. For a convex problem, the optimization is stopped when the MILP solution becomes better than the best NLP solution; otherwise, it is terminated when the NLP solution can no longer be improved. While the OA/ER algorithm, like all other algorithms, does not guarantee global optimality in the presence of non-convex functions, various modifications, including convexity tests, are implemented in OA/ER to mitigate these difficulties. The effectiveness of the proposed MINLP approach is demonstrated by its application to various structural optimization tasks, such as mass optimization of steel buildings, cost optimization of timber halls, composite floor systems, etc. Special optimization models have been developed for the optimization of these structures. The MINLP optimizations, facilitated by the user-friendly software package MIPSYN, provide insights into mass- or cost-optimal solutions, optimal structural topologies, and optimal material and standard cross-section choices, confirming MINLP as a valuable method for the optimization of structures in civil engineering.
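The alternation between a discrete master problem and a continuous subproblem described above can be illustrated with a toy sketch. Here the MILP master is reduced to plain enumeration of a hypothetical catalogue of standard sections, and the "NLP" has a closed form; the load, stress limit, and cost function are all invented for illustration, not taken from the paper.

```python
CATALOGUE = [10.0, 20.0, 40.0, 80.0]   # hypothetical standard section areas
LOAD, STRESS_LIMIT = 500.0, 8.0        # applied load and allowable stress (made up)

def nlp_step(area):
    """Continuous step: smallest thickness t with LOAD / (area * t) <= limit."""
    t = LOAD / (area * STRESS_LIMIT)   # closed form for this toy constraint
    objective = area + 10.0 * t        # illustrative cost: section plus thickness
    return t, objective

def optimise():
    """Alternate discrete proposals with continuous sub-solves, keeping the best."""
    best = (None, None, float("inf"))  # (area, thickness, objective)
    for area in CATALOGUE:             # master problem, reduced to enumeration
        t, obj = nlp_step(area)
        if obj < best[2]:
            best = (area, t, obj)
    return best

area, t, obj = optimise()
```

In the real OA/ER algorithm the master MILP proposes new discrete alternatives from linearizations of the NLP solutions rather than enumerating a fixed list, but the divide between discrete proposal and continuous refinement is the same.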

Keywords: MINLP, mixed-integer non-linear programming, optimization, structures

Procedia PDF Downloads 28
6837 Design, Analysis and Obstacle Avoidance Control of an Electric Wheelchair with Sit-Sleep-Seat Elevation Functions

Authors: Waleed Ahmed, Huang Xiaohua, Wilayat Ali

Abstract:

Wheelchair users are generally exposed to physical and psychological health problems, e.g., pressure sores and pain in the hip joint, associated with seating posture or being inactive in a wheelchair for a long time. A reclining wheelchair with back, thigh, and leg adjustment helps in daily life activities and health preservation. The seat elevating function of an electric wheelchair allows a user (e.g., with lower limb amputation) to reach different heights. An electric wheelchair is expected to ease the lives of elderly and disabled people by giving them mobility support and decreasing the percentage of accidents caused by users' narrow sight or joystick operation errors. Thus, this paper proposes the design, analysis, and obstacle avoidance control of an electric wheelchair with sit-sleep-seat elevation functions. A 3D model of the wheelchair is designed in SolidWorks and later used for multi-body dynamics (MBD) analysis and to verify the driving control system. The control system uses a fuzzy algorithm to avoid obstacles, taking distance information from an ultrasonic sensor and the user-specified direction from the joystick operation. The proposed fuzzy driving control system focuses on the direction and velocity of the wheelchair. The wheelchair model has been examined and proven in MSC Adams (Automated Dynamic Analysis of Mechanical Systems). The designed fuzzy control algorithm is implemented on the Gazebo robotic 3D simulator using the Robot Operating System (ROS) middleware. The proposed wheelchair design enhances mobility and quality of life by improving the user's functional capabilities. Simulation results verify the non-accidental behavior of the electric wheelchair.
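The distance-to-velocity part of a fuzzy controller like the one described above can be sketched as follows. The membership breakpoint (60 cm) and the maximum speed are assumed values, not figures from the paper, and a real implementation would also fuzzify the joystick direction.

```python
V_MAX = 1.2  # m/s, assumed top speed of the wheelchair

def membership_near(d_cm):
    """NEAR membership: 1 when very close, falling linearly to 0 at 60 cm."""
    return max(0.0, min(1.0, (60.0 - d_cm) / 60.0))

def fuzzy_velocity(d_cm):
    """Weighted (centroid-style) defuzzification: NEAR -> stop, FAR -> V_MAX."""
    near = membership_near(d_cm)
    far = 1.0 - near
    return near * 0.0 + far * V_MAX

v = fuzzy_velocity(30.0)   # obstacle at 30 cm: half NEAR, half FAR
```

With the sensor reporting 30 cm, both memberships fire at 0.5 and the wheelchair slows to half speed; at contact distance it stops, and beyond 60 cm it drives at full speed.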

Keywords: fuzzy logic control, joystick, multi body dynamics, obstacle avoidance, scissor mechanism, sensor

Procedia PDF Downloads 119
6836 A Benchmark System for Testing Medium Voltage Direct Current Circuit Breaker (MVDC CB) Robustness Utilizing Real Time Digital Simulation and Hardware-in-the-Loop Theory

Authors: Ali Kadivar, Kaveh Niayesh

Abstract:

The integration of green energy resources is a major focus, and the role of Medium Voltage Direct Current (MVDC) systems is expanding rapidly. However, protecting MVDC systems against DC faults is a challenge with consequences for reliable and safe grid operation. This challenge reveals the need for MVDC circuit breakers (MVDC CBs), which are still in the infancy of their development; consequently, there is a lack of MVDC CB standards, including thresholds for acceptable power losses and operating speed. To establish a baseline for comparison purposes, a benchmark system for testing future MVDC CBs is vital. The literature typically gives only the timing sequence of each switch and emphasizes the topology, without an in-depth study of the DCCB control algorithm, as circuit breaker control systems are not yet systematic. A digital testing benchmark is designed for proof-of-concept simulation studies using software models. It can validate studies based on real-time digital simulators and Transient Network Analyzer (TNA) models. The proposed experimental setup acquires data from accurate sensors installed on the tested MVDC CB through the general-purpose inputs/outputs (GPIO) of a microcontroller and a PC. Prototype studies in laboratory-based models are achieved utilizing Hardware-in-the-Loop (HIL) equipment connected to real-time digital simulators. The improved circuit breaker control algorithm can reduce the peak fault current and avoid arc reignition, helping the coordination of DCCBs in relay protection. Moreover, several research gaps are identified regarding case studies and evaluation approaches.

Keywords: DC circuit breaker, hardware-in-the-loop, real time digital simulation, testing benchmark

Procedia PDF Downloads 62
6835 Python Implementation for S1000D Applicability Depended Processing Model - SALERNO

Authors: Theresia El Khoury, Georges Badr, Amir Hajjam El Hassani, Stéphane N’Guyen Van Ky

Abstract:

The widespread adoption of machine learning and artificial intelligence across different domains can be attributed to the digitization of data over several decades, resulting in vast amounts of data, types, and structures. Thus, data processing and preparation turn out to be a crucial stage. However, applying these techniques to S1000D standard-based data poses a challenge due to its complexity and the need to preserve logical information. This paper describes SALERNO, an S1000D AppLicability dEpended pRocessiNg mOdel. This Python-based model analyzes and converts XML S1000D-based files into an easier data format that can be used by machine learning techniques while preserving the logic and relationships in the files. The model parses the files in a given folder, filters them, and extracts the required information to be saved in appropriate data frames and Excel sheets. Its main idea is to group the extracted information by applicability. In addition, it extracts the full text by replacing internal and external references while maintaining the relationships between files, as well as the necessary requirements. The resulting files can then be saved in databases and used in different models. Documents in both English and French were tested, and special characters were decoded. Updates to the technical manuals were taken into consideration as well. The model was tested on different versions of S1000D, and the results demonstrated its ability to effectively handle the applicability, requirements, references, and relationships across all files and at different levels.
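The grouping-by-applicability idea at the core of the model can be sketched as follows. The XML snippet and the `para`/`applic` tag and attribute names are simplified stand-ins, not the real S1000D schema, which encodes applicability far more elaborately.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical, heavily simplified stand-in for an S1000D data module:
SAMPLE = """<dm>
  <para applic="MODEL-A">Check hydraulic pressure.</para>
  <para applic="MODEL-B">Check electric pump.</para>
  <para applic="MODEL-A">Torque the fitting.</para>
</dm>"""

def group_by_applicability(xml_text):
    """Parse the file and bucket text content by its applicability value."""
    groups = defaultdict(list)
    for elem in ET.fromstring(xml_text).iter("para"):
        groups[elem.get("applic")].append(elem.text)
    return dict(groups)

groups = group_by_applicability(SAMPLE)
```

Each bucket would then be flattened into a data frame or Excel sheet, with reference elements resolved to their full text before export.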

Keywords: aeronautics, big data, data processing, machine learning, S1000D

Procedia PDF Downloads 119
6834 A Trends Analysis of Yacht Simulator

Authors: Jae-Neung Lee, Keun-Chang Kwak

Abstract:

This paper describes an analysis of international trends in yacht simulators and also provides background on yachts. Techniques employed in yacht simulators include image processing for counting the total number of vehicles, edge/target detection, detection and evasion algorithms, image processing using SIFT (scale-invariant feature transform) matching, and the application of median filtering and thresholding.

Keywords: yacht simulator, simulator, trends analysis, SIFT

Procedia PDF Downloads 417
6833 A Supervised Learning Data Mining Approach for Object Recognition and Classification in High Resolution Satellite Data

Authors: Mais Nijim, Rama Devi Chennuboyina, Waseem Al Aqqad

Abstract:

Advances in the spatial and spectral resolution of satellite images have led to tremendous growth in large image databases. The data acquired through satellites, radars, and sensors consist of important geographical information that can be used for remote sensing applications such as region planning and disaster management. Spatial data classification and object recognition are important tasks for many applications. However, classifying objects and identifying them manually from images is a difficult task. Object recognition is often considered a classification problem; this task can be performed using machine-learning techniques. Although many machine-learning algorithms exist, the classification here is done using supervised classifiers such as Support Vector Machines (SVM), as the area of interest is known. We propose a classification method which considers neighboring pixels in a region for feature extraction and which evaluates classifications precisely according to neighboring classes for semantic interpretation of the region of interest (ROI). A dataset was created for training and testing purposes; the attributes were generated from pixel intensity values and mean reflectance values. We demonstrate the benefits of using knowledge discovery and data-mining techniques, which can be applied to image data for accurate information extraction and classification from high spatial resolution remote sensing imagery.
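The neighborhood-based feature extraction described above can be sketched minimally: for each pixel, take the centre intensity plus the mean of its 3x3 neighbourhood. The window size and the two-element feature vector are illustrative assumptions; the resulting features would then feed a supervised classifier such as an SVM.

```python
def neighborhood_features(img, r, c):
    """Return [centre intensity, 3x3 neighbourhood mean] for pixel (r, c)."""
    rows, cols = len(img), len(img[0])
    vals = [img[i][j]
            for i in range(max(0, r - 1), min(rows, r + 2))
            for j in range(max(0, c - 1), min(cols, c + 2))]
    return [img[r][c], sum(vals) / len(vals)]

# Tiny synthetic intensity grid: a single bright pixel on a dark background.
IMG = [[0, 0, 0],
       [0, 9, 0],
       [0, 0, 0]]
feat = neighborhood_features(IMG, 1, 1)
```

An isolated bright pixel yields a high centre value but a low neighbourhood mean, which is exactly the kind of context the classifier can exploit to separate noise from coherent regions such as waterbodies.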

Keywords: remote sensing, object recognition, classification, data mining, waterbody identification, feature extraction

Procedia PDF Downloads 323
6832 Helicopter Exhaust Gases Cooler in Terms of Computational Fluid Dynamics (CFD) Analysis

Authors: Mateusz Paszko, Ksenia Siadkowska

Abstract:

Due to their low-altitude and relatively low-speed flight, helicopters are easy targets for modern combat assets, e.g., infrared-guided missiles. Current techniques aim to increase the combat effectiveness of military helicopters. Protection of the helicopter in flight from early detection, tracking, and finally destruction can be realized in many ways. One of them is cooling the hot exhaust gases emitted from the engines to the atmosphere in special heat exchangers. Nowadays, this process is realized in ejective coolers, where strong heat and momentum exchange takes place between hot exhaust gases and cold air ejected from the atmosphere. The flow of air, of exhaust gases, and of the mixture of the two, together with the heat transfer between cold air and hot exhaust gases, is governed by differential equations of mass transport (flow continuity), ejection of cold air by the expanding exhaust gases, conservation of momentum and energy, and physical relationship equations. Calculating these processes in an ejective cooler by means of classical mathematical analysis is extremely hard or even impossible. Because of this, it is necessary to apply a numerical approach with modern numerical computer programs. The paper discusses the general usability of Computational Fluid Dynamics (CFD) in the process of designing an ejective exhaust gas cooler cooperating with a helicopter turbine engine. In this work, CFD calculations have been performed for an ejective-based cooler cooperating with the PA W3 helicopter's engines.

Keywords: aviation, CFD analysis, ejective-cooler, helicopter techniques

Procedia PDF Downloads 312
6831 A Framework of Dynamic Rule Selection Method for Dynamic Flexible Job Shop Problem by Reinforcement Learning Method

Authors: Rui Wu

Abstract:

In the volatile modern manufacturing environment, new orders occur randomly at any time, and pre-emptive methods are infeasible. This calls for a real-time scheduling method that can produce a reasonably good schedule quickly. The dynamic Flexible Job Shop problem is an NP-hard scheduling problem that hybridizes the dynamic Job Shop problem with the Parallel Machine problem. A Flexible Job Shop contains different work centres, each containing parallel machines that can process certain operations. Many algorithms, such as genetic algorithms or simulated annealing, have been proposed to solve static Flexible Job Shop problems. However, the time efficiency of these methods is low, and they are not feasible for a dynamic scheduling problem. Therefore, a dynamic rule-selection scheduling system based on the reinforcement learning method is proposed in this research, in which the dynamic Flexible Job Shop problem is divided into several parallel machine problems to decrease its complexity. Firstly, features of jobs, machines, work centres, and flexible job shops are selected to describe the status of the dynamic Flexible Job Shop problem at each decision point in each work centre. Secondly, a reinforcement learning framework using a double-layer deep Q-learning network is applied to select proper composite dispatching rules based on the status of each work centre. Then, based on the selected composite dispatching rule, an available operation is selected from the waiting buffer and assigned to an available machine in each work centre. Finally, the proposed algorithm is compared with well-known dispatching rules on the objectives of mean tardiness, mean flow time, mean waiting time, and mean percentage of waiting time in the real-time Flexible Job Shop problem. The simulation results show that the proposed framework has reasonable performance and time efficiency.
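The select-a-rule-then-dispatch step described above can be sketched in heavily simplified form: a lookup table of Q-values stands in for the double-layer deep Q-network, and two classic dispatching rules stand in for the composite rules. All names, Q-values, and job data below are illustrative.

```python
import random

RULES = {
    "SPT": lambda jobs: min(jobs, key=lambda j: j["proc_time"]),  # shortest processing time
    "EDD": lambda jobs: min(jobs, key=lambda j: j["due_date"]),   # earliest due date
}

def select_rule(q_values, epsilon, rng):
    """Epsilon-greedy choice over dispatching rules for one work centre."""
    if rng.random() < epsilon:
        return rng.choice(sorted(q_values))   # explore
    return max(q_values, key=q_values.get)    # exploit the learned values

rng = random.Random(0)
buffer = [{"id": 1, "proc_time": 5, "due_date": 8},
          {"id": 2, "proc_time": 2, "due_date": 20}]

# Hypothetical Q-values for the current work-centre state; epsilon 0 -> greedy.
rule = select_rule({"SPT": 0.9, "EDD": 0.4}, epsilon=0.0, rng=rng)
job = RULES[rule](buffer)
```

In the full framework, the Q-values come from the network evaluated on the work centre's state features, and the reward signal after each dispatch updates the network toward lower tardiness or flow time.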

Keywords: dynamic scheduling problem, flexible job shop, dispatching rules, deep reinforcement learning

Procedia PDF Downloads 88
6830 Automatic and Highly Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and predict the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, hence imprecise, and moreover too slow to be computed efficiently. Such models may therefore not be applicable for the numerical optimization of real systems, since these techniques require numerous evaluations of the model. Moreover, not all quantities necessary for the identification may be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general since it generates models for any system, detached from the scientific background. Additionally, this approach can be used in a broader sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations among products of variables and not only among single variables. This enables a far more precise representation of causal correlations. The basis and explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. In this way, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has successfully been tested in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with an error of less than one percent. Moreover, the automatic identification of correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise, real-time-adaptive, data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), many complex systems can now be represented by a high-precision model and optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
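The product-of-variables idea described above can be illustrated minimally: expand each sample with cross products of its variables, then fit the product coefficient by least squares. The data and the hidden law y = 3 * x1 * x2 are made up; a regression on single variables alone could not recover this relationship.

```python
def expand(x1, x2):
    """Series-expansion-style feature vector including the product x1 * x2."""
    return [1.0, x1, x2, x1 * x2]

# Synthetic sensor data obeying a hidden multiplicative law:
samples = [(1.0, 2.0), (2.0, 3.0), (3.0, 1.0), (2.0, 2.0)]
targets = [3.0 * a * b for a, b in samples]

# One-coefficient least squares on the product feature alone:
prod = [expand(a, b)[3] for a, b in samples]
beta = sum(p * y for p, y in zip(prod, targets)) / sum(p * p for p in prod)
```

A full implementation would solve the least-squares system over all expansion terms at once and re-fit online as new sensor data arrives, which is what enables the real-time adaptation mentioned above.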

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 386
6829 A Convolutional Neural Network-Based Model for Lassa fever Virus Prediction Using Patient Blood Smear Image

Authors: A. M. John-Otumu, M. M. Rahman, M. C. Onuoha, E. P. Ojonugwa

Abstract:

A Convolutional Neural Network (CNN) model for predicting Lassa fever was built using the Python 3.8.0 programming language, alongside the Keras 2.2.4 and TensorFlow 2.6.1 libraries as the development environment, in order to reduce the currently high risk of Lassa fever in West Africa, particularly in Nigeria. The study was prompted by major flaws in existing conventional laboratory equipment for diagnosing Lassa fever (RT-PCR), as well as flaws reported in the literature for AI-based techniques used for probing and prognosis of Lassa fever. A total of 15,679 blood smear microscopic images were collected. The proposed model was trained on 70% of the dataset and tested on 30% of the microscopic images to avoid overfitting. A 3x3x3 convolution filter was used in the proposed system to extract features from the microscopic images. The proposed CNN-based model achieved a recall of 96%, a precision of 93%, an F1 score of 95%, and an accuracy of 94% in predicting and accurately classifying the images into clean or infected samples. Based on the results and the literature consulted, the proposed model outperformed the other existing AI-based techniques evaluated. If properly deployed, the model will assist physicians, medical laboratory scientists, and patients in making accurate diagnoses of Lassa fever cases, allowing the mortality rate due to the Lassa fever virus to be reduced through sound decision-making.
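The feature-extraction step at the heart of such a CNN can be illustrated with a pure-Python 2-D convolution (valid padding, single channel). The kernel values and the tiny input grid below are illustrative; the actual model stacks many such learned filters in Keras/TensorFlow.

```python
def conv2d_valid(img, kern):
    """2-D convolution (cross-correlation) over img with valid padding."""
    kh, kw = len(kern), len(kern[0])
    out_h = len(img) - kh + 1
    out_w = len(img[0]) - kw + 1
    return [[sum(img[r + i][c + j] * kern[i][j]
                 for i in range(kh) for j in range(kw))
             for c in range(out_w)]
            for r in range(out_h)]

# Toy 3x4 intensity patch and an identity-like 3x3 kernel:
IMG = [[1, 0, 0, 0],
       [0, 1, 0, 0],
       [0, 0, 1, 0]]
KERNEL = [[0, 0, 0],
          [0, 1, 0],
          [0, 0, 0]]
OUT = conv2d_valid(IMG, KERNEL)
```

Sliding a 3x3 window over a 3x4 input yields a 1x2 feature map; in the trained network, learned kernels respond to parasite-like textures rather than passing the centre pixel through.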

Keywords: artificial intelligence, ANN, blood smear, CNN, deep learning, Lassa fever

Procedia PDF Downloads 97
6828 Effect of Highway Construction on Soil Properties and Soil Organic Carbon (SOC) Along Lagos-Badagry Expressway, Lagos, Nigeria

Authors: Fatai Olakunle Ogundele

Abstract:

Road construction is increasingly common as human development expands and people increasingly rely on cars for daily transportation. The construction of a large network of roads has dramatically altered the landscape and impacted well-being in a number of deleterious ways. In addition, roads can shift population demographics and be a source of pollution in the environment. Road construction activities normally alter the soil's physical properties, through compaction on the road itself and on adjacent areas, as well as its chemical and biological properties, among other effects. Understanding roadside soil properties influenced by road construction activities can serve as a basis for formulating conservation-based management strategies. Therefore, this study examined the effects of road construction on soil properties and soil organic carbon along the Lagos-Badagry Expressway, Lagos, Nigeria. The study adopted purposive sampling techniques, and 40 soil samples were collected at a depth of 0–30 cm from each of the identified road intersections and infrastructures using a soil auger. The soil samples were taken to the laboratory for analysis of soil properties and carbon stock using standard methods. Both descriptive and inferential statistical techniques were applied to analyze the data obtained. The results revealed that soil compaction inhibits ecological succession on roadsides, in that increased compaction suppresses plant growth as well as causes changes in soil quality.

Keywords: highway, soil properties, organic carbon, road construction, land degradation

Procedia PDF Downloads 58
6827 A Flute Tracking System for Monitoring the Wear of Cutting Tools in Milling Operations

Authors: Hatim Laalej, Salvador Sumohano-Verdeja, Thomas McLeay

Abstract:

Monitoring of tool wear in milling operations is essential for achieving the desired dimensional accuracy and surface finish of a machined workpiece. Although there are numerous statistical models and artificial intelligence techniques available for monitoring the wear of cutting tools, these techniques cannot pinpoint which cutting edge of the tool, or which insert in the case of indexable tooling, is worn or broken. Currently, the task of monitoring wear on the tool cutting edges is carried out by the operator, who performs a manual inspection, causing undesirable stoppages of machine tools and consequently costs from lost productivity. The present study is concerned with the development of a flute tracking system to segment signals related to each physical flute of a three-flute cutter used in an end milling operation. The purpose of the system is to monitor the cutting condition of each flute separately in order to determine its progressive wear rate and to predict imminent tool failure. The results of this study clearly show that signals associated with each flute can be effectively segmented using the proposed flute tracking system. Furthermore, the results illustrate that by segmenting the sensor signal by flute it is possible to investigate the wear of each physical cutting edge of the cutting tool. These findings are significant in that they facilitate the online condition monitoring of a cutting tool for each specific flute without the need for operators or engineers to perform manual inspections of the tool.
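The core segmentation idea can be sketched as follows: with the spindle speed and sampling rate known, each sample of the sensor signal maps to an angular sector of the revolution, and each third of a revolution belongs to one of the three flutes. The numbers used are illustrative, and a real system would also track the spindle phase rather than assume it starts at flute zero.

```python
def flute_of_sample(i, samples_per_rev, n_flutes=3):
    """Map sample index i to a flute: which n_flutes-th of the revolution it falls in."""
    return (i % samples_per_rev) * n_flutes // samples_per_rev

# A 60 rpm spindle sampled at 9 Hz gives samples_per_rev = 9 * 60 / 60 = 9,
# i.e. three consecutive samples per flute engagement.
labels = [flute_of_sample(i, samples_per_rev=9) for i in range(9)]
```

Grouping the signal samples by these labels yields one sub-signal per physical cutting edge, on which per-flute wear indicators can then be computed.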

Keywords: machining, milling operation, tool condition monitoring, tool wear prediction

Procedia PDF Downloads 291
6826 A Machine Learning Based Framework for Education Levelling in Multicultural Countries: UAE as a Case Study

Authors: Shatha Ghareeb, Rawaa Al-Jumeily, Thar Baker

Abstract:

In Abu Dhabi there are many different education curriculums, and the private schools and quality assurance sector supervises many private schools serving many nationalities. As there are many different education curriculums in Abu Dhabi to meet expats' needs, there are different requirements for registration and success, and different starting ages for education in each curriculum. In fact, each curriculum has a different number of years, assessment techniques, reassessment rules, and exam boards. Currently, students who transfer between curriculums are not being placed in the right year group, because the start and end dates of the academic year and the date-of-birth cut-off for each year group differ between curriculums; as a result, students end up either younger or older than their year group, which creates gaps in their learning and performance. In addition, there is no way of storing student data throughout their academic journey so that schools can track the student's learning process. In this paper, we propose to develop a computational framework applicable in multicultural countries such as the UAE, in which multiple education systems are implemented. The ultimate goal is to use cloud and fog computing technology, integrated with Artificial Intelligence techniques of Machine Learning, to aid in a smooth transition when assigning students to their year groups, and to provide levelling and differentiation information for students who relocate from one education curriculum to another, whilst also having the ability to store and access student data from anywhere throughout their academic journey.
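The placement logic the framework would automate can be sketched deterministically: derive the year group from the date of birth against each curriculum's own cut-off date and entry age. The two curriculum entries below (cut-off month/day and entry age) are invented examples, not official rules.

```python
from datetime import date

# Hypothetical curriculum rules: (cut-off month, day) and age at entry to year 1.
CURRICULA = {
    "british": {"cutoff": (9, 1), "entry_age": 5},   # year 1 at age 5 on 1 Sep
    "american": {"cutoff": (8, 1), "entry_age": 6},  # grade 1 at age 6 on 1 Aug
}

def year_group(dob, curriculum, school_year_start):
    """Age on the curriculum's cut-off date, minus entry age, plus one."""
    m, d = CURRICULA[curriculum]["cutoff"]
    cutoff = date(school_year_start, m, d)
    age = cutoff.year - dob.year - ((cutoff.month, cutoff.day) < (dob.month, dob.day))
    return age - CURRICULA[curriculum]["entry_age"] + 1

yg = year_group(date(2015, 3, 10), "british", school_year_start=2023)
```

The same child lands in different year groups under different curriculums, which is precisely the discrepancy the proposed machine-learning levelling layer is meant to detect and smooth over when a student transfers.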

Keywords: admissions, algorithms, cloud computing, differentiation, fog computing, levelling, machine learning

Procedia PDF Downloads 127