Search results for: expanded invasive weed optimization algorithm (exIWO)
1863 Solving Dimensionality Problem and Finding Statistical Constructs on Latent Regression Models: A Novel Methodology with Real Data Application
Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro
Abstract:
This paper presents a novel statistical methodology for measuring and finding constructs in Latent Regression Analysis. The approach uses the qualities of Factor Analysis on binary data with interpretations from Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory and on a convergence of many ideas from IRT, we propose an algorithm not just to solve the dimensionality problem (nowadays an open discussion) but to open a new research field that promises fairer and more realistic qualifications for examinees and a revolution in IRT and educational research. In the end, the methodology is applied to a real data set, presenting impressive results for coherence, speed and precision. Acknowledgments: This research was financed by Colciencias through the project: 'Multidimensional Item Response Theory Models for Practical Application in Large Test Designed to Measure Multiple Constructs' and both authors belong to the SICS Research Group of Universidad Nacional de Colombia. Keywords: item response theory, dimensionality, submodel theory, factorial analysis
Procedia PDF Downloads 374
1862 Optimizing Scribe Resourcing to Improve Hospitalist Workloads
Authors: Ahmed Hamzi, Bryan Norman
Abstract:
Having scribes help document patient records in electronic health record systems can improve hospitalists’ productivity, but hospitals need to determine the optimum number of scribes to hire to maximize scribe cost-effectiveness. Scribe attendance uncertainty due to planned and unplanned absences is a primary challenge. This paper presents simulation and analytical models to determine the optimum number of scribes for a hospital to hire. Because scribe staffing practices vary from one context to another, different staffing scenarios are considered: scenarios in which extra attending scribes do or do not provide additional value, and scenarios that utilize on-call scribes to fill in for potentially absent scribes. These staffing scenarios are assessed for scribe revenue ratios (the ratio of the value of a scribe relative to scribe costs) ranging from 100% to 300%. The optimum solution depends on the absenteeism rate, the revenue ratio, and the desired service level. The analytical model obtains solutions more easily and faster than the simulation model, but the simulation model is more accurate. Therefore, the analytical model’s solutions are compared with the simulation model’s solutions regarding both the number of scribes hired and cost-effectiveness. Additionally, an Excel tool has been developed to help decision-makers easily obtain solutions using the analytical model. Keywords: hospitalists, workload, optimization cost, economic analysis
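The attendance-uncertainty trade-off described in the abstract can be sketched with a simple binomial expected-value calculation. This is a hedged illustration only, not the authors' simulation or analytical model; the function names and the cap on useful scribes (`demand`) are assumptions introduced here:

```python
from math import comb

def expected_net_value(n_hired, demand, p_absent, revenue_per_scribe, cost_per_scribe):
    """Expected daily net value when each hired scribe independently attends
    with probability 1 - p_absent and at most `demand` attending scribes add
    value (extra attendees contribute nothing in this scenario)."""
    p_attend = 1.0 - p_absent
    value = 0.0
    for k in range(n_hired + 1):  # k scribes actually present
        prob_k = comb(n_hired, k) * p_attend**k * p_absent**(n_hired - k)
        value += prob_k * min(k, demand) * revenue_per_scribe
    return value - n_hired * cost_per_scribe

def optimal_hires(max_hires, demand, p_absent, revenue, cost):
    """Exhaustively search hire counts for the best expected net value."""
    return max(range(max_hires + 1),
               key=lambda n: expected_net_value(n, demand, p_absent, revenue, cost))
```

Sweeping `revenue` from 1.0x to 3.0x of `cost` mimics the revenue-ratio sensitivity analysis the abstract describes.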
Procedia PDF Downloads 49
1861 Optimization and Evaluation of the Oil Extraction Process Using Supercritical CO2 and Co-solvents from Spent Coffee Ground
Authors: Sergio Clemente, Carla Bartolomé, Miriam Lorenzo, Sergio Valverde
Abstract:
The generation of urban waste is a consequence of human activity, and the organic fraction is one of the major components of municipal waste. The development of new materials and energy recovery technologies is becoming a thriving topic throughout Europe. ITENE is working to increase the circularity of coffee grounds from West Macedonia. Although these residues have a high content of carbohydrates, fatty acids and polyphenols, they are usually valorized energetically or discarded, losing all these compounds of interest. ITENE is studying the extraction of oils from spent coffee grounds using supercritical CO2, as it is a more sustainable method and does not destroy the most valuable compounds. In the HOOP project, the extraction process is optimized to maximize oil production, and the possibility of using co-solvents together with supercritical CO2 is studied. The production of fatty acids by scCO2 extraction is optimized and then compared with other conventional extraction methods such as hexane extraction and the Folch method. The conditions for scCO2 were temperatures of 313.15 K, 323.15 K and 333.15 K, pressures from 150 bar to 200 bar, and extraction times between 1 and 3 h. In addition, a complete characterization of the resulting lipid fraction is performed to evaluate its fatty acid content and profile, as well as its antioxidant properties, lipid oxidation, total phenol content and moisture. Keywords: supercritical CO2, coffee, valorization, extraction
Procedia PDF Downloads 10
1860 Representativity Based Wasserstein Active Regression
Authors: Benjamin Bobbia, Matthias Picard
Abstract:
In recent years, active learning methodologies based on the representativity of the data have seemed more promising for limiting overfitting. We present a query methodology for regression using the Wasserstein distance, which measures the representativity of our labelled dataset compared to the global distribution. In this work, crucial use is made of GroupSort neural networks, which yields a double advantage: the Wasserstein distance can be expressed exactly in terms of such networks, and explicit bounds for their size and depth can be provided, together with rates of convergence. Moreover, the heterogeneity of the dataset is taken into account by weighting the Wasserstein distance with the approximation error from the previous step of active learning. Such an approach leads to a reduction of overfitting and high prediction performance after a few query steps. After detailing the methodology and algorithm, an empirical study is presented in order to investigate the range of our hyperparameters. The performance of this method is compared, in terms of the number of queries needed, with other classical and recent query methods on several UCI datasets. Keywords: active learning, Lipschitz regularization, neural networks, optimal transport, regression
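For intuition, the 1-D empirical Wasserstein distance underlying the representativity measure can be computed directly. This sketch sidesteps the GroupSort neural-network machinery of the paper, and `query_score` is a simplified, assumed reading of the error-weighting idea:

```python
def wasserstein_1d(sample_a, sample_b):
    """1-Wasserstein distance between two equal-size 1-D empirical
    distributions: the mean absolute difference of the sorted samples."""
    assert len(sample_a) == len(sample_b)
    a, b = sorted(sample_a), sorted(sample_b)
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def query_score(labelled, pool_sample, prev_error):
    """Sketch of the weighting idea: representativity (distance between the
    labelled data and the pool) scaled by the model error from the previous
    active-learning step."""
    return prev_error * wasserstein_1d(labelled, pool_sample)
```

In the paper the distance is handled in higher dimensions via GroupSort networks; the closed form above holds only for the 1-D case.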
Procedia PDF Downloads 83
1859 The Effect of Damping Treatment for Noise Control on Offshore Platforms Using Statistical Energy Analysis
Authors: Ji Xi, Cheng Song Chin, Ehsan Mesbahi
Abstract:
Structure-borne noise is an important aspect of the offshore platform sound field. It can be generated directly by mechanical forces from vibrating machinery, indirectly by excitation of the structure, or by excitation from incident airborne noise. Therefore, limiting the transmission of vibration energy throughout the offshore platform is the key to controlling structure-borne noise, which is usually done by applying damping treatment to the steel structures. Two types of damping treatment used on board are presented. By conducting a statistical energy analysis (SEA) simulation on a jack-up rig, the noise levels in the source room, the neighboring rooms, and the remote living quarter cabins are compared before and after the damping treatments are applied. The results demonstrate that, in the rooms neighboring the source and in the living quarter area, there is a significant noise reduction with the damping treatment applied, whereas in the source room, where airborne sound predominates over structure-borne sound, the impact is not obvious. The subsequent optimization design of damping treatment on the offshore platform can then be made, enabling acoustic professionals to implement noise control during the design stage for offshore crews’ hearing protection and habitability comfort. Keywords: statistical energy analysis, damping treatment, noise control, offshore platform
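The before/after room-level comparisons rest on the energetic summation of incoherent noise sources, a standard acoustics identity (not code from the study), which can be illustrated as:

```python
from math import log10

def combine_levels(levels_db):
    """Total sound pressure level of incoherent sources, in dB, obtained by
    summing the levels on an energy (10^(L/10)) basis."""
    return 10 * log10(sum(10 ** (level / 10) for level in levels_db))

def insertion_loss(before_db, after_db):
    """Noise reduction achieved by a damping treatment, in dB."""
    return before_db - after_db
```

For example, two equal 60 dB sources combine to about 63 dB, which is why reducing a dominant structure-borne path matters more than reducing a minor one.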
Procedia PDF Downloads 557
1858 Monitoring the Effect of Doxorubicin Liposomal in VX2 Tumor Using Magnetic Resonance Imaging
Authors: Ren-Jy Ben, Jo-Chi Jao, Chiu-Ya Liao, Ya-Ru Tsai, Lain-Chyr Hwang, Po-Chou Chen
Abstract:
Cancer is still one of the serious diseases threatening human lives. How to achieve early diagnosis and effective treatment of tumors is a very important issue. Animal carcinoma models provide a simulation tool for the study of pathogenesis, biological characteristics and therapeutic effects. Recently, drug delivery systems have been rapidly developed to improve therapeutic effects. Liposomes play an increasingly important role in clinical diagnosis and therapy by delivering a pharmaceutical or contrast agent to targeted sites. Liposomes can be absorbed and excreted by the human body and are well known to cause it no harm. This study aimed to compare the therapeutic effects of encapsulated (doxorubicin liposomal, LipoDox) and un-encapsulated (doxorubicin, Dox) anti-tumor drugs using Magnetic Resonance Imaging (MRI). Twenty-four New Zealand rabbits implanted with VX2 carcinoma in the left thigh were classified into three groups: a control group (untreated), a Dox-treated group and a LipoDox-treated group, with 8 rabbits in each group. MRI scans were performed three days after tumor implantation. A 1.5 T GE Signa HDxt whole-body MRI scanner with a high-resolution knee coil was used in this study. After a 3-plane localizer scan was performed, Three-Dimensional (3D) Fast Spin Echo (FSE) T2-Weighted Imaging (T2WI) was used for tumor volumetric quantification, and Two-Dimensional (2D) Spoiled Gradient Recalled Echo (SPGR) Dynamic Contrast-Enhanced (DCE) MRI was used for tumor perfusion evaluation. DCE-MRI was designed to acquire four baseline images, followed by injection of the contrast agent Gd-DOTA through the ear vein of the rabbits. Afterwards, a series of 32 images was acquired to observe the signal change over time in the tumor and muscle. The MRI scanning was scheduled on a weekly basis for a period of four weeks to observe the tumor progression longitudinally.
The Dox and LipoDox treatments were prescribed 3 times in the first week immediately after VX2 tumor implantation. ImageJ was used to quantify the tumor volume and the time-course signal enhancement on the DCE images. The changes in tumor size showed that the growth of VX2 tumors was effectively inhibited in both the LipoDox-treated and Dox-treated groups. Furthermore, the tumor volume of the LipoDox-treated group was significantly lower than that of the Dox-treated group, which implies that LipoDox has a better therapeutic effect than Dox. The signal intensity of the LipoDox-treated group is significantly lower than that of the other two groups, which implies that the targeted therapeutic drug remained in the tumor tissue. This study provides a radiation-free and non-invasive MRI method for therapeutic monitoring of targeted liposomes in an animal tumor model. Keywords: doxorubicin, dynamic contrast-enhanced MRI, lipodox, magnetic resonance imaging, VX2 tumor model
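The DCE time-course analysis described above (four baseline images, then signal change over time) is commonly summarized as percent signal enhancement over the pre-contrast baseline. This is a generic sketch of that calculation, not the authors' ImageJ workflow:

```python
def percent_enhancement(signal, n_baseline=4):
    """Relative enhancement of a DCE-MRI time course over the mean of the
    pre-contrast baseline images, in percent."""
    baseline = sum(signal[:n_baseline]) / n_baseline
    return [100.0 * (s - baseline) / baseline for s in signal]
```

A lower post-contrast enhancement curve in a treated group, as reported here for LipoDox, would correspond to reduced tumor perfusion.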
Procedia PDF Downloads 460
1857 Silymarin Loaded Mesoporous Silica Nanoparticles: Preparation, Optimization, Pharmacodynamic and Oral Multi-Dose Safety Assessment
Authors: Sarah Nasr, Maha M. A. Nasra, Ossama Y. Abdallah
Abstract:
The present work aimed to prepare Silymarin-loaded MCM-41 type mesoporous silica nanoparticles (MSNs) and to assess the effect of the system’s solubility enhancement on the pharmacodynamic performance of Silymarin as a hepatoprotective agent. MSNs, prepared by a soft-templating technique, were loaded with Silymarin and characterized for particle size, zeta potential, surface properties, DSC and XRPD. DSC and specific surface area data confirmed deposition of Silymarin in an amorphous state in the MSNs’ pores. In-vitro drug dissolution testing displayed an enhanced dissolution rate of Silymarin upon loading on MSNs. High-dose Acetaminophen was then used to inflict hepatic injury upon albino male Wistar rats simultaneously receiving either free Silymarin, Silymarin-loaded MSNs or blank MSNs. Plasma AST, ALT, albumin and total protein, together with the liver homogenate content of TBARs and LDH, were assessed as measures of antioxidant drug action for all animal groups. The results showed a significant superiority of Silymarin-loaded MSNs over the free drug in almost all parameters. Meanwhile, prolonged administration of blank MSNs showed no evident toxicity in rats. Keywords: mesoporous silica nanoparticles, safety, solubility enhancement, silymarin
Procedia PDF Downloads 335
1856 Analysis of Stress and Strain in Head Based Control of Cooperative Robots through Tetraplegics
Authors: Jochen Nelles, Susanne Kohns, Julia Spies, Friederike Schmitz-Buhl, Roland Thietje, Christopher Brandl, Alexander Mertens, Christopher M. Schlick
Abstract:
Industrial robots used in highly automated manufacturing have recently been developed into cooperative (lightweight) robots. This offers the opportunity of using them as assistance robots and of improving the participation in professional life of disabled or handicapped people such as tetraplegics. The robots under development are located within a cooperation area together with the working person at the same workplace. This cooperation area is an area where the robot and the working person can perform tasks at the same time; thus, working people and robots operate in immediate proximity. Considering the physical restrictions and limited mobility of tetraplegics, hands-free robot control could be an appropriate approach for a cooperative assistance robot. To meet these requirements, the research project MeRoSy (human-robot synergy) develops methods for cooperative assistance robots based on measuring the head movements of the working person. One research objective is to improve the participation in professional life of people with disabilities, in particular mobility-impaired persons (e.g. wheelchair users or tetraplegics), who are denied participation in a self-determined working life. This raises the research question of how a human-robot cooperation workplace can be designed for hands-free robot control; the example of a library scenario is demonstrated here. In this paper, an empirical study that focuses on the impact of head-movement-related stress is presented. 12 test subjects with tetraplegia participated in the study. Tetraplegia, also known as quadriplegia, is the most severe type of spinal cord injury. In the experiment, three basic head movements were examined. Data on the head posture were collected by a motion capture system; muscle activity was measured via surface electromyography, and the subjective mental stress was assessed via a mental effort questionnaire.
The muscle activity was measured for the sternocleidomastoid (SCM), the upper trapezius (UT) or trapezius pars descendens, and the splenius capitis (SPL) muscles. For this purpose, six non-invasive surface electromyography sensors were mounted on the head and neck area. An analysis of variance shows differentiated muscular strains depending on the type of head movement. Systematically investigating the influence of different basic head movements on the resulting strain is an important issue for relating the research results to other scenarios. At the end of this paper, a conclusion is drawn and an outlook on future work is presented. Keywords: assistance robot, human-robot interaction, motion capture, stress-strain-concept, surface electromyography, tetraplegia
Procedia PDF Downloads 316
1855 Double Encrypted Data Communication Using Cryptography and Steganography
Authors: Adine Barett, Jermel Watson, Anteneh Girma, Kacem Thabet
Abstract:
In information security, secure communication of data across networks has always been a problem at the forefront. Transfer of information across networks is susceptible to exploitation by attackers engaging in malicious activity. In this paper, we leverage steganography and cryptography to create a layered security solution that protects the information being transmitted. The first layer of security uses cryptographic techniques to scramble the information so that it cannot be deciphered even if the steganography-based layer is compromised. The second layer of security relies on steganography to disguise the encrypted information so that it cannot be seen. We consider three cipher methods in the cryptography layer, namely the Playfair cipher, the Blowfish cipher, and the Hill cipher. Then, the encrypted message is passed to the least significant bit (LSB) steganography algorithm for embedding. Both approaches are combined efficiently to help secure information in transit over a network. This multi-layered encryption is a solution that will benefit cloud platforms, social media platforms and networks that regularly transfer private information, such as banks and insurance companies. Keywords: cryptography, steganography, layered security, cipher, encryption
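The steganographic layer can be sketched with a minimal LSB embed/extract pair. This illustrates only the LSB step; the cryptographic layer (Playfair, Blowfish, or Hill) is assumed to have already produced the `payload` bytes, and the flat pixel list is a simplification of real image data:

```python
def embed_lsb(pixels, payload):
    """Hide payload bytes in the least significant bits of pixel values.
    `pixels` is a flat list of 0-255 intensities; one bit per pixel, MSB first."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # clear the LSB, then set the message bit
    return stego

def extract_lsb(pixels, n_bytes):
    """Recover n_bytes of payload from the pixel LSBs."""
    bits = [p & 1 for p in pixels[:8 * n_bytes]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )
```

Each pixel changes by at most 1 intensity level, which is why LSB embedding is visually imperceptible.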
Procedia PDF Downloads 87
1854 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction
Authors: Sajid Abas, Jon Pyo Hong, Jung-Ryun Le, Seungryong Cho
Abstract:
Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Nowadays, researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better-quality images. However, the effects of the sampled data’s properties on the reconstructed image’s quality, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigated the effects of two data properties, i.e., sampling density and data incoherence, on the image reconstructed by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We have found that, in a compressive sensing based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data are uniformly sampled. Keywords: computed tomography, computed laminography, compressive sensing, low-dose
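Data incoherence is often quantified through the mutual coherence of the sampling operator. As a hedged illustration (the paper's exact incoherence measure is not specified here), the standard definition for a small matrix is:

```python
from math import sqrt

def mutual_coherence(matrix):
    """Largest normalized inner product between distinct columns of a
    sampling matrix; lower coherence generally favours compressive-sensing
    reconstruction. `matrix` is a list of rows."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    cols = [[matrix[r][c] for r in range(n_rows)] for c in range(n_cols)]
    norms = [sqrt(sum(x * x for x in col)) for col in cols]
    best = 0.0
    for i in range(n_cols):
        for j in range(i + 1, n_cols):
            dot = sum(a * b for a, b in zip(cols[i], cols[j]))
            best = max(best, abs(dot) / (norms[i] * norms[j]))
    return best
```

An orthogonal sampling matrix has coherence 0; highly correlated view angles push the coherence toward 1, degrading sparse-view reconstruction.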
Procedia PDF Downloads 466
1853 Component Based Testing Using Clustering and Support Vector Machine
Authors: Iqbaldeep Kaur, Amarjeet Kaur
Abstract:
Software reusability is an important part of software development, so component-based software development, in the case of software testing, has gained a lot of practical importance in software engineering, both from academic researchers and from a software industry perspective. Finding test cases for efficient reuse is one of the important problems addressed by researchers. Clustering reduces the search space and enables the reuse of test cases by grouping similar entities according to requirements, ensuring reduced time complexity, as it shortens the search time for retrieving test cases. In this research paper, we propose an approach for the reusability of test cases using an unsupervised method, k-means clustering, together with the Support Vector Machine. We have designed an algorithm for clustering requirement and test case documents according to their tf-idf vector space, and the output is a set of highly cohesive pattern groups. Keywords: software testing, reusability, clustering, k-means, SVM
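The tf-idf vector space used for clustering can be sketched as follows. This is the generic construction, with cosine similarity shown as the usual grouping criterion; the paper's k-means and SVM steps (not reproduced here) would operate on these vectors:

```python
from math import log, sqrt

def tf_idf_vectors(docs):
    """Turn tokenized documents into tf-idf vectors (dicts keyed by term)."""
    n = len(docs)
    df = {}  # document frequency of each term
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    vectors = []
    for doc in docs:
        vec = {}
        for term in doc:  # raw term frequency
            vec[term] = vec.get(term, 0) + 1
        for term in vec:  # weight by inverse document frequency
            vec[term] *= log(n / df[term])
        vectors.append(vec)
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse tf-idf vectors."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Requirement documents describing the same feature share weighted terms and score high similarity, so k-means groups them into the same cluster for test-case retrieval.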
Procedia PDF Downloads 432
1852 Constructing White-Box Implementations Based on Threshold Shares and Composite Fields
Authors: Tingting Lin, Manfred von Willich, Dafu Lou, Phil Eisen
Abstract:
A white-box implementation of a cryptographic algorithm is a software implementation intended to resist extraction of the secret key by an adversary. To date, most white-box techniques are used to protect block cipher implementations. However, a large proportion of white-box implementations have been proven vulnerable to affine equivalence attacks and other algebraic attacks, as well as to differential computation analysis (DCA). In this paper, we identify a class of block ciphers for which we propose a method of constructing white-box implementations. Our method is based on threshold implementations and operations in composite fields. The resulting implementations consist of lookup tables and a few exclusive-OR operations. All intermediate values (inputs and outputs of the lookup tables) are masked. The threshold implementation makes the distribution of the masked values uniform and independent of the original inputs, and the operations in composite fields reduce the size of the lookup tables. The white-box implementations can provide resistance against algebraic attacks and DCA-like attacks. Keywords: white-box, block cipher, composite field, threshold implementation
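The table-masking idea can be illustrated with a toy S-box re-encoded so that only XOR-masked values ever appear during a lookup. The S-box below is a stand-in permutation, not a real cipher component, and this sketch omits the threshold-sharing and composite-field aspects of the paper's construction:

```python
# Toy bijective substitution over 8-bit values (7 is coprime to 256),
# standing in for a real cipher S-box.
SBOX = [(7 * x + 3) % 256 for x in range(256)]

def build_masked_table(sbox, mask_in, mask_out):
    """Re-encode the S-box so inputs and outputs are XOR-masked: the table
    maps x ^ mask_in to sbox[x] ^ mask_out, so the unmasked value of x (or
    of sbox[x]) never appears during a lookup."""
    table = [0] * 256
    for x in range(256):
        table[x ^ mask_in] = sbox[x] ^ mask_out
    return table
```

A chain of such tables, with the output mask of one matching the input mask of the next, evaluates the cipher round without ever exposing an unmasked intermediate value.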
Procedia PDF Downloads 172
1851 Challenges and Opportunities for Implementing Integrated Project Delivery Method in Public Sector Construction
Authors: Ahsan Ahmed, Ming Lu, Syed Zaidi, Farhan Khan
Abstract:
The Integrated Project Delivery (IPD) method has been proposed as the solution for tackling complexity and fragmentation in the real world while addressing the construction industry’s growing needs for productivity and sustainability. Although the private sector has taken the initiative in implementing IPD and has taken advantage of new technology such as building information modeling (BIM) in delivering projects, IPD remains less known and rarely used in public sector construction. The focus of this paper is the use of IPD in public sector projects, potentially complemented by analytical functionalities for workface planning and construction-oriented design enabled by recent research advances in BIM. Experience and lessons learned from implementing IPD in the private sector and in BIM-based construction automation research would play a vital role in reducing barriers and eliminating issues in connection with project delivery in the public sector. The paper elaborates on the issues, challenges, contractual relationships and interactions throughout the planning, design and construction phases in the context of implementing IPD on construction projects in the public sector. A slab construction case is used as a ‘sandbox’ model to elaborate (1) the ideal way of communication, integration, and collaboration among all the parties involved in project delivery in planning and (2) the execution of projects using IPD principles together with optimization and simulation analyses. Keywords: integrated project delivery, IPD, building information modeling, BIM
Procedia PDF Downloads 204
1850 Model of Transhipment and Routing Applied to the Cargo Sector in Small and Medium Enterprises of Bogotá, Colombia
Authors: Oscar Javier Herrera Ochoa, Ivan Dario Romero Fonseca
Abstract:
This paper presents the design of a model for planning the distribution logistics operation. The significance of this work lies in its applicability to the analysis of small and medium enterprises (SMEs) transporting dry freight in Bogotá. The implementation consists of two stages: in the first, optimal planning is achieved through a hybrid model developed with mixed integer programming, which treats the transhipment operation as a combined load allocation model built on the classic transshipment model; in the second, the specific routing of that operation is obtained through the Clarke and Wright savings heuristic. As a result, an integral model is obtained to carry out the step-by-step planning of the distribution of dry freight for SMEs in Bogotá. In this manner, optimum assignments are established by utilizing transshipment centers, with the purpose of determining the specific routing based on the shortest distance traveled. Keywords: transshipment model, mixed integer programming, savings algorithm, dry freight transportation
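The routing stage relies on the Clarke and Wright savings heuristic. Its core quantity, the savings list, can be sketched as follows (the route-merging loop that consumes this list is omitted):

```python
def savings_list(dist, depot=0):
    """Clarke-Wright savings s(i, j) = d(0, i) + d(0, j) - d(i, j) for every
    customer pair, sorted in descending order. `dist` is a symmetric distance
    matrix with the depot at index `depot`. Merging routes in this order is
    the core of the savings heuristic."""
    n = len(dist)
    pairs = []
    for i in range(1, n):
        for j in range(i + 1, n):
            s = dist[depot][i] + dist[depot][j] - dist[i][j]
            pairs.append((s, i, j))
    pairs.sort(reverse=True)
    return pairs
```

Each saving measures the distance avoided by serving customers i and j on one route instead of two separate out-and-back trips from the depot.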
Procedia PDF Downloads 235
1849 Degraded Document Analysis and Extraction of Original Text Document: An Approach without Optical Character Recognition
Authors: L. Hamsaveni, Navya Prakash, Suresha
Abstract:
Document Image Analysis recognizes text and graphics in documents acquired as images. An approach without Optical Character Recognition (OCR) for degraded document image analysis has been adopted in this paper. The technique involves document imaging methods such as image fusing and Speeded Up Robust Features (SURF) detection to identify and extract the degraded regions from a set of document images and obtain an original document with complete information. In case the captured degraded document image is skewed, it has to be straightened (deskewed) before further processing. A special image storage format known as YCbCr is used as a tool to convert the grayscale image to the RGB image format. The presented algorithm is tested on various types of degraded documents, such as printed documents, handwritten documents, old script documents and handwritten image sketches in documents. The purpose of this research is to obtain an original document for a given set of degraded documents from the same source. Keywords: grayscale image format, image fusing, RGB image format, SURF detection, YCbCr image format
Procedia PDF Downloads 377
1848 Internet of Things Edge Device Power Modelling and Optimization Simulator
Authors: Cian O'Shea, Ross O'Halloran, Peter Haigh
Abstract:
Wireless Sensor Networks (WSN) are Internet of Things (IoT) edge devices. They are becoming widely adopted in many industries, including health care, building energy management, and condition monitoring. As the scale of WSN deployments increases, the cost and complexity of battery replacement and disposal become more significant and in time may become a barrier to adoption. Harvesting ambient energy provides a pathway to reducing dependence on batteries and may in the future lead to autonomously powered sensors. This work describes a simulation tool that enables the user to predict the battery life of a wireless sensor that utilizes energy harvesting to supplement the battery power. To create this simulator, all aspects of a typical WSN edge device were modelled, including the sensors, transceiver, and microcontroller, as well as the energy source components (batteries, solar cells, thermoelectric generators (TEG), supercapacitors and DC/DC converters). The tool allows the user to plug and play different pre-characterized devices as well as add user-defined devices. The goal of this simulation tool is to predict the lifetime of a device and the scope for extension using ambient energy sources. Keywords: wireless sensor network, IoT, edge device, simulation, solar cells, TEG, supercapacitor, energy harvesting
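A lifetime prediction of the kind the simulator produces can be approximated with a simple average-power budget. This is a back-of-the-envelope sketch, not the tool's component-level model; the duty-cycle abstraction and parameter names are assumptions:

```python
def battery_life_days(battery_mwh, sleep_mw, active_mw, duty_cycle, harvest_mw=0.0):
    """Estimated node lifetime from an average-power budget: the net draw is
    the duty-cycle-weighted consumption minus the average harvested power."""
    avg_draw_mw = active_mw * duty_cycle + sleep_mw * (1 - duty_cycle) - harvest_mw
    if avg_draw_mw <= 0:
        return float("inf")  # energy-autonomous: harvesting covers the load
    return battery_mwh / avg_draw_mw / 24.0
```

Raising `harvest_mw` until the net draw reaches zero shows the harvesting level at which the node becomes autonomously powered, the limit case the abstract mentions.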
Procedia PDF Downloads 133
1847 Reduction in Hot Metal Silicon through Statistical Analysis at G-Blast Furnace, Tata Steel Jamshedpur
Authors: Shoumodip Roy, Ankit Singhania, Santanu Mallick, Abhiram Jha, M. K. Agarwal, R. V. Ramna, Uttam Singh
Abstract:
The quality of hot metal at any blast furnace is judged by its silicon content. Lower hot metal silicon not only enhances process efficiency at steel melting shops but also reduces hot metal costs. The hot metal produced at G Blast Furnace, Tata Steel Jamshedpur, has a significantly higher Si content than benchmark blast furnaces, mainly due to raw material quality inferior to that used in the benchmark furnaces. With minimal control over raw material quality, the only option left for controlling hot metal Si is optimizing the furnace parameters. Therefore, in order to identify the levers for reducing hot metal Si, data mining was carried out and multiple regression models were developed. The statistical analysis revealed that slag B3 {(CaO+MgO)/SiO2}, slag alumina and hot metal temperature are the key controllable parameters affecting hot metal silicon. Contour plots were used to determine the optimum ranges of the levels identified through the statistical analysis. A trial plan was formulated to operate the relevant parameters at G Blast Furnace within the identified ranges to reduce hot metal silicon. This paper details the process followed and the subsequent 15% reduction in hot metal silicon at G Blast Furnace. Keywords: blast furnace, optimization, silicon, statistical tools
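The slag basicity term identified by the analysis is a simple ratio, and the multiple-regression form can be sketched generically. Note that the default coefficients below are hypothetical placeholders for illustration, not the fitted values from the study:

```python
def slag_b3(cao_pct, mgo_pct, sio2_pct):
    """Ternary slag basicity B3 = (CaO + MgO) / SiO2, from mass percentages."""
    return (cao_pct + mgo_pct) / sio2_pct

def predict_hot_metal_si(b3, slag_al2o3_pct, hm_temp_c, coef=None):
    """Illustrative linear-regression form for hot metal Si; the default
    coefficients are hypothetical placeholders, not the study's fit."""
    c = coef or {"intercept": 2.0, "b3": -0.5, "al2o3": 0.02, "temp": 0.0005}
    return (c["intercept"] + c["b3"] * b3
            + c["al2o3"] * slag_al2o3_pct + c["temp"] * hm_temp_c)
```

A negative B3 coefficient reflects the usual direction of the effect (higher basicity suppressing Si pickup), but the magnitudes here carry no metallurgical authority.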
Procedia PDF Downloads 225
1846 High Secure Data Hiding Using Cropping Image and Least Significant Bit Steganography
Authors: Khalid A. Al-Afandy, El-Sayyed El-Rabaie, Osama Salah, Ahmed El-Mhalaway
Abstract:
This paper presents a highly secure data hiding technique using image cropping and Least Significant Bit (LSB) steganography. Predefined secret coordinate crops are extracted from the cover image. The secret text message is divided into sections whose number equals the number of image crops. Each section of the secret text message is embedded into an image crop in a secret sequence using the LSB technique, with the embedding done in the cover image color channels. The stego image is obtained by reassembling the image and the stego crops. The results of the technique are compared with other state-of-the-art techniques. Evaluation is based on visual inspection to detect any degradation of the stego image, the difficulty of extracting the embedded data by an unauthorized viewer, the Peak Signal-to-Noise Ratio (PSNR) of the stego image, and the CPU time of the embedding algorithm. Experimental results show that the proposed technique is more secure than the other traditional techniques. Keywords: steganography, stego, LSB, crop
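The PSNR figure of merit used in the evaluation follows the standard definition, shown here for flat pixel lists (this is the generic formula, not code from the paper):

```python
from math import log10

def psnr(cover, stego, max_val=255):
    """Peak signal-to-noise ratio between a cover and a stego image, given
    as flat lists of pixel intensities; higher PSNR means less visible
    degradation from embedding."""
    mse = sum((a - b) ** 2 for a, b in zip(cover, stego)) / len(cover)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * log10(max_val ** 2 / mse)
```

LSB embedding changes each pixel by at most 1, so even the worst case (every LSB flipped, MSE = 1) still yields about 48 dB, well above common visibility thresholds.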
Procedia PDF Downloads 271
1845 Towards Computational Fluid Dynamics Based Methodology to Accelerate Bioprocess Scale Up and Scale Down
Authors: Vishal Kumar Singh
Abstract:
Bioprocess development is a time-constrained activity aimed at harnessing the full potential of culture performance in an ambience that is not natural to cells. Even with the use of chemically defined media and feeds, a significant amount of time is devoted to identifying apt operating parameters. In addition, the scale-up of these processes is often accompanied by a loss of antibody titer and product quality, which further delays the commercialization of the drug product. In such a scenario, the disparity in culture performance is investigated by further experimentation at a smaller scale that is representative of at-scale production bioreactors. These scale-down model developments are also time-intensive. In this study, a computational fluid dynamics-based multi-objective scaling approach is illustrated to speed up the process transfer. To implement this approach, a transient multiphase water-air system was studied in Ansys CFX to visualize the air bubble distribution and volumetric mass transfer coefficient (kLa) profiles, followed by a design-of-experiments-based parametric optimization approach to define the operational space. The proposed approach is entirely in silico and requires minimal experimentation, thereby rendering high throughput to the overall process development. Keywords: bioprocess development, scale up, scale down, computational fluid dynamics, multi-objective, Ansys CFX, design of experiment
Procedia PDF Downloads 84
1844 2D Numerical Modeling for Induced Current Distribution in Soil under Lightning Impulse Discharge
Authors: Fawwaz Eniola Fajingbesi, Nur Shahida Midia, Elsheikh M. A. Elsheikh, Siti Hajar Yusoff
Abstract:
Empirical analysis of lightning-related phenomena in real time is extremely dangerous due to the relatively high electric discharge involved. Hence, the design and optimization of efficient grounding systems based on real-time empirical methods are impeded. Using numerical methods, the dynamics of complex systems can be modeled and solved as sets of linear and non-linear equations. In this work, the induced current distribution as a lightning strike traverses the soil has been numerically modeled in 2D axial symmetry and solved using the finite element method (FEM) in the COMSOL Multiphysics 5.2 AC/DC module. Stratified and non-stratified electrode systems were considered in the solved model, and the soil conductivity (σ) was varied between 10 and 58 mS/m. The results discussed therein are the electric field distribution, the current distribution and soil ionization phenomena. It can be concluded that the electric field and current distributions are influenced by the injected electric potential and the non-linearity of the soil conductivity. The results from the numerical calculation also agree with previous laboratory-scale empirical results. Keywords: current distribution, grounding systems, lightning discharge, numerical model, soil conductivity, soil ionization
Procedia PDF Downloads 314
1843 DFIG-Based Wind Turbine with Shunt Active Power Filter Controlled by Double Nonlinear Predictive Controller
Authors: Abderrahmane El Kachani, El Mahjoub Chakir, Anass Ait Laachir, Abdelhamid Niaaniaa, Jamal Zerouaoui, Tarik Jarou
Abstract:
This paper presents a wind turbine based on the doubly fed induction generator (DFIG) connected to the utility grid through a shunt active power filter (SAPF). The whole system is controlled by a double nonlinear predictive controller (DNPC). A Taylor series expansion is used to predict the outputs of the system, and the control law is calculated by optimization of the cost function. The first nonlinear predictive controller (NPC) is designed to ensure high-performance tracking of the rotor speed and to regulate the rotor current of the DFIG, while the second one is designed to control the SAPF in order to compensate for the harmonics produced by the three-phase diode bridge supplied by a passive circuit (rd, Ld). As a result, we obtain sinusoidal waveforms of the stator voltage and stator current. The proposed nonlinear predictive controllers (NPCs) are validated via simulation on a 1.5 MW DFIG-based wind turbine connected to an SAPF. The results obtained appear to be satisfactory and promising.
Keywords: wind power, doubly fed induction generator, shunt active power filter, double nonlinear predictive controller
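The core idea of predictive control, predicting the output with a model and minimizing a cost over the control input, can be sketched on a toy scalar system. This is not the paper's DNPC (no Taylor expansion of a nonlinear plant here); the model, parameters, and reference are illustrative assumptions:

```python
# One-step predictive control sketch for the scalar discrete model
# x[k+1] = a*x[k] + b*u[k].  The one-step quadratic cost
# J(u) = (x_next - ref)^2 + lam*u^2 has the closed-form minimizer below.
a, b, lam = 0.95, 0.5, 0.01          # hypothetical plant and weighting

def npc_control(x, ref):
    """Minimize J(u) = (a*x + b*u - ref)^2 + lam*u^2 over u."""
    # dJ/du = 2*b*(a*x + b*u - ref) + 2*lam*u = 0  =>
    return b * (ref - a * x) / (b * b + lam)

x, ref = 0.0, 1.0
for _ in range(50):                  # closed-loop simulation
    u = npc_control(x, ref)
    x = a * x + b * u
print(round(x, 3))
```

With a nonzero input penalty `lam`, the closed loop settles slightly below the reference (here near 0.998), which is the usual steady-state trade-off of this cost.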
Procedia PDF Downloads 419
1842 Spin Rate Decaying Law of Projectile with Hemispherical Head in Exterior Trajectory
Authors: Quan Wen, Tianxiao Chang, Shaolu Shi, Yushi Wang, Guangyu Wang
Abstract:
As a key element of the fuze's working environment, the spin rate decaying law of a projectile in exterior trajectory is of great value in the design of rotation-count fixed-distance fuzes. It is also significant for devices that simulate the fuze exterior ballistic environment, for the flight stability and dispersion accuracy of gun projectiles, and for the opening and scattering design of submunitions and illuminating cartridges. Besides, the self-destroying mechanism of the fuze in small-caliber projectiles often works by utilizing the attenuation of centrifugal force. In the theory of projectile aerodynamics and fuze design, there are many formulas describing the change law of projectile angular velocity in exterior ballistics, such as the Roggla formula, the exponential-function formula, and the power-function formula. However, these formulas are mostly semi-empirical due to the poor test conditions and insufficient test data at the time they were developed. They are difficult to apply to the design of modern fuzes because they are not accurate enough and have narrow ranges of application. In order to provide more accurate ballistic environment parameters for the design of a hemispherical-head projectile fuze, the projectile's spin rate decaying law in exterior trajectory under the effect of air resistance was studied. In the analysis, the projectile shape was simplified as a hemispherical head, a cylindrical part, a rotating band part, and an anti-truncated conical tail.
The main assumptions are as follows: a) The shape and mass are symmetrical about the longitudinal axis; b) There is a smooth transition between the ball head and the cylindrical part; c) The air flow on the outer surface is treated as a flat-plate flow with the same area as the expanded outer surface of the projectile, and the boundary layer is turbulent; d) The polar damping moment attributed to the wrench hole and rifling marks on the projectile is not considered; e) The groove of the rifling on the rotating band is uniform, smooth, and regular. The impacts of the four parts on the aerodynamic moment of the projectile's rotation were obtained from aerodynamic theory. The surface friction stress of the projectile, the polar damping moment formed by the head of the projectile, and the surface friction moments formed by the cylindrical part, the rotating band, and the anti-truncated conical tail were obtained by mathematical derivation. After that, the mathematical model of spin rate attenuation was established. In the whole trajectory with the maximum range angle (38°), the error between the polar damping torque coefficient obtained by simulation and the coefficient calculated by the mathematical model established in this paper is not more than 7%. Therefore, the credibility of the mathematical model was verified. The mathematical model can be described as a first-order nonlinear differential equation, which has no analytical solution. The solution can only be obtained numerically by coupling the model with the projectile mass motion equations in exterior ballistics.
Keywords: ammunition engineering, fuze technology, spin rate, numerical simulation
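A numerical solution of such a first-order nonlinear spin-decay equation can be sketched with a fourth-order Runge-Kutta integrator. The quadratic-in-ω damping moment and all constants below are illustrative assumptions, not the moment model derived in the paper:

```python
# Spin decay sketch: I * dw/dt = -M(w), with an assumed damping moment
# M(w) = c * w**2 (surface friction growing with spin rate).
# I (polar moment of inertia) and c are hypothetical values.
I, c = 2.0e-4, 1.0e-9          # kg*m^2 and N*m*s^2 (assumed)

def dwdt(w):
    return -c * w * w / I

def rk4_step(w, h):
    """One classical fourth-order Runge-Kutta step of size h."""
    k1 = dwdt(w)
    k2 = dwdt(w + 0.5 * h * k1)
    k3 = dwdt(w + 0.5 * h * k2)
    k4 = dwdt(w + h * k3)
    return w + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

w, h = 1800.0, 0.01            # initial spin rate (rad/s), time step (s)
history = [w]
for _ in range(3000):          # 30 s of flight
    w = rk4_step(w, h)
    history.append(w)
print(round(history[0]), round(history[-1]))
```

For this particular assumed model the exact solution is w(t) = w0 / (1 + (c/I) w0 t), which gives a convenient check on the integrator; in the actual problem the equation must be coupled with the mass motion equations, as noted above.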
Procedia PDF Downloads 150
1841 Syllogistic Reasoning with 108 Inference Rules While Case Quantities Change
Authors: Mikhail Zarechnev, Bora I. Kumova
Abstract:
A syllogism is a deductive inference scheme used to derive a conclusion from a set of premises. In a categorical syllogism, there are only two premises, and every premise and conclusion is given in the form of a quantified relationship between two objects. The different orderings of objects in the premises give a classification known as figures. We have shown that the ordered combinations of 3 generalized quantifiers, together with the figures, provide a total of 108 syllogistic moods, which can be considered as different inference rules. The ability of the classical syllogistic system to model human thought and reasoning with syllogistic structures has always attracted the attention of cognitive scientists. Since automated reasoning is considered part of the learning subsystem of AI agents, the syllogistic system can be applied in this approach. Another application of the syllogistic system relates to inference mechanisms in Semantic Web applications. In this paper we propose a mathematical model and algorithm for syllogistic reasoning. We also discuss a model of iterative syllogistic reasoning for continuous flows of incoming data based on case-based reasoning, as well as possible applications of the proposed system.
Keywords: categorical syllogism, case-based reasoning, cognitive architecture, inference on the semantic web, syllogistic reasoning
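The 108-mood count follows from combinatorics: 3 quantifier choices for each of the two premises and the conclusion, times the 4 classical figures. A minimal enumeration sketch (the quantifier labels are placeholders, not the paper's generalized quantifiers):

```python
import itertools

# Enumerate categorical syllogistic moods: each mood assigns one of three
# generalized quantifiers to the two premises and the conclusion, and the
# premise structure follows one of the four classical figures.
quantifiers = ["Q1", "Q2", "Q3"]   # placeholder labels for 3 quantifiers
figures = [1, 2, 3, 4]

moods = [(q_major, q_minor, q_conclusion, fig)
         for q_major, q_minor, q_conclusion
         in itertools.product(quantifiers, repeat=3)
         for fig in figures]
print(len(moods))  # 3^3 quantifier combinations * 4 figures = 108
```

Each tuple in `moods` is one inference rule in the sense of the abstract: a quantifier pattern plus a figure.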
Procedia PDF Downloads 415
1840 Performance Evaluation of Distributed Deep Learning Frameworks in Cloud Environment
Authors: Shuen-Tai Wang, Fang-An Kuo, Chau-Yi Chou, Yu-Bin Fang
Abstract:
2016 has become the year of the Artificial Intelligence explosion. AI technologies have matured to the point that most well-known tech giants are making large investments to increase their capabilities in AI. Machine learning is the science of getting computers to act without being explicitly programmed, and deep learning is a subset of machine learning that uses deep neural networks to train a machine to learn features directly from data. Deep learning realizes many machine learning applications which expand the field of AI. At the present time, deep learning frameworks have been widely deployed on servers for deep learning applications in both academia and industry. In training deep neural networks, there are many standard processes or algorithms, but the performance of different frameworks might differ. In this paper we evaluate the running performance of two state-of-the-art distributed deep learning frameworks that run training calculations in parallel over multiple GPUs and multiple nodes in our cloud environment. We evaluate the training performance of the frameworks with the ResNet-50 convolutional neural network, and we analyze which factors account for the performance differences between the two distributed frameworks. Through the experimental analysis, we identify the overheads which could be further optimized. The main contribution is that the evaluation results provide further optimization directions in both performance tuning and algorithmic design.
Keywords: artificial intelligence, machine learning, deep learning, convolutional neural networks
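A standard way to compare distributed training runs like these is to reduce measured throughput to speedup and parallel efficiency relative to the single-GPU baseline. The throughput figures below are hypothetical, not measurements from the study:

```python
# Scaling-efficiency sketch for distributed training benchmarks.
def scaling_efficiency(throughputs_by_gpus):
    """Speedup and parallel efficiency relative to the 1-GPU baseline.

    throughputs_by_gpus maps GPU count -> training throughput (images/sec).
    """
    base = throughputs_by_gpus[1]
    return {n: (t / base, t / base / n)
            for n, t in throughputs_by_gpus.items()}

# Hypothetical ResNet-50 throughputs (images/sec) at each GPU count.
measured = {1: 210.0, 2: 400.0, 4: 760.0, 8: 1400.0}
stats = scaling_efficiency(measured)
for n, (speedup, eff) in sorted(stats.items()):
    print(f"{n} GPUs: speedup {speedup:.2f}, efficiency {eff:.2%}")
```

Efficiency below 100% at higher GPU counts is exactly the communication and synchronization overhead the abstract sets out to identify.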
Procedia PDF Downloads 213
1839 Oxidative Stress Markers in Sports Related to Training
Authors: V. Antevska, B. Dejanova, L. Todorovska, J. Pluncevic, E. Sivevska, S. Petrovska, S. Mancevska, I. Karagjozova
Abstract:
Introduction: The aim of this study was to optimize the laboratory oxidative stress (OS) markers in soccer players. Material and methods: Plasma samples were taken from 37 soccer players (21±3 years old) and 25 control subjects (sedentary) for d-ROMs (reactive oxygen metabolites) and NO (nitric oxide) determination. The d-ROMs test was performed by measurement of hydroperoxide levels (Diacron, Italy). For NO determination, the method of enzymatic nitrate reduction with the Griess reagent was used (OXIS, USA). The parameters were taken after the training of the soccer players and were compared with the control group. Training was considered as a maximal exercise treadmill test. The criterion of maximum loading for each subject was established as >95% of maximal heart rate. Results: The level of d-ROMs was found to be increased in the soccer players vs. the control group, but no significant difference was noticed. After the training, d-ROMs in soccer players showed an increased value of 299±44 UCarr (p<0.05). NO showed an increased level in all soccer players vs. controls, but a significant difference was found after the training: 102±29 μmol (p<0.05). Conclusion: Based on these results, we suggest that measuring these OS markers in sports medicine may be useful for better estimation and evaluation of the training program. Further oxidative stress studies are needed to clarify optimization of the training intensity program.
Keywords: oxidative stress markers, soccer players, training, sport
Procedia PDF Downloads 447
1838 Low Density Parity Check Codes
Authors: Kassoul Ilyes
Abstract:
The field of error-correcting codes has been revolutionized by the introduction of iteratively decoded codes. Among these, LDPC codes are now a preferred solution thanks to their remarkable performance and low complexity. The binary version of LDPC codes showed even better performance, although its decoding introduced greater complexity. This thesis studies the performance of binary LDPC codes using simplified weighted decisions. Information is transported between a transmitter and a receiver by digital transmission systems, either by propagating over a radio channel or by using a transmission medium such as a transmission line. The purpose of the transmission system is then to carry the information from the transmitter to the receiver as reliably as possible. These codes did not initially generate enough interest within the coding-theory community. This neglect lasted until the introduction of Turbo codes and the iterative decoding principle. It was then proposed to adopt Pearl's Belief Propagation (BP) algorithm for decoding these codes. Subsequently, Luby introduced irregular LDPC codes characterized by an irregular parity-check matrix. Finally, we study simplifications of binary LDPC codes. Thus, we propose a method that makes the exact calculation of the a posteriori probability (APP) simpler. This method leads to a simpler implementation of the system.
Keywords: LDPC, parity check matrix, 5G, BER, SNR
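The simplest relative of the iterative BP decoding mentioned above is hard-decision bit-flipping on a parity-check matrix. The sketch below uses a small (7,4) Hamming parity-check matrix as a toy stand-in for a sparse LDPC matrix; it is an illustration of the iterative principle, not the thesis's simplified weighted-decision decoder:

```python
# Hard-decision bit-flipping decoding sketch.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def bit_flip_decode(received, max_iters=20):
    word = list(received)
    for _ in range(max_iters):
        syndrome = [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
        if not any(syndrome):
            return word                      # all parity checks satisfied
        # Flip the bit involved in the most unsatisfied checks
        # (ties broken by lowest index in this simple variant).
        counts = [sum(row[j] for row, s in zip(H, syndrome) if s)
                  for j in range(len(word))]
        word[counts.index(max(counts))] ^= 1
    return word

codeword = [1, 1, 1, 0, 0, 0, 0]             # satisfies H * c^T = 0 (mod 2)
received = [0, 1, 1, 0, 0, 0, 0]             # single bit error in position 0
print(bit_flip_decode(received))
```

BP replaces these hard syndrome votes with soft probability messages exchanged between variable and check nodes, which is where both the performance gain and the extra complexity come from.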
Procedia PDF Downloads 156
1837 Nursing Experience in the Intensive Care of a Lung Cancer Patient with Pulmonary Embolism on Extracorporeal Membrane Oxygenation
Authors: Huang Wei-Yi
Abstract:
Objective: This article explores the intensive care nursing experience of a lung cancer patient with pulmonary embolism who was placed on ECMO. Following a sudden change in the patient’s condition and a consensus reached during a family meeting, the decision was made to withdraw life-sustaining equipment and collaborate with the palliative care team. Methods: The nursing period was from October 20 to October 27, 2023. The author monitored physiological data, observed, provided direct care, conducted interviews, performed physical assessments, and reviewed medical records. Together with the critical care team and bypass personnel, a comprehensive assessment was conducted using Gordon’s Eleven Functional Health Patterns to identify the patient’s health issues, which included pain related to lung cancer and invasive devices, fear of death due to sudden deterioration, and altered tissue perfusion related to hemodynamic instability. Results: The patient was admitted with fever, back pain, and painful urination. During hospitalization, the patient experienced sudden discomfort followed by cardiac arrest, requiring multiple CPR attempts and ECMO placement. A subsequent CT angiogram revealed a pulmonary embolism. The patient’s condition was further complicated by severe pain due to compression fractures, and a diagnosis of terminal lung cancer was unexpectedly confirmed, leading to emotional distress and uncertainty about future treatment. During the critical care process, ECMO was removed on October 24; the patient’s body temperature stabilized between 36.5 °C and 37 °C, and the mean arterial pressure was maintained at 60-80 mmHg. Pain management, including Morphine 8 mg in 0.9% N/S 100 ml IV drip q6h PRN and Ultracet 37.5 mg/325 mg 1# PO q6h, kept the pain level below 3. The patient was transferred to the ward on October 27 and discharged home on October 30. Conclusion: During the care period, collaboration with the medical team and palliative care professionals was crucial.
Adjustments to pain medication, symptom management, and lung cancer-targeted therapy improved the patient’s physical discomfort and pain levels. By applying the unique functions of nursing and the four principles of palliative care, positive encouragement was provided. Family members, along with social workers, clergy, psychologists, and nutritionists, participated in cross-disciplinary care, alleviating anxiety and fear. The consensus to withdraw ECMO and life-sustaining equipment enabled the patient and family to receive high-quality care and maintain autonomy in decision-making. A follow-up call on November 1 confirmed that the patient was emotionally stable, pain-free, and continuing with targeted lung cancer therapy.Keywords: intensive care, lung cancer, pulmonary embolism, ECMO
Procedia PDF Downloads 32
1836 A Scalable Model of Fair Socioeconomic Relations Based on Blockchain and Machine Learning Algorithms-1: On Hyperinteraction and Intuition
Authors: Merey M. Sarsengeldin, Alexandr S. Kolokhmatov, Galiya Seidaliyeva, Alexandr Ozerov, Sanim T. Imatayeva
Abstract:
This series of interdisciplinary studies is an attempt to investigate and develop a scalable model of fair socioeconomic relations on the basis of blockchain, using positive psychology techniques and machine learning algorithms for data analytics. In this particular study, we use a hyperinteraction approach and intuition to investigate their influence on the 'wisdom of crowds' via a mobile application created for the purpose of this research. Along with the public blockchain and a private Decentralized Autonomous Organization (DAO), both built by us on the Ethereum blockchain, a model of fair financial relations among DAO members was developed. We developed a smart contract, the so-called Fair Price Protocol, and used it to implement the model. The data obtained from the mobile application were analyzed by ML algorithms. The model was tested on football matches.
Keywords: blockchain, Naïve Bayes algorithm, hyperinteraction, intuition, wisdom of crowd, decentralized autonomous organization
Procedia PDF Downloads 172
1835 Clustering Performance Analysis using New Correlation-Based Cluster Validity Indices
Authors: Nathakhun Wiroonsri
Abstract:
There are various cluster validity measures used for evaluating clustering results. One of the main objectives of using these measures is to seek the optimal unknown number of clusters. Some measures work well for clusters with different densities, sizes, and shapes. Yet, one weakness that those validity measures share is that they sometimes provide only one clear optimal number of clusters. That number is actually unknown, and there might be more than one potential sub-optimal option that a user may wish to choose based on different applications. We develop two new cluster validity indices based on the correlation between the actual distance between a pair of data points and the distance between the centroids of the clusters in which the two points are located. Our proposed indices consistently yield several peaks at different numbers of clusters, which overcomes the weakness stated above. Furthermore, the introduced correlation can also be used for evaluating the quality of a selected clustering result. Several experiments in different scenarios, including the well-known iris data set and a real-world marketing application, have been conducted to compare the proposed validity indices with several well-known ones.
Keywords: clustering algorithm, cluster validity measure, correlation, data partitions, iris data set, marketing, pattern recognition
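The correlation underlying the proposed indices can be sketched directly: for every pair of points, compare the actual pairwise distance with the distance between the centroids of the two clusters the points fall in (zero when they share a cluster). The Euclidean metric, toy data, and Pearson correlation are illustrative assumptions about the construction, not the paper's exact indices:

```python
import math

def centroid(points):
    return tuple(sum(c) / len(points) for c in zip(*points))

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def distance_correlation_index(points, labels):
    """Correlation between pairwise point distances and the distances
    between the centroids of the clusters the two points belong to."""
    cents = {l: centroid([p for p, m in zip(points, labels) if m == l])
             for l in set(labels)}
    actual, central = [], []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            actual.append(math.dist(points[i], points[j]))
            central.append(math.dist(cents[labels[i]], cents[labels[j]]))
    return pearson(actual, central)

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
good = [0, 0, 0, 1, 1, 1]          # matches the true grouping
bad = [0, 1, 0, 1, 0, 1]           # scrambled assignment
score_good = distance_correlation_index(pts, good)
score_bad = distance_correlation_index(pts, bad)
print(score_good > score_bad)
```

A partition that respects the data's geometry makes centroid distances track the actual distances, so a higher correlation indicates a better clustering, which is how such an index can rank candidate numbers of clusters.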
Procedia PDF Downloads 104
1834 Despiking of Turbulent Flow Data in Gravel Bed Stream
Authors: Ratul Das
Abstract:
The present experimental study provides insight into the decontamination of instantaneous velocity fluctuations captured by an Acoustic Doppler Velocimeter (ADV) in gravel-bed streams to ascertain near-bed turbulence at low Reynolds numbers. The interference between incident and reflected pulses produces spikes in the ADV data, especially in the near-bed flow zone, and therefore filtering the data is essential. Nortek’s Vectrino four-receiver ADV probe was used to capture the instantaneous three-dimensional velocity fluctuations over a non-cohesive bed. A spike removal algorithm based on the acceleration threshold method was applied to assess the bed roughness and its influence on velocity fluctuations and velocity power spectra in the carrier fluid. A best combination of velocity threshold (VT) and acceleration threshold (AT) is proposed, for which the velocity power spectra of the despiked signals show a satisfactory fit with the Kolmogorov “–5/3 scaling law” in the inertial sub-range. Also, velocity distributions below the roughness crest level fairly follow a third-degree polynomial series.
Keywords: acoustic doppler velocimeter, gravel-bed, spike removal, Reynolds shear stress, near-bed turbulence, velocity power spectra
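The acceleration-threshold idea can be sketched as follows: flag a sample as a spike when the point-to-point acceleration exceeds a multiple of gravitational acceleration, then replace flagged samples by interpolation from neighboring good samples. The threshold multiplier, sampling rate, and toy series are illustrative assumptions, not the study's calibrated VT/AT combination:

```python
# Acceleration-threshold despiking sketch for an ADV velocity series.
G = 9.81          # gravitational acceleration, m/s^2
K_A = 1.5         # acceleration threshold multiplier (assumed)

def despike(u, dt):
    accel = [(u[i] - u[i - 1]) / dt for i in range(1, len(u))]
    # Flag sample i when the acceleration into it exceeds the threshold;
    # note that both edges of a sharp jump may be flagged by this rule.
    spikes = {i for i, a in enumerate(accel, start=1) if abs(a) > K_A * G}
    clean = list(u)
    for i in sorted(spikes):
        left = i - 1               # may itself be an already-repaired sample
        right = next(j for j in range(i + 1, len(u) + 1)
                     if j == len(u) or j not in spikes)
        if right == len(u):        # spike at the end: hold the last good value
            clean[i] = clean[left]
        else:                      # linear interpolation across the spike
            frac = (i - left) / (right - left)
            clean[i] = clean[left] + frac * (u[right] - clean[left])
    return clean

dt = 0.01                          # 100 Hz sampling (assumed)
series = [0.30, 0.31, 0.29, 1.50, 0.30, 0.32, 0.31]   # one injected spike
cleaned = despike(series, dt)
print(cleaned)
```

In practice this acceleration criterion is combined with a velocity threshold, as the abstract describes, and the despiked record is then used for spectra and Reynolds-stress estimates.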
Procedia PDF Downloads 302