Search results for: heuristic procedures
1568 Automatic Intelligent Analysis of Malware Behaviour
Authors: Hermann Dornhackl, Konstantin Kadletz, Robert Luh, Paul Tavolato
Abstract:
In this paper we describe the use of formal methods to model malware behaviour. The modelling of harmful behaviour rests upon syntactic structures that represent malicious procedures inside malware. The malicious activities are modelled by a formal grammar, where API call components are the terminals and sets of API calls used in combination to achieve a goal are designated as non-terminals. The combination of different non-terminals in various ways and tiers makes up the attack vectors used by harmful software. Based on these syntactic structures, a parser can be generated which takes execution traces as input for pattern recognition.
Keywords: malware behaviour, modelling, parsing, search, pattern matching
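As an illustration of the grammar idea, here is a minimal sketch in which terminals are API call names, non-terminals are goal-oriented call combinations, and an execution trace is scanned for the pattern a non-terminal generates. The API names and productions are hypothetical, not the authors' grammar:

```python
GRAMMAR = {
    # non-terminal -> sequence of terminals and/or non-terminals
    "SelfReplication": ["FindFirstFile", "CreateFile", "WriteFile"],
    "Persistence":     ["RegOpenKey", "RegSetValue"],
    "AttackVector":    ["SelfReplication", "Persistence"],  # tiered combination
}

def expand(symbol):
    """Expand a symbol into its terminal (API call) sequence."""
    if symbol not in GRAMMAR:          # terminal: a plain API call name
        return [symbol]
    calls = []
    for s in GRAMMAR[symbol]:
        calls.extend(expand(s))
    return calls

def recognises(trace, symbol):
    """True if the trace contains the expanded pattern as an ordered subsequence."""
    pattern = expand(symbol)
    i = 0
    for call in trace:
        if call == pattern[i]:
            i += 1
            if i == len(pattern):
                return True
    return False

trace = ["NtQuerySystemInformation", "FindFirstFile", "CreateFile",
         "WriteFile", "RegOpenKey", "RegSetValue", "ExitProcess"]
print(recognises(trace, "AttackVector"))  # True
```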
Procedia PDF Downloads 332
1567 Non-Convex Multi Objective Economic Dispatch Using Ramp Rate Biogeography Based Optimization
Authors: Susanta Kumar Gachhayat, S. K. Dash
Abstract:
Multi-objective non-convex economic dispatch problems of a thermal power plant are of grave concern for determining the cost of generation and for reducing emission levels that contribute to global warming and the greenhouse effect. This paper deals with ramp rate constraints for achieving better inequality constraints, so as to incorporate valve point loading in the cost of generation of a thermal power plant, through ramp rate biogeography based optimization involving mutation and migration. In 50 out of 100 trials, the cost function and emission objective function outperformed classical methods such as the lambda iteration method and quadratic programming, and many heuristic methods such as particle swarm optimization, weight improved particle swarm optimization, constriction factor based particle swarm optimization, moderate random particle swarm optimization, etc. Ramp rate biogeography based optimization proves quite advantageous in solving non-convex multi-objective economic dispatch problems subjected to nonlinear loads that pollute the source, giving rise to third harmonic distortions and other such disturbances.
Keywords: economic load dispatch, ELD, biogeography-based optimization, BBO, ramp rate biogeography-based optimization, RRBBO, valve-point loading, VPL
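For context, the valve-point loading effect is commonly modelled by adding a rectified sinusoid to the quadratic fuel cost of each generating unit; a sketch of the standard form (not necessarily the exact formulation used by the authors):

```latex
F_i(P_i) = a_i + b_i P_i + c_i P_i^2
         + \left| e_i \sin\!\big( f_i \, (P_i^{\min} - P_i) \big) \right|
```

The absolute-value sine term makes the cost non-convex and non-smooth, which is why gradient-free metaheuristics such as BBO are applied.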
Procedia PDF Downloads 379
1566 Comparison of Instantaneous Short Circuit versus Step DC Voltage to Determine PMG Inductances
Authors: Walter Evaldo Kuchenbecker, Julio Carlos Teixeira
Abstract:
As efficiency has become a challenge in reducing the energy consumption of all electrical machine applications, the permanent magnet machine arises as a better option because of its performance, robustness and simple control. Even though electrical machines were developed through analyses of magnetic effects, permanent magnet machines are still not well mastered. As permanent magnet machines become popular in most applications, the pressure to standardize this type of electrical machine increases. However, due to this limited mastery, there is still no standard for their manufacture, testing and application. In order to determine the inductances of the machine, a new method is proposed.
Keywords: permanent magnet generators (PMG), synchronous machine parameters, test procedures, inductances
Procedia PDF Downloads 303
1565 Anthropometric Indices of Obesity and Coronary Artery Atherosclerosis: An Autopsy Study in South Indian Population
Authors: Francis Nanda Prakash Monteiro, Shyna Quadras, Tanush Shetty
Abstract:
The association between human physique and morbidity and mortality resulting from coronary artery disease has been studied extensively over several decades. Multiple studies have also been done on the correlation between the grade of atherosclerosis, coronary artery disease and anthropometric measurements. However, the number of autopsy-based studies is drastically smaller. It has been suggested that while in living subjects it would be expensive, difficult, and even harmful to use imaging modalities like CT scans and procedures involving contrast media to study mild atherosclerosis, no such harm is encountered in the study of autopsy cases. This autopsy-based study aimed to correlate the anthropometric measurements and indices of obesity, such as waist circumference (WC), hip circumference (HC), body mass index (BMI) and waist-hip ratio (WHR), with the degree of atherosclerosis in the right coronary artery (RCA), the main branch of the left coronary artery (LCA) and the left anterior descending artery (LADA) in 95 victims of South Indian origin of both genders between the ages of 18 and 75 years. The grading of atherosclerosis was done according to the criteria suggested by the American Heart Association. The study also analysed the correlation of the anthropometric measurements and indices of obesity with the number of coronaries affected by atherosclerosis in an individual. All the anthropometric measurements and the derived indices were found to be significantly correlated with each other in both genders, except for age, which was found to have a significant correlation only with the WHR. In both genders, a severe degree of atherosclerosis was most commonly observed in the LADA, followed by the LCA and RCA. The grade of atherosclerosis in the RCA was significantly related to the WHR in males, while the grade of atherosclerosis in the LCA and LADA was significantly related to the WHR in females. In both males and females, a significant relation was observed between the grade of atherosclerosis in the RCA and both WC and WHR, and between the grade of atherosclerosis in the LADA and HC. Anthropometric measurements and indices of obesity can be an effective means of identifying high-risk cases of atherosclerosis at an early stage, which can be effective in reducing the associated cardiac morbidity and mortality. A person with anthropometric measurements suggestive of mild atherosclerosis can be advised to modify their lifestyle and decrease their exposure to the other risk factors. Those with measurements suggestive of a higher degree of atherosclerosis can be referred for confirmatory procedures so that effective treatment can be started.
Keywords: atherosclerosis, coronary artery disease, indices, obesity
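For reference, the two derived indices used above follow their standard definitions (weight W in kilograms, height H in metres):

```latex
\mathrm{BMI} = \frac{W\;[\mathrm{kg}]}{H^{2}\;[\mathrm{m}^{2}]},
\qquad
\mathrm{WHR} = \frac{\mathrm{WC}}{\mathrm{HC}}
```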
Procedia PDF Downloads 66
1564 Hybrid Knowledge and Data-Driven Neural Networks for Diffuse Optical Tomography Reconstruction in Medical Imaging
Authors: Paola Causin, Andrea Aspri, Alessandro Benfenati
Abstract:
Diffuse Optical Tomography (DOT) is an emergent medical imaging technique which employs NIR light to estimate the spatial distribution of optical coefficients in biological tissues for diagnostic purposes, in a noninvasive and non-ionizing manner. DOT reconstruction is a severely ill-conditioned problem due to the prevalent scattering of light in the tissue. In this contribution, we present our research in adopting hybrid knowledge-driven/data-driven approaches which exploit the existence of well-assessed physical models and build neural networks upon them, integrating the available data. Namely, since in this context regularization procedures are mandatory to obtain a reasonable reconstruction [1], we explore the use of neural networks as tools to include prior information on the solution. 2. Materials and Methods: The idea underlying our approach is to leverage neural networks to solve PDE-constrained inverse problems of the form $q^{*} = \arg\min_{q} D(y, \tilde{y})$ (1), where $D$ is a loss function which typically contains a discrepancy measure (or data fidelity) term plus other possible ad-hoc designed terms enforcing specific constraints. In the context of inverse problems like (1), one seeks the optimal set of physical parameters $q$, given the set of observations $y$. Moreover, $\tilde{y}$ is the computable approximation of $y$, which may be obtained from a neural network but also in a classic way via the resolution of a PDE with given input coefficients (forward problem, Fig. 1). Due to the severe ill-conditioning of the reconstruction problem, we adopt a two-fold approach: i) we restrict the solutions (optical coefficients) to lie in a lower-dimensional subspace generated by auto-decoder type networks; this procedure forms priors of the solution (Fig. 1); ii) we use regularization procedures of the type $\hat{q}^{*} = \arg\min_{q} D(y, \tilde{y}) + R(q)$, where $R(q)$ is a regularization functional depending on regularization parameters which can be fixed a priori or learned via a neural network in a data-driven modality. To further improve the generalizability of the proposed framework, we also infuse physics knowledge via soft penalty constraints in the overall optimization procedure (Fig. 1). 3. Discussion and Conclusion: DOT reconstruction is severely hindered by ill-conditioning. The combined use of data-driven and knowledge-driven elements is beneficial and allows improved results to be obtained, especially with a restricted dataset and in the presence of variable sources of noise.
Keywords: inverse problem in tomography, deep learning, diffuse optical tomography, regularization
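A minimal sketch of the regularized formulation above, on a toy problem: the PDE forward model is replaced by a linear map A, and the learned prior by a simple Tikhonov term R(q) = (λ/2)‖q‖², minimized by gradient descent. All names and sizes are illustrative stand-ins, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 50))                 # ill-posed: 20 observations, 50 unknowns
q_true = np.zeros(50)
q_true[10:15] = 1.0                           # a localized "inclusion"
y = A @ q_true + 0.01 * rng.normal(size=20)   # noisy observations

lam, step = 0.1, 1e-3                         # regularization weight, step size
q = np.zeros(50)
for _ in range(5000):
    # gradient of 0.5*||A q - y||^2 + (lam/2)*||q||^2, i.e. D(y, ~y) + R(q)
    grad = A.T @ (A @ q - y) + lam * q
    q -= step * grad
print(np.linalg.norm(q - q_true) / np.linalg.norm(q_true))  # relative error
```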
Procedia PDF Downloads 74
1563 Exploring Counting Methods for the Vertices of Certain Polyhedra with Uncertainties
Authors: Sammani Danwawu Abdullahi
Abstract:
Vertex enumeration algorithms explore the methods and procedures of generating the vertices of general polyhedra formed by systems of equations or inequalities. These problems of enumerating the extreme points (vertices) of general polyhedra are shown to be NP-hard. This led to exploring how to count the vertices of general polyhedra without listing them, which is also shown to be #P-complete. Some fully polynomial randomized approximation schemes (FPRAS) for counting the vertices of some special classes of polyhedra associated with down-sets, independent sets, 2-knapsack problems and 2 x n transportation problems are presented, together with some discovered open problems.
Keywords: counting with uncertainties, mathematical programming, optimization, vertex enumeration
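A brute-force sketch of vertex enumeration for a 2-D polyhedron {x : Ax ≤ b}: every vertex is the intersection of two constraints that also satisfies all the others. The combinatorial number of candidate intersections illustrates why explicit listing becomes intractable in higher dimensions; the constraint data below are illustrative:

```python
import itertools
import numpy as np

A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0], [1.0, -1.0]])
b = np.array([0.0, 0.0, 4.0, 2.0])   # x >= 0, y >= 0, x + y <= 4, x - y <= 2

vertices = []
for i, j in itertools.combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:         # parallel constraints: no vertex
        continue
    x = np.linalg.solve(M, b[[i, j]])         # intersection of the two lines
    if np.all(A @ x <= b + 1e-9):             # feasible for every constraint
        vertices.append(tuple(np.round(x, 9)))
print(sorted(set(vertices)))                  # (0,0), (0,4), (2,0), (3,1)
```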
Procedia PDF Downloads 357
1562 Efficient Design of Distribution Logistics by Using a Model-Based Decision Support System
Abstract:
The design of distribution logistics has a decisive impact on a company's logistics costs and performance. Hence, such solutions make an essential contribution to corporate success. This article describes a decision support system for analyzing the potential of distribution logistics in terms of logistics costs and performance. In contrast to previous procedures of business process re-engineering (BPR), this method maps distribution logistics holistically under variable distribution structures. Combined with qualitative measures, the decision support system will contribute to a more efficient design of distribution logistics.
Keywords: decision support system, distribution logistics, potential analyses, supply chain management
Procedia PDF Downloads 406
1561 Biogeography Based CO2 and Cost Optimization of RC Cantilever Retaining Walls
Authors: Ibrahim Aydogdu, Alper Akin
Abstract:
In this study, minimization of the cost and the CO2 emission of RC retaining wall design is performed by the Biogeography Based Optimization (BBO) algorithm. This has been achieved by developing computer programs utilizing the BBO algorithm which minimize the cost and the CO2 emission of RC retaining walls. Objective functions of the optimization problem are defined as the cost, the CO2 emission, and a weighted aggregate of the cost and CO2 functions of the RC retaining walls. In the formulation of the optimum design problem, the height and thickness of the stem, the length of the toe projection, the thickness of the stem at base level, the length and thickness of the base, the depth and thickness of the key, the distance from the toe to the key, and the number and diameter of the reinforcement bars are treated as design variables. Flexural and shear strength constraints and minimum/maximum limitations for the reinforcement bar areas are derived from the American Concrete Institute (ACI 318-14) design code. Moreover, the development length conditions for suitable detailing of reinforcement are treated as a constraint. The obtained optimum designs must satisfy the factors of safety for failure modes (overturning, sliding and bearing), strength, serviceability and other required limitations to attain practically acceptable shapes. To demonstrate the efficiency and robustness of the presented BBO algorithm, an optimum design example for retaining walls is presented and the results are compared to those previously obtained in the literature.
Keywords: biogeography, meta-heuristic search, optimization, retaining wall
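A generic BBO sketch with rank-based immigration/emigration rates, migration, mutation and elitism; the retaining-wall cost/CO2 objective and the ACI 318-14 constraints are replaced here by an unconstrained toy objective, so this is an assumption-laden stand-in rather than the authors' program:

```python
import numpy as np

def bbo(obj, dim=10, pop=20, iters=200, pmut=0.05, lo=-5.0, hi=5.0, seed=1):
    rng = np.random.default_rng(seed)
    H = rng.uniform(lo, hi, (pop, dim))          # habitats = candidate designs
    for _ in range(iters):
        fit = np.array([obj(h) for h in H])
        H = H[np.argsort(fit)]                   # sort best-first (minimization)
        mu = np.linspace(1.0, 0.0, pop)          # emigration: best emigrates most
        lam = 1.0 - mu                           # immigration: worst immigrates most
        new = H.copy()
        for i in range(pop):
            for d in range(dim):
                if rng.random() < lam[i]:        # migration: copy a feature (SIV)
                    j = rng.choice(pop, p=mu / mu.sum())   # from an emigrating habitat
                    new[i, d] = H[j, d]
                if rng.random() < pmut:          # mutation
                    new[i, d] = rng.uniform(lo, hi)
        new[0] = H[0]                            # elitism: keep the best habitat
        H = new
    fit = np.array([obj(h) for h in H])
    return H[np.argmin(fit)], fit.min()

best, val = bbo(lambda x: np.sum(x ** 2))        # sphere function as a stand-in
print(val)
```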
Procedia PDF Downloads 397
1560 Test Method Development for Evaluation of Process and Design Effect on Reinforced Tube
Authors: Cathal Merz, Gareth O’Donnell
Abstract:
Coil reinforced thin-walled (CRTW) tubes are used in medicine to treat problems affecting blood vessels within the body through minimally invasive procedures. The CRTW tube considered in this research makes up part of such a device and is inserted into the patient via the femoral or brachial arteries and manually navigated to the site in need of treatment. This procedure replaces the requirement to perform open surgery but is limited by the reduction of blood vessel lumen diameter and the increase in tortuosity of blood vessels deep in the brain. In order to maximize the capability of these procedures, CRTW tube devices are being manufactured with decreasing wall thicknesses in order to deliver treatment deeper into the body and to allow passage of other devices through their inner diameter. This introduces significant stresses to the device materials, which has resulted in an observed increase in the breaking of the proximal segment of the device into two separate pieces after it has failed by buckling. As there is currently no international standard for measuring the mechanical properties of these CRTW tube devices, it is difficult to accurately analyze this problem. The aim of the current work is to address this discrepancy in the biomedical device industry by developing a measurement system that can be used to quantify the effect of process and design changes on CRTW tube performance, aiding in the development of better performing, next-generation devices. Using materials testing frames, micro-computed tomography (micro-CT) imaging, experiment planning, analysis of variance (ANOVA), t-tests and regression analysis, test methods have been developed for assessing the impact of process and design changes on the device. The major findings of this study have been an insight into the suitability of buckle and three-point bend tests for measuring the effect of varying processing factors on the device's performance, and guidelines for interpreting the output data from the test methods. The findings of this study are of significant interest with respect to verifying and validating key process and design changes associated with the device structure and material condition. Test method integrity evaluation is explored throughout.
Keywords: neurovascular catheter, coil reinforced tube, buckling, three-point bend, tensile
Procedia PDF Downloads 117
1559 How to Modernise the European Competition Network (ECN)
Authors: Dorota Galeza
Abstract:
This paper argues that networks, such as the ECN and the American network, are affected by certain small events which are inherent to path dependence and preclude full evolution towards efficiency. It is advocated that the American network is superior to the ECN in many respects due to its greater flexibility and longer history. This stems in particular from the creation of the American network, which was based on a small number of cases. Such a structure encourages further changes and modifications which are not necessarily radical. The ECN, by contrast, was established by legislative action, which explains its rigid structure and resistance to change. This paper is an attempt to transpose the superiority of the American network onto the ECN. It looks at concepts such as judicial cooperation, harmonisation of procedure, peer review, regulatory impact assessments (RIAs), and dispute resolution procedures.
Keywords: antitrust, competition, networks, path dependence
Procedia PDF Downloads 315
1558 Development of Risk-Based Ambient Air Quality Standards in the Russian Federation on the Basis of Risk Assessment Procedures Harmonized with International Approaches
Authors: Nina V. Zaitseva, Pavel Z. Shur, Nina G. Atiskova
Abstract:
Nowadays, harmonization of sanitary and hygienic standards of environmental quality with international standards is a crucial part of the integration of Russia into the international community. Harmonization of Russian and international ambient air quality standards may be realized through risk-based standards development. In this paper, approaches to risk-based standards development and examples of their implementation are presented.
Keywords: harmonization, health risk assessment, evolutionary modelling, benchmark level, nickel, manganese
Procedia PDF Downloads 390
1557 The Influence of Republican Culture in the Professional Education Reforms in Brazil (1892-1930)
Authors: Milene Magalhães Pinto, Irlen Antônio Gonçalves
Abstract:
This paper falls within the area of the history of education in Brazil and is descriptive and exploratory in nature. It is built on the belief that professional education is organized under political guidelines and solidified through institutionalized discourses; by studying these discourses, it is possible to know its mission concerning the society in which it operates. Our purpose is to analyze how Republican political culture yielded changes in public education through reforms to professional education in the First Republic, based on seven legislative procedures enacted in the Legislature of the State of Minas Gerais. The Republican effort to reform teaching was the result of a conception of society that aspired to advance the country by way of the national worker.
Keywords: professional education, republican political culture, education reforms, Brazil
Procedia PDF Downloads 495
1556 Algorithms Utilizing Wavelet to Solve Various Partial Differential Equations
Authors: K. P. Mredula, D. C. Vakaskar
Abstract:
The article traces the development and evolution of various algorithms for solving partial differential equations using combinations of wavelets with already-explored solution procedures. The approach presents a study spanning a decade, with remarks on the modifications involved in implementing wavelet multi-resolution, the finite difference approach, the finite element method and the finite volume method for a variety of partial differential equations in areas like plasma physics, astrophysics, shallow water models, modified Burgers equations used in optical fibers, biology, fluid dynamics, chemical kinetics, etc.
Keywords: multi-resolution, Haar wavelet, partial differential equation, numerical methods
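A minimal sketch of the one-level Haar decomposition underlying wavelet multi-resolution methods: the signal is split into coarse averages (approximation) and details, and can be reconstructed exactly. This shows the transform itself, not any particular PDE solver from the survey:

```python
import numpy as np

def haar_step(x):
    """One Haar level: x has even length; returns (approximation, detail)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)    # pairwise scaled averages
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)    # pairwise scaled differences
    return a, d

def haar_inverse(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_step(x)
print(np.allclose(haar_inverse(a, d), x))     # True: perfect reconstruction
```

Applying haar_step recursively to the approximation coefficients yields the full multi-resolution hierarchy on which the surveyed hybrid schemes operate.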
Procedia PDF Downloads 299
1555 Fracture Toughness Characterizations of Single Edge Notch (SENB) Testing Using DIC System
Authors: Amr Mohamadien, Ali Imanpour, Sylvester Agbo, Nader Yoosef-Ghodsi, Samer Adeeb
Abstract:
The fracture toughness resistance curve (e.g., the J-R curve and the crack tip opening displacement (CTOD) or δ-R curve) is important in facilitating strain-based design and integrity assessment of oil and gas pipelines. This paper presents laboratory experimental data to characterize the fracture behavior of pipeline steel. The influential parameters associated with the fracture of API 5L X52 pipeline steel, including different initial crack sizes, were experimentally investigated for a single edge notch bend (SENB) geometry. A total of 9 small-scale specimens with different crack length to specimen depth ratios were prepared and tested using single edge notch bending. ASTM E1820 and BS 7448 provide testing procedures to construct the fracture resistance curve (load-CTOD, CTOD-R, or J-R) from test results. However, these procedures are limited by standard specimen dimensions, displacement gauges, and calibration curves. To overcome these limitations, this paper presents the use of small-scale specimens and a 3D digital image correlation (DIC) system to extract the parameters required for fracture toughness estimation. Fracture resistance curve parameters in terms of crack mouth opening displacement (CMOD), crack tip opening displacement (CTOD), and crack growth length (∆a) were obtained from the test results by utilizing the DIC system, and an improved regression-fitted resistance function (CTOD vs. crack growth, or J-integral vs. crack growth) that depends on a variety of initial crack sizes was constructed and presented. The obtained results were compared to the available results of classical physical measurement techniques, and acceptable agreement was observed. Moreover, a case study was implemented to estimate the maximum strain value that initiates stable crack growth, which might be of interest for developing more accurate strain-based damage models. The results of the laboratory testing in this study offer a valuable database to develop and validate damage models that are able to predict crack propagation of pipeline steel, accounting for the influential parameters associated with fracture toughness.
Keywords: fracture toughness, crack propagation in pipeline steels, CTOD-R, strain-based damage model
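A sketch of the resistance-curve regression step, assuming the power-law form J = C1·(Δa)^C2 used by ASTM E1820 and purely hypothetical (Δa, J) test data; the authors' improved, crack-size-dependent fit is not reproduced here:

```python
import numpy as np

da = np.array([0.15, 0.30, 0.50, 0.80, 1.20, 1.60])   # crack growth, mm (hypothetical)
J  = np.array([120., 180., 240., 310., 390., 450.])   # J-integral, kJ/m^2 (hypothetical)

# power law J = C1 * da**C2 becomes linear in log space: ln J = ln C1 + C2 ln da
C2, lnC1 = np.polyfit(np.log(da), np.log(J), 1)
C1 = np.exp(lnC1)
print(f"J = {C1:.1f} * da^{C2:.3f}")                   # fitted J-R curve
J_fit = C1 * da ** C2                                  # values on the fitted curve
```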
Procedia PDF Downloads 63
1554 Intersubjectivity of Forensic Handwriting Analysis
Authors: Marta Nawrocka
Abstract:
In each legal proceeding in which expert evidence is presented, a major concern is the assessment of the evidential value of expert reports. Judicial institutions rely heavily on expert reports when making decisions, because they usually do not possess the 'special knowledge' from certain fields of science that would allow them to verify the results presented. In handwriting studies, standards of analysis have been developed. They unify the procedures used by experts in comparing signs and in constructing expert reports. However, the methods used by experts are usually of a qualitative nature. They rely on the application of the expert's knowledge and experience and, in effect, leave a significant margin in the assessment. Moreover, the standards used by experts are still not very precise, and the process of reaching conclusions is poorly understood. The above-mentioned circumstances indicate that expert opinions in the field of handwriting analysis may, for many reasons, not be sufficiently reliable. It is assumed that this state of affairs has its source in a very low level of intersubjectivity of the measuring scales and analysis procedures which constitute elements of this kind of analysis. Intersubjectivity is a feature of cognition which (in relation to methods) indicates the degree of consistency of results that different people obtain using the same method. The higher the level of intersubjectivity, the more reliable and credible the method can be considered. The aim of the conducted research was to determine the degree of intersubjectivity of the methods used by experts in handwriting analysis. 30 experts took part in the study, and each of them received two signatures, with varying degrees of readability, for analysis. Their task was to distinguish graphic characteristics in the signature, estimate the evidential value of the found characteristics, and estimate the evidential value of the signature. The obtained results were compared with each other using Krippendorff's alpha statistic, which numerically determines the degree of agreement of the results (assessments) that different people obtain under the same conditions using the same method. The estimation of the degree of agreement of the experts' results for each of these tasks allowed the degree of intersubjectivity of the studied method to be determined. The study showed that during the analysis, the experts identified different signature characteristics and attributed different evidential value to them. In this scope, intersubjectivity turned out to be low. In addition, it turned out that experts named and described the same characteristics in various ways, and the language used was often inconsistent and imprecise; significant differences were thus noted in the language and nomenclature applied. On the other hand, experts attributed a similar evidential value to the entire signature (set of characteristics), which indicates that in this range they were relatively consistent.
Keywords: forensic sciences experts, handwriting analysis, inter-rater reliability, reliability of methods
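A minimal implementation of Krippendorff's alpha for nominal data (α = 1 − D_o/D_e, computed from the coincidence matrix), with hypothetical expert ratings standing in for the study's data; missing assessments are marked None:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(data):
    """data: rows = raters, columns = units; values nominal or None (missing)."""
    o = Counter()                                 # coincidence matrix
    for unit in zip(*data):                       # one column = one rated unit
        vals = [v for v in unit if v is not None]
        m = len(vals)
        if m < 2:
            continue                              # units with one rating carry no info
        for c, k in permutations(vals, 2):        # ordered value pairs within the unit
            o[(c, k)] += 1.0 / (m - 1)
    n_c = Counter()
    for (c, _), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())
    d_o = sum(w for (c, k), w in o.items() if c != k)           # observed disagreement
    d_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    return 1.0 - d_o / d_e

experts = [["high", "low", "low", "mid"],         # hypothetical assessments
           ["high", "low", "mid", "mid"],
           ["high", "mid", "low", None]]
print(round(krippendorff_alpha_nominal(experts), 3))
```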
Procedia PDF Downloads 149
1553 A Study of Issues and Mitigations on Distributed Denial of Service and Medical Internet of Things Devices
Authors: Robin Singh, Jing-Chiou Liou
Abstract:
Internet of Things (IoT) devices are used heavily as part of our everyday routines. Through improved communication and automated procedures, their popularity has assisted users in raising the quality of work. These devices are used in healthcare to better collect patients' data for their treatment. They are generally considered safe and secure. However, there is some possibility that loopholes exist which manufacturers need to identify before a hacker takes advantage of them. For this study, we focused on two medical IoT devices: pacemakers and hearing aids. The aim of this paper is to identify whether there is any likelihood of these medical devices being hijacked and used as a botnet in Distributed Denial-of-Service attacks. Moreover, some mitigation strategies are proposed to better secure these devices.
Keywords: cybersecurity, DDoS, IoT, medical devices
Procedia PDF Downloads 86
1552 Management Information System to Help Managers for Providing Decision Making in an Organization
Authors: Ajayi Oluwasola Felix
Abstract:
Management information systems (MIS) provide information for the managerial activities in an organization. The main purpose of this research is to show that MIS provides accurate and timely information necessary to facilitate the decision-making process and to enable the organization's planning, control and operational functions to be carried out effectively. MIS is basically concerned with processing data into information, which is then communicated to the various departments in an organization for appropriate decision-making. MIS is a subset of the overall planning and control activities covering the application of humans, technologies, and procedures of the organization. The information system is the mechanism to ensure that information is available to the managers in the form they want it and when they need it.
Keywords: Management Information Systems (MIS), information technology, decision-making, MIS in organizations
Procedia PDF Downloads 556
1551 Co-Evolutionary Fruit Fly Optimization Algorithm and Firefly Algorithm for Solving Unconstrained Optimization Problems
Authors: R. M. Rizk-Allah
Abstract:
This paper presents a co-evolutionary fruit fly optimization algorithm based on the firefly algorithm (CFOA-FA) for solving unconstrained optimization problems. The proposed algorithm integrates the merits of the fruit fly optimization algorithm (FOA), the firefly algorithm (FA) and an elite strategy to refine the performance of classical FOA. Moreover, a co-evolutionary mechanism is implemented by applying FA procedures to ensure the diversity of the swarm. Finally, the proposed CFOA-FA algorithm is tested on several benchmark problems from the usual literature, and the numerical results demonstrate its superiority in finding the global optimal solution.
Keywords: firefly algorithm, fruit fly optimization algorithm, unconstrained optimization problems
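For reference, a sketch of the standard firefly-algorithm move that such FA procedures apply: each firefly is attracted to every brighter one, with attractiveness β0·exp(−γr²) decaying in distance, plus a small random walk. This is the generic FA step, not the authors' combined CFOA-FA code:

```python
import numpy as np

def firefly_minimize(obj, dim=5, n=15, iters=100, beta0=1.0, gamma=1.0,
                     alpha=0.2, lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n, dim))
    f = np.array([obj(x) for x in X])
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if f[j] < f[i]:                        # j is brighter (lower cost)
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)  # attractiveness
                    X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(dim) - 0.5)
                    X[i] = np.clip(X[i], lo, hi)
                    f[i] = obj(X[i])
    k = np.argmin(f)
    return X[k], f[k]

x_best, f_best = firefly_minimize(lambda x: np.sum(x ** 2))  # sphere test function
print(f_best)
```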
Procedia PDF Downloads 536
1550 Design of Digital IIR Filter Using Opposition Learning and Artificial Bee Colony Algorithm
Authors: J. S. Dhillon, K. K. Dhaliwal
Abstract:
In almost all digital filtering applications, digital infinite impulse response (IIR) filters are preferred over finite impulse response (FIR) filters because they provide much better performance, lower computational cost and smaller memory requirements for similar magnitude specifications. However, digital IIR filters are generally multimodal with respect to the filter coefficients, and therefore reliable methods that can provide globally optimal solutions are required. The artificial bee colony (ABC) algorithm is one such recently introduced meta-heuristic optimization algorithm. In some cases, however, it searches the solution space insufficiently, resulting in a weak exchange of information, and hence is not able to return better solutions. To overcome this deficiency, an opposition-based learning strategy is incorporated in ABC, and hence a modified version called the oppositional artificial bee colony (OABC) algorithm is proposed in this paper. Duplication of members is avoided during the run, which also augments the exploration ability. The developed algorithm is then applied to the design of optimal and stable digital IIR filter structures, where the design of low-pass (LP) and high-pass (HP) filters is carried out. Fuzzy theory is applied to maximize satisfaction of the minimum magnitude error and stability constraints. To check the effectiveness of OABC, the results are compared with some well-established filter design techniques, and it is observed that in most cases OABC returns better or at least comparable results.
Keywords: digital infinite impulse response filter, artificial bee colony optimization, opposition based learning, digital filter design, multi-parameter optimization
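A sketch of the opposition-based learning step that is incorporated into ABC: for each candidate x in [a, b], the opposite point is a + b − x, and the fitter of each pair is kept, improving coverage of the solution space. The objective below is a stand-in for the actual filter-design cost:

```python
import numpy as np

def obl_step(pop, obj, lo, hi):
    """One opposition-based learning step on a population of candidates."""
    opposite = lo + hi - pop                    # elementwise opposite population
    both = np.vstack([pop, opposite])
    fit = np.array([obj(x) for x in both])
    keep = np.argsort(fit)[: len(pop)]          # keep the better half
    return both[keep]

rng = np.random.default_rng(0)
pop = rng.uniform(-5.0, 5.0, (10, 4))           # e.g. candidate filter coefficients
pop = obl_step(pop, lambda x: np.sum(x ** 2), -5.0, 5.0)
```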
Procedia PDF Downloads 477
1549 Residual Life Estimation of K-out-of-N Cold Standby System
Authors: Qian Zhao, Shi-Qi Liu, Bo Guo, Zhi-Jun Cheng, Xiao-Yue Wu
Abstract:
Cold standby redundancy is considered to be an effective mechanism for improving system reliability and is widely used in industrial engineering. However, because of the complexity of the reliability structure, there is little literature studying the residual life of cold standby systems consisting of complex components. In this paper, a simulation method is presented to predict the residual life of a k-out-of-n cold standby system. In practical cases, failure information of a system is either unknown, partly unknown or completely known. Our proposed method is designed to deal with these three scenarios, respectively. Differences between the procedures are analyzed. Finally, numerical examples are used to validate the proposed simulation method.
Keywords: cold standby system, k-out-of-n, residual life, simulation sampling
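A simulation sketch for the simplest setting (the "completely known" scenario, exponential lifetimes, instant switching): k identical units are active and the n − k cold spares replace failed units, so the system fails at the (n − k + 1)-th failure. The mean residual life at age t is estimated as E[T − t | T > t]; parameter values are illustrative:

```python
import numpy as np

def residual_life(n, k, lam, t, runs=200_000, seed=0):
    rng = np.random.default_rng(seed)
    # with k active exponential(lam) units, each stage-to-failure time is
    # Exp(k * lam) by memorylessness; the system survives n - k + 1 stages
    stages = rng.exponential(1.0 / (k * lam), size=(runs, n - k + 1))
    T = stages.sum(axis=1)                   # sampled system lifetimes
    alive = T > t
    return (T[alive] - t).mean()             # Monte Carlo mean residual life

print(residual_life(n=5, k=3, lam=0.1, t=2.0))
```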
Procedia PDF Downloads 401
1548 Thermal Radiation and Chemical Reaction Effects on MHD Casson Fluid Past a Permeable Stretching Sheet in a Porous Medium
Authors: Y. Sunita Rani, Y. Hari Krishna, M. V. Ramana Murthy, K. Sudhaker Reddy
Abstract:
This article studies the effects of radiation and chemical reaction on MHD Casson fluid flow past a permeable stretching sheet in a porous medium. Suitable transformations are used to convert the governing partial differential equations into ordinary ones, which are then solved by numerical procedures such as the Runge-Kutta-Fehlberg shooting technique. The effects of various governing parameters on the velocity, temperature and concentration are displayed through graphs and discussed numerically.
Keywords: MHD, Casson fluid, porous medium, permeable stretching sheet
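A sketch of the shooting idea on the classical Crane stretching-sheet problem f''' + f·f'' − f'² = 0 with f(0) = 0, f'(0) = 1, f'(∞) = 0, a reduced stand-in for the full Casson-fluid system: the unknown f''(0) is found by root-finding so that the far-field condition holds (the exact answer is f''(0) = −1, since f = 1 − e^(−η)):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def rhs(eta, y):                 # y = [f, f', f'']
    f, fp, fpp = y
    return [fp, fpp, fp * fp - f * fpp]   # f''' = f'^2 - f f''

def fp_at_infinity(s, eta_max=6.0):
    """Integrate with guessed f''(0) = s; return f'(eta_max), which should vanish."""
    sol = solve_ivp(rhs, (0.0, eta_max), [0.0, 1.0, s], method="RK45",
                    rtol=1e-8, atol=1e-8)
    return sol.y[1, -1]

s_star = brentq(fp_at_infinity, -1.2, -0.8)   # shoot on the unknown f''(0)
print(s_star)                                  # approximately -1.0
```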
Procedia PDF Downloads 125
1547 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning
Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar
Abstract:
As the quantity and complexity of computing in large-scale software systems increase, distributed system computing becomes increasingly important. A distributed system realizes high-performance computing through collaboration between different computing resources. Without efficient resource scheduling, the misuse of distributed computing may cause resource waste and high costs. However, resource scheduling is usually an NP-hard problem, so we cannot find a general solution. Some optimization algorithms exist, like the genetic algorithm and ant colony optimization, but the large scale of distributed systems makes these traditional optimization algorithms challenging to apply. Heuristic and machine learning algorithms are usually applied in this situation to ease the computing load. As a result, we review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning method that utilizes the perceptual ability of neural networks and the decision-making ability of reinforcement learning. Using this machine learning method, we try to find important factors that influence the performance of distributed system computing and help the distributed system perform efficient computing resource scheduling. This paper surveys the application of deep reinforcement learning to distributed system computing resource scheduling, proposes a deep reinforcement learning method that uses a recurrent neural network to optimize the resource scheduling, and outlines the challenges and improvement directions for DRL-based resource scheduling algorithms.
Keywords: resource scheduling, deep reinforcement learning, distributed system, artificial intelligence
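A toy policy-gradient (REINFORCE) sketch of RL-based scheduling: a stream of tasks is assigned to machines to minimize makespan. A linear softmax policy over machine loads stands in for the recurrent network described above; everything here (features, reward, hyperparameters) is an illustrative assumption, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)
n_machines, n_tasks = 4, 30
W = np.zeros(n_machines)                       # one weight per machine-load feature

def run_episode(tasks, sample=True):
    loads = np.zeros(n_machines)
    log_grads = []
    for d in tasks:
        logits = -W * loads                    # higher load -> lower preference
        p = np.exp(logits - logits.max()); p /= p.sum()
        a = rng.choice(n_machines, p=p) if sample else int(np.argmax(p))
        onehot = np.eye(n_machines)[a]
        log_grads.append(-loads * (onehot - p))   # grad of log pi(a) w.r.t. W
        loads[a] += d                          # task runtime adds to machine load
    return loads.max(), log_grads              # makespan, stored policy gradients

baseline = None
for episode in range(2000):
    tasks = rng.uniform(1.0, 10.0, n_tasks)
    makespan, grads = run_episode(tasks)
    reward = -makespan                         # shorter schedules = higher reward
    baseline = reward if baseline is None else 0.9 * baseline + 0.1 * reward
    for g in grads:                            # REINFORCE update with a baseline
        W += 0.01 * (reward - baseline) * g
print(run_episode(rng.uniform(1.0, 10.0, n_tasks), sample=False)[0])
```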
Procedia PDF Downloads 111
1546 Effect of Term of Preparation on Performance of Cool Chamber Stored White Poplar Hardwood Cuttings in Nursery
Authors: Branislav Kovačević, Andrej Pilipović, Zoran Novčić, Marina Milović, Lazar Kesić, Milan Drekić, Saša Pekeč, Leopold Poljaković Pajnik, Saša Orlović
Abstract:
Poplars are among the most important tree species used for phytoremediation in the northern hemisphere. They can be used either as direct "cleaners" of contaminated soils or as buffer zones preventing the contaminant plume from reaching the surrounding environment. In order to produce appropriate planting material for this purpose, there is a long process of breeding the most favorable candidates. Although poplar propagation technology has been evolving for decades, white poplar nursery production, as well as the establishment of short-rotation coppice plantations, still depends considerably on the survival of hardwood cuttings. This is why easy rooting is among the most desirable properties in white poplar breeding. On the other hand, there are many opportunities for optimizing the technological procedures to meet the demands of a particular genotype (clonal technology). In this study, the effect of the term of hardwood cutting preparation on the survival of cuttings of four white poplar clones, and on the further growth of rooted cuttings, was tested in nursery conditions. There were three terms of cutting preparation: the beginning of February (2nd Feb 2023), the beginning of March (3rd Mar 2023) and the end of March (21st Mar 2023), which is regarded as the standard term. The cuttings were stored in a cool chamber at 2±2°C. All cuttings were planted on the same date (11th Apr 2023) in soil prepared with rotary tillage, and then cultivated by usual nursery procedures. According to the results obtained after bud set (29th Sept 2023), there were significant differences in the survival and growth of rooted cuttings between the examined terms of cutting preparation. There were also significant differences in the reaction of the examined clones to the terms of cutting preparation. Overall, the best results were achieved by cuttings prepared at the first term (2nd Feb 2023) (survival rate of 39.4%), while performance after the two later preparation terms was significantly poorer (20.5% after the second and 16.5% after the third term). These results stress the significance of dormancy preservation in cuttings of the examined white poplar clones for their survival, which could be especially important in the context of climate change. Differences in the clones' reactions to the term of cutting preparation suggest the necessity of adjusting the technology to the needs of a particular clone, i.e., the design of clone-specific technology.
Keywords: rooting, Populus alba, nursery, clonal technology
Procedia PDF Downloads 64
1545 Electromyographic Analysis of Biceps Brachii during Golf Swing and Review of Its Impact on Return to Play Following Tendon Surgery
Authors: Amin Masoumiganjgah, Luke Salmon, Julianne Burnton, Fahimeh Bagheri, Gavin Lenton, S. L. Ezekial Tan
Abstract:
Introduction: The incidence of proximal biceps tenodesis and acute distal biceps repair is increasing, and rehabilitation protocols following both are variable. Golf is a popular sport within Australia, and the Gold Coast has become a mecca for golfers, with more courses per capita than anywhere else in the world. Currently, there are no clear guidelines regarding return to golf play following biceps procedures. The aim of this study was to determine biceps brachii activation during the golf swing through electromyographic analysis and, subsequently, to aid rehabilitation guidelines and return to golf following tenodesis and repair. Methods: Subjects were amateur golfers with no previous upper limb surgery. Surface electromyography (EMG) and high-speed video recording were used to analyse activation of the left and right biceps brachii and the anterior deltoid during the golf swing. Each participant's maximum voluntary contraction (MVC) was recorded, and they were then required to hit a golf ball aiming for specific distances of 2, 50, 100 and 150 metres at a driving range. Noraxon myoResearch and Matlab were used for data analysis. Mean %MVC was calculated for leading and trailing arms during the full swing and its 4 phases: back-swing, acceleration, early follow-through and late follow-through. Results: 12 golfers (2 female and 10 male) participated in the study. Median age was 27 (25 – 38), with all being right-handed. Over all distances, the mean activation of the short and long heads of biceps brachii was < 10% through the full swing. When breaking down the 50, 100 and 150 m swings into phases, mean MVC activation was lowest in backswing (5.1%), followed by acceleration (9.7%), early follow-through (9.2%), and late follow-through (21.4%). There was more variation and slightly higher activation in the right biceps (trailing arm) in backswing, acceleration, and early follow-through, with higher activation in the leading arm in late follow-through (25.4% leading, 17.3% trailing). 2 m putts resulted in low MVC values (3.1%) with little variation across swing phases. There was considerable individual variation in results – one tense subject averaged 11.0% biceps MVC through the 2 m putting stroke, and others recorded peak mean MVC biceps activations of 68.9% at 50 m, 101.3% at 100 m, and 111.3% at 150 m. Discussion: Previous studies have investigated the role of rotator cuff, spine, and hip muscles during the golf swing; however, to our knowledge, this is the first study to investigate the activation of biceps brachii. Many rehabilitation programs following a biceps tenodesis or repair allow active range against gravity and restrict strengthening exercises until 6 weeks, and this does not appear to be associated with any adverse outcome. Previous studies demonstrate that a range of < 10% MVC is similar to the unloaded biceps brachii during walking (1), that active elbow flexion with the hand positioned either in pronation or supination produces MVC < 20% throughout range (2), and that elbow flexion with a 4 kg dumbbell can produce mean MVCs of around 40% (3). Our study demonstrates that increasing activation is associated with the leading arm, increasing shot distance and the late follow-through phase. Although the cohort mean MVC of the biceps brachii is < 10% through the full swing, variability is high, and biceps activation reaches peak mean MVCs of over 100% in different swing phases for some individuals.
Given these EMG values, caution is advised when advising patients after biceps procedures to return to long-distance golf shots, particularly when the leading arm is involved. Even though it would appear that putting would be as safe as having an unloaded hand out of a sling following biceps procedures, the variability of activation patterns across different golfers leads us to caution against accelerated golf rehabilitation in those who may be particularly tense golfers. The 50 m short iron shot was too long to be considered a chip shot, and more work can be done in this area to determine the safety of chipping.
Keywords: electromyographic analysis, biceps brachii rupture, golf swing, tendon surgery
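For readers unfamiliar with the %MVC normalization used above, here is a generic sketch (rectify, smooth with an RMS envelope, and express the swing-phase signal as a percentage of the MVC envelope). The sampling rate and signals are hypothetical stand-ins, not the Noraxon/Matlab pipeline used in the study:

```python
import numpy as np

def rms_envelope(emg, fs, win_ms=100):
    """Moving RMS envelope of a raw EMG signal."""
    w = max(1, int(fs * win_ms / 1000))
    sq = np.convolve(emg ** 2, np.ones(w) / w, mode="same")
    return np.sqrt(sq)

fs = 1000                                    # Hz (assumed sampling rate)
rng = np.random.default_rng(0)
mvc_trial = rng.normal(0.0, 1.0, 3 * fs)     # stand-in for the recorded MVC trial
swing_phase = rng.normal(0.0, 0.25, fs)      # stand-in for one swing-phase segment

mvc_ref = rms_envelope(mvc_trial, fs).max()  # reference maximum contraction level
pct_mvc = 100.0 * rms_envelope(swing_phase, fs) / mvc_ref
print(f"mean activation: {pct_mvc.mean():.1f} %MVC")
```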
Procedia PDF Downloads 81
1544 A Development of Science Instructional Model Based on STEM Education Approach to Enhance Scientific Mind and Problem Solving Skills for Primary Students
Authors: Prasita Sooksamran, Wareerat Kaewurai
Abstract:
STEM is an integrated teaching approach promoted by the Ministry of Education in Thailand. STEM education is an integrated approach to teaching science, technology, engineering, and mathematics. It has been questioned by Thai teachers on the grounds of how to integrate STEM into the classroom. Therefore, the main objective of this study is to develop a science instructional model based on the STEM approach to enhance the scientific mind and problem-solving skills of primary students. This study is participatory action research and follows these steps: 1) develop a model and 2) seek the advice of experts regarding the teaching model. Developing the instructional model began with the collection and synthesis of information from relevant documents, related research and other sources in order to create a prototype instructional model; the validity and relevance of the instructional model were then examined by a panel of nine experts. The findings were as follows: 1. The developed instructional model comprised principles, an objective, content, operational procedures and learning evaluation. There were five principles: 1) learning based on the natural curiosity of primary school level children, leading to knowledge inquiry, understanding and knowledge construction; 2) learning based on the interrelation between people and environment; 3) learning based on concrete learning experiences, exploration and the seeking of knowledge; 4) learning based on the self-construction of knowledge, creativity and innovation; and 5) relating findings to real life and the solving of real-life problems. The objective of this instructional model is to enhance the scientific mind and problem-solving skills. Children are evaluated according to their achievements. Lesson content is based on science as a core subject, integrated with technology and mathematics at the grade 6 level according to the Basic Education Core Curriculum 2008 guidelines. The operational procedures consisted of 6 steps: 1) Curiosity, 2) Collection of data, 3) Collaborative planning, 4) Creativity and Innovation, 5) Criticism and 6) Communication and Service. The learning evaluation is an authentic assessment based on continuous evaluation of all the material taught. 2. The experts agreed that the science instructional model based on the STEM education approach had an excellent level of validity and relevance (4.67, S.D. 0.50).
Keywords: instructional model, STEM education, scientific mind, problem solving
Procedia PDF Downloads 192
1543 Designing Program for Developing Self-Esteem of Gifted Children
Authors: Mohammad Jamalallail
Abstract:
Self-esteem implies a person's overall self-worth, self-respect, and self-value. It helps a person to maintain good mental health, personality, and achievement. Gifted students sometimes face emotional problems which cause decreases in their self-esteem; such problems include loneliness, anxiety, and depression. For this reason, designing a counseling program is necessary for gifted students who need a high level of self-esteem. To the best of the writer's knowledge, the available counseling programs have focused on the developmental aspect only, while the proposed program focuses on both clinical and developmental counseling by applying psychoanalytic play therapy. The proposed program consists of a theoretical background, drawing on approaches such as behavioral theory and RET, together with counseling procedures and therapeutic interventions.
Keywords: self-esteem, gifted, program, design
Procedia PDF Downloads 428
1542 Library Technologies and the Place of College Libraries in Teacher Training: Present Realities
Authors: Tony Ikponmwosa Obaseki
Abstract:
The paper studied college of education environments with specific insight into the technologies available in college libraries, with the objective of ascertaining the services rendered and the impact of information services on teacher training, to the overall development and benefit of the educational ecosystem. Problems were situated and assumptions formulated to guide the study. Twelve (12) college of education environments from the six geopolitical zones in Nigeria were comparatively studied, using twelve (12) librarians and six hundred (600) randomly selected trainee teachers. Analysis and presentation of findings will be done using well-stated scientific procedures.
Keywords: library, technologies, digital library, colleges of education, teacher training, education ecosystem
Procedia PDF Downloads 63
1541 Heuristic Spatial-Spectral Hyperspectral Image Segmentation Using Bands Quartile Box Plot Profiles
Authors: Mohamed A. Almoghalis, Osman M. Hegazy, Ibrahim F. Imam, Ali H. Elbastawessy
Abstract:
This paper presents a new hyperspectral image segmentation scheme that respects both spatial and spectral contexts. The scheme uses the 8-pixel spatial pattern to build a weight structure that holds the number of outlier bands for each pixel among its neighborhood windows in different directions. The number of outlier bands for a pixel is obtained using the bands' quartile box plot profiles over the spatial 8-pixel pattern windows. The quartile box plot weight structure represents the spatial-spectral context in the image. Instead of starting the segmentation process from single pixels, the proposed methodology starts with groups of pixels that have been shown to share the same spectral features with respect to their spatial context. As a result, the segmentation scheme starts with jigsaw pieces that build a mosaic image. The following step builds a model for each jigsaw piece in the mosaic image. Each jigsaw piece is then merged with another jigsaw piece using KNN applied to their bands' quartile box plot profiles. The scheme iterates until the required number of segments is reached. Experiments use two data sets obtained from the Earth Observing-1 (EO-1) sensor for Egypt and France. Qualitative analysis of the initial results showed encouraging agreement with ground truth. Quantitative analysis of the results will be included in the final paper.
Keywords: hyperspectral image segmentation, image processing, remote sensing, box plot
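A sketch of the per-pixel outlier-band count, assuming the usual 1.5×IQR box-plot fence (the paper does not state the fence, so this is an assumption): for each pixel, each band value is compared against the quartile fences computed over its 3×3 neighborhood window in that band:

```python
import numpy as np

def outlier_band_counts(cube):
    """cube: (rows, cols, bands) hyperspectral image; returns per-pixel counts."""
    rows, cols, bands = cube.shape
    counts = np.zeros((rows, cols), dtype=int)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            win = cube[r - 1:r + 2, c - 1:c + 2, :].reshape(9, bands)
            q1, q3 = np.percentile(win, [25, 75], axis=0)   # per-band quartiles
            iqr = q3 - q1
            low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr      # box-plot fences
            v = cube[r, c, :]
            counts[r, c] = int(np.sum((v < low) | (v > high)))
    return counts

cube = np.random.default_rng(0).normal(size=(16, 16, 32))   # toy data cube
print(outlier_band_counts(cube).max())
```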
Procedia PDF Downloads 605
1540 Approach to Quantify Groundwater Recharge Using GIS Based Water Balance Model
Authors: S. S. Rwanga, J. M. Ndambuki
Abstract:
Groundwater quantification needs a method which is not only flexible but also reliable in order to accurately quantify its spatial and temporal variability. As groundwater is dynamic and interdisciplinary in nature, an integrated approach combining remote sensing (RS) and GIS techniques is very useful in various groundwater management studies. Thus, the GIS water balance model (WetSpass), together with remote sensing, can be used to quantify groundwater recharge. This paper discusses the concept of WetSpass in combination with GIS for the quantification of recharge, with a view to managing water resources in an integrated framework. The paper presents the simulation procedures and the expected output after simulation. Preliminary data are presented from GIS output only.
Keywords: groundwater, recharge, GIS, WetSpass
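For context, WetSpass-type models estimate recharge as the residual of a per-cell seasonal water balance; to the best of our understanding it takes a form along these lines (P precipitation, I interception, S surface runoff, ET evapotranspiration, R recharge):

```latex
P = I + S + ET + R
\quad\Longrightarrow\quad
R = P - I - S - ET
```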
Procedia PDF Downloads 450
1539 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R
Authors: Pavel H. Llamocca, Victoria Lopez
Abstract:
The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments. This means that data from various different sources have to be integrated. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use. In this way, the quantity of applications based on Open Data is increasing. However, each government has its own procedures for publishing its data, which causes a variety of data set formats, because there are no international standards to specify the formats of data sets from Open Data bases. Due to this variety of formats, we must build a data integration process that is able to put together all kinds of formats. There are software tools developed to support the integration process, e.g. Data Tamer and Data Wrangler. The problem with these tools is that they need a data scientist to take part in the integration process as a final step. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data like temperature, energy consumption, air quality, solar radiation, wind speeds, etc. For the past two years, the government of Madrid has been publishing its Open Data bases of environmental indicators in real time. In the same way, other governments have published Open Data sets relative to the environment (like Andalucia or Bilbao). But all of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to make and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format, and the analysis process can be initiated in a computationally better way. So the tool presented in this work has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open our software tool, as a second approach, we also developed an implementation in the R language as a mature open-source technology. R is a really powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance. There are also some R libraries, like shiny, for building a graphic interface. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain to any developer, so that they can build their own applications.
Keywords: open data, R language, data integration, environmental data
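To make the integration idea concrete, here is a minimal sketch in which two hypothetical government feeds with different column names and units are mapped onto one common schema before analysis; the feed names, columns and values are illustrative assumptions, not the actual Madrid or Bilbao schemas:

```python
import pandas as pd

madrid = pd.DataFrame({"fecha": ["2015-01-01"], "temperatura_c": [11.2]})
bilbao = pd.DataFrame({"date": ["2015-01-01"], "temp_f": [52.3]})

def normalise_madrid(df):
    return pd.DataFrame({"date": pd.to_datetime(df["fecha"]),
                         "temperature_c": df["temperatura_c"],
                         "source": "madrid"})

def normalise_bilbao(df):
    # convert Fahrenheit to Celsius to match the common schema
    return pd.DataFrame({"date": pd.to_datetime(df["date"]),
                         "temperature_c": (df["temp_f"] - 32.0) * 5.0 / 9.0,
                         "source": "bilbao"})

integrated = pd.concat([normalise_madrid(madrid), normalise_bilbao(bilbao)],
                       ignore_index=True)      # one format for every government
print(integrated)
```

In the described tool, one such per-government adapter is implemented as a Hadoop procedure, so new sources can be added without a data scientist in the loop.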
Procedia PDF Downloads 315