Search results for: modifiable areal unit problem (MAUP)
7677 Discriminant Analysis as a Function of Predictive Learning to Select Evolutionary Algorithms in Intelligent Transportation System
Authors: Jorge A. Ruiz-Vanoye, Ocotlán Díaz-Parra, Alejandro Fuentes-Penna, Daniel Vélez-Díaz, Edith Olaco García
Abstract:
In this paper, we present the use of discriminant analysis to select the evolutionary algorithm that best solves instances of the vehicle routing problem with time windows. We use indicators as independent variables to obtain the classification criteria, and the best algorithm among the generic genetic algorithm (GA), random search (RS), steady-state genetic algorithm (SSGA), and sexual genetic algorithm (SXGA) as the dependent variable for the classification. The discriminant classifier was trained on classic instances of the vehicle routing problem with time windows obtained from the Solomon benchmark. The discriminant analysis achieved a classification accuracy of 66.7%.
Keywords: Intelligent Transportation Systems, data-mining techniques, evolutionary algorithms, discriminant analysis, machine learning
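As an illustrative sketch of the method this abstract describes, the code below fits a small linear discriminant classifier from scratch (class means plus a pooled within-class covariance, as in classical LDA): instance indicators are the features, and the label is the algorithm that performed best on that instance. The indicator values and labels are invented toy data, not the Solomon-benchmark instances used in the paper.

```python
import numpy as np

def lda_fit(X, y):
    """Fit linear discriminant analysis with a pooled covariance matrix:
    one mean per class, one shared within-class covariance."""
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    # pooled within-class covariance, weighted by each class's degrees of freedom
    pooled = sum(np.cov(X[y == c].T) * (np.sum(y == c) - 1) for c in classes)
    pooled /= len(y) - len(classes)
    cov_inv = np.linalg.pinv(pooled)
    priors = {c: float(np.mean(y == c)) for c in classes}
    return classes, means, cov_inv, priors

def lda_predict(model, X):
    """Assign each row of X to the class with the highest linear score."""
    classes, means, cov_inv, priors = model
    scores = []
    for c in classes:
        m = means[c]
        # linear discriminant score: x' S^-1 m - 0.5 m' S^-1 m + log prior
        scores.append(X @ cov_inv @ m - 0.5 * m @ cov_inv @ m + np.log(priors[c]))
    return classes[np.argmax(np.vstack(scores), axis=0)]

# hypothetical instance indicators (e.g., customer spread, window tightness)
X_train = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
                    [5.0, 5.0], [5.0, 6.0], [6.0, 5.0]])
y_train = np.array(["RS", "RS", "RS", "GA", "GA", "GA"])
model = lda_fit(X_train, y_train)
preds = lda_predict(model, X_train)
```

In practice, a trained model like this would be queried with the indicators of a new, unseen instance to pick which algorithm to run.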
Procedia PDF Downloads 472
7676 Administrative Traits and Capabilities of Mindanao State University Heads of Office as Perceived by Their Subordinates
Authors: Johanida L. Etado
Abstract:
The study determined the administrative traits and capabilities of Mindanao State University (MSU) heads of office as perceived by their subordinates. To get the primary data, a self-constructed survey questionnaire was used, validated by a panel of experts including the adviser. Most of the MSU heads of office were aware of their duties and responsibilities as managers. Considering their vast knowledge and expertise on the technical or task aspects of the job, it is not surprising that respondents perceived them to a high degree as work or task oriented. MSU heads of office were knowledgeable and capable in performing field-specific, specialized tasks, enabling them to coordinate work, solve problems, communicate effectively, and understand the big picture in light of the front-line work that must be performed. The significance of coaching or mentoring in this instance may be explained by the small number of Master's or Doctorate degree holders among employees, resulting in close supervision and mentorship by the heads of office. Interpersonal or human relations capability is a very effective way of dealing with people, as it gives heads the opportunity to influence their employees. In the case of MSU heads of office, the best way of dealing with problematic employees is by establishing trust and allowing them to partake in decision making, even in setting organizational goals, as it makes them feel part of the organization. Thus, the success of an organization depends largely on the effectiveness of the head of unit.
In this case, being development oriented would mean encouraging both head officers and employees to know not only the technical know-how of the organisation but also its visions, missions, and goals, and the employees' aspirations, in order to establish cooperation and a harmonious working environment; hence, orientation and reorientation from time to time would enable them to be more development oriented. With respect to human relations, an effective interpersonal relationship between the head of unit and employees is of paramount importance. In order to strengthen the relationship between the two, the management should establish upward and downward communication, in which the two parties maintain open and transparent communication, both verbal and non-verbal.
Keywords: administrator, administrative traits, leadership traits, work orientation
Procedia PDF Downloads 71
7675 Development of a Nursing Care Program Based on Anthroposophic External Therapy for the Pediatric Hospital in Brazil and Germany
Authors: Karina Peron, Ricardo Ghelman, Monica Taminato, Katia R. Oliveira, Debora C. A. Rodrigues, Juliana R. C. Mumme, Olga K. M. Sunakozaua, Georg Seifert, Vicente O. Filho
Abstract:
The nurse is the health professional most available for supportive interventions in an integrative approach in the hospital environment, and nursing is therefore a professional group key to changes in the model of care. The central components in the performance of anthroposophic nursing procedures are direct physical contact, promotion of a proper rhythm, thermal regulation, and the construction of a calm and empathic atmosphere that is safe for patients and their caregivers. The procedures of anthroposophic external therapy (AET), basically composed of the application of compresses and the use of natural products, provide an opportunity to intensify therapeutic results through an innovative, complementary, and integrative model in the university hospital. The objective of this work is to report the implementation of a program of AET nursing techniques through a partnership between the Pediatric Oncology Sector of the Department of Pediatrics of the Faculty of Medicine of the University of Sao Paulo and the Charite University of Berlin, with lecturers from Berlin's Integrative Hospital Havelhöhe and the Witten-Herdecke Integrative Hospital, both in Germany. Intensive training activities for the Hospital's nursing staff and a survey on AET needs were developed based on the most prevalent complaints of pediatric oncology patients in the three environments of the Hospital of Pediatric Oncology: the Bone Marrow Transplantation Unit, the Intensive Care Unit, and the Division of Internal Patients. We obtained approval of the clinical protocol of anthroposophic external therapies for nursing care from the Ethics Committee and the Academic Council of the Hospital. With this project, we highlight the key AET needs that will become part of the standard program of pediatric oncology care, with appropriate scientific support. The most prevalent symptoms were: vomiting, nausea, pain, difficulty in starting sleep, constipation, cold extremities, mood disorder, and psychomotor agitation.
This pioneering project within the Integrative Pediatrics Program has been presented at scientific meetings as an innovative concept of integrative medicine and health.
Keywords: integrative health care, integrative nursing, pediatric nursing, pediatric oncology
Procedia PDF Downloads 266
7674 Graph Planning Based Composition for Adaptable Semantic Web Services
Authors: Rihab Ben Lamine, Raoudha Ben Jemaa, Ikram Amous Ben Amor
Abstract:
This paper proposes a graph planning technique for the composition of adaptable semantic Web Services. First, we use an ontology-based context model to extend Web Service descriptions with information about the most suitable context for their use. Then, we transform the composition problem into a semantic, context-aware graph planning problem to build the optimal service composition based on the user's context. The construction of the planning graph is based on semantic, context-aware Web Service discovery, which allows each step to add the most suitable Web Services in terms of semantic compatibility between the service parameters and their context similarity with the user's context. In the backward search step, semantic and contextual similarity scores are used to find the best list of composed Web Services. Finally, in the ranking step, a score is calculated for each best solution, and a set of ranked solutions is returned to the user.
Keywords: semantic web service, web service composition, adaptation, context, graph planning
Procedia PDF Downloads 520
7673 Dynamic Response around Inclusions in Infinitely Inhomogeneous Media
Authors: Jinlai Bian, Zailin Yang, Guanxixi Jiang, Xinzhu Li
Abstract:
The propagation of elastic waves in inhomogeneous media is a classic problem. Earthquakes occur frequently and cause large economic losses and casualties; to prevent and reduce earthquake damage, this paper studies the dynamic response around a circular inclusion in a whole space with an inhomogeneous modulus. The inhomogeneity of the medium is reflected in a shear modulus that varies with spatial position, while the density is constant; this method can be used to solve the problem of underground buried pipelines. Stress concentration phenomena are common in aerospace and earthquake engineering, and the dynamic stress concentration factor (DSCF) is one of the main factors leading to material damage. One of the important applications of elastodynamic theory is to determine the stress concentration in bodies with discontinuities such as cracks, holes, and inclusions. Current methods include the wave function expansion method, the integral transformation method, and the integral equation method. Based on the complex function method, the Helmholtz equation with variable coefficients is standardized by the conformal transformation method and the wave function expansion method, and the displacement and stress fields in the whole space with circular inclusions are solved in the complex coordinate system. The unknown coefficients are obtained from the boundary conditions, and the correctness of the method is verified by comparison with existing results. Owing to the suitability of complex variable function theory for conformal transformation, this method can be extended to study inclusions of arbitrary shape. By solving for the dynamic stress concentration factor around the inclusions, the influence of the inhomogeneity parameters of the medium and of the wavenumber ratio of the inclusion to the matrix on the DSCF is analyzed.
The research results can provide a reference for the evaluation of nondestructive testing (NDT), oil exploration, seismic monitoring, and soil-structure interaction.
Keywords: circular inclusions, complex variable function, dynamic stress concentration factor (DSCF), inhomogeneous medium
Procedia PDF Downloads 135
7672 Aromatic Medicinal Plant Classification Using Deep Learning
Authors: Tsega Asresa Mengistu, Getahun Tigistu
Abstract:
Computer vision is an artificial intelligence subfield that allows computers and systems to retrieve meaning from digital images. It is applied in various fields, including self-driving cars, video surveillance, agriculture, quality control, health care, construction, the military, and everyday life. Aromatic and medicinal plants are botanical raw materials used in cosmetics, medicines, health foods, and other natural health products for therapeutic and aromatic culinary purposes. Herbal industries depend on these special plants. These plants and their products not only serve as a valuable source of income for farmers and entrepreneurs but also earn valuable foreign exchange when exported as industrial raw materials. There is a lack of technologies for the classification and identification of aromatic and medicinal plants in Ethiopia. The manual identification of plants is a tedious, time-consuming, labor-intensive, and lengthy process. For farmers, industry personnel, academics, and pharmacists, it is still difficult to identify the parts and usage of plants before ingredient extraction. In order to solve this problem, the researcher uses a deep learning approach for the efficient identification of aromatic and medicinal plants with a convolutional neural network. The objective of the proposed study is to identify aromatic and medicinal plant parts and usages using computer vision technology. Therefore, this research initiated a model for the automatic classification of aromatic and medicinal plants by exploring computer vision technology. Morphological characteristics are still the most important tools for the identification of plants. Leaves are the most widely used parts of plants, besides the root, flower, fruit, latex, and bark. The study was conducted on aromatic and medicinal plants available at the Ethiopian Institute of Agricultural Research center. An experimental research design is proposed for this study.
The study is conducted using convolutional neural networks and transfer learning. The researcher employs sigmoid activation in the last layer and rectified linear units in the hidden layers. Finally, the researcher obtained a classification accuracy of 66.4% with a plain convolutional neural network, 67.3% with MobileNet, and 64% with the Visual Geometry Group (VGG) network.
Keywords: aromatic and medicinal plants, computer vision, deep convolutional neural network
Procedia PDF Downloads 438
7671 Supplier Selection Using Sustainable Criteria in Sustainable Supply Chain Management
Authors: Richa Grover, Rahul Grover, V. Balaji Rao, Kavish Kejriwal
Abstract:
Selection of suppliers is a crucial problem in supply chain management. On top of that, sustainable supplier selection is the biggest challenge for organizations. Environmental protection and social problems have been of concern to society in recent years, and traditional supplier selection does not consider these factors; therefore, this research work focuses on introducing sustainable criteria into the structure of supplier selection criteria. Sustainable Supply Chain Management (SSCM) is the management and administration of material, information, and money flows, as well as coordination among businesses along the supply chain. All three dimensions of sustainable development (economic, environmental, and social) need to be taken care of. The purpose of this research is to maximize supply chain profitability, maximize the social wellbeing of the supply chain, and minimize environmental impacts. The problem statement is the selection of suppliers in a sustainable supply chain network by ranking the suppliers against the sustainable criteria identified. The aim of this research is twofold: to find out what sustainable parameters can be applied to the supply chain, and to determine how these parameters can effectively be used in supplier selection. Multicriteria decision-making (MCDM) tools will be used to rank both criteria and suppliers. AHP analysis, a technique for efficient decision making, will be used to find ratings for the criteria identified. TOPSIS will be used to rate the suppliers and then rank them. TOPSIS is an MCDM method based on the principle that the chosen option should have the minimum distance from the positive ideal solution and the maximum distance from the negative ideal solution (NIS).
Keywords: sustainable supply chain management, sustainable criteria, MCDM tools, AHP analysis, TOPSIS method
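As an illustrative sketch of the TOPSIS ranking step described in this abstract, the code below normalizes a decision matrix, applies criteria weights, and scores each supplier by its relative closeness to the positive ideal solution. The supplier matrix, criteria, and weights are invented for illustration (the weights would come from the AHP step in the paper).

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the positive ideal solution
    (PIS) and distance from the negative ideal solution (NIS).
    benefit[j] is True for higher-is-better criteria, False for cost criteria."""
    M = np.asarray(matrix, dtype=float)
    # vector-normalize each criterion column, then apply criteria weights
    V = (M / np.linalg.norm(M, axis=0)) * np.asarray(weights, dtype=float)
    benefit = np.asarray(benefit)
    pis = np.where(benefit, V.max(axis=0), V.min(axis=0))  # best value per criterion
    nis = np.where(benefit, V.min(axis=0), V.max(axis=0))  # worst value per criterion
    d_pos = np.linalg.norm(V - pis, axis=1)
    d_neg = np.linalg.norm(V - nis, axis=1)
    return d_neg / (d_pos + d_neg)  # closeness coefficient, higher is better

# hypothetical supplier matrix: columns = [cost, quality score, CO2 emissions]
closeness = topsis([[200.0, 0.90, 30.0],
                    [180.0, 0.70, 25.0],
                    [250.0, 0.95, 40.0]],
                   weights=[0.4, 0.4, 0.2],
                   benefit=[False, True, False])
ranking = np.argsort(-closeness)  # supplier indices, best first
```

Note the sign convention: an alternative that coincides with the positive ideal gets closeness 1, and one that coincides with the negative ideal gets 0.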
Procedia PDF Downloads 325
7670 A Collaborative Application of Six Sigma and Value Engineering in Supply Chain and Logistics
Authors: Arun Raja, Kevin Thomas, Sreyas Tribhu, S. P. Anbuudayasankar
Abstract:
This paper deals with the application of the six sigma methodology to the supply chain (SC) and logistics. It examines in detail how the SC can be improved and its impact on the organization, and identifies the vital role quality plays in improving SC and logistics. A simulation has been performed using the ARENA software to determine the process efficiency of a bottle manufacturing unit. Further, a Value Stream Mapping (VSM) analysis has been executed on the manufacturing process flow model, and the manner in which Value Engineering (VE) holds significant importance for quality assurance of the products is also studied.
Keywords: supply chain, six sigma, value engineering, logistics, quality
Procedia PDF Downloads 678
7669 Durability Analysis of a Knuckle Arm Using VPG System
Authors: Geun-Yeon Kim, S. P. Praveen Kumar, Kwon-Hee Lee
Abstract:
A steering knuckle arm is the component that connects the steering system and the suspension system. Structural performances such as stiffness, strength, and durability are considered in its design process. A previous study suggested a lightweight design of a knuckle arm considering these structural performances and using metamodel-based optimization. Six shape design variables were defined, and the optimum design was calculated by applying the kriging interpolation method. The finite element method was utilized to predict the structural responses. The suggested knuckle was made of the aluminum alloy Al6082, and its weight was reduced by about 60% in comparison with the base steel knuckle, satisfying the design requirements. Then, we investigated its manufacturability by performing a forging analysis. The forging was done as a hot process, and the product was made through two-step forging. As the final step of the development process, the durability is investigated by using the flexible dynamic analysis software LS-DYNA and the pre- and post-processor eta/VPG. Generally, a carmaker does not share all of its information with the part manufacturer, so the part manufacturer is limited in predicting durability performance at the full-car level. The eta/VPG software, however, has libraries of commonly used parts such as suspensions, tires, and roads, which makes full-car modeling possible. First, the full car is modeled by referencing the following information: Overall Length: 3,595 mm, Overall Width: 1,595 mm, CVW (Curb Vehicle Weight): 910 kg, Front Suspension: MacPherson Strut, Rear Suspension: Torsion Beam Axle, Tire: 235/65R17. Second, a cobblestone road is selected; the cobblestone road condition is almost 10 times more severe than that of a usual paved road. Third, a dynamic finite element analysis using LS-DYNA is performed to predict the durability performance of the suggested knuckle arm.
The life of the suggested knuckle arm is calculated as 350,000 km, which satisfies the design requirement set by the part manufacturer. In this study, the overall design process of a knuckle arm is suggested, and it can be seen that the developed knuckle arm satisfies the durability design requirement at the full-car level. The VPG analysis is performed successfully even though it does not give an exact prediction, since the full-car model is a rough one. Thus, this approach can be used effectively when the details of the full car are not given.
Keywords: knuckle arm, structural optimization, metamodel, forging, durability, VPG (Virtual Proving Ground)
Procedia PDF Downloads 419
7668 Vancomycin Resistance Enterococcus and Implications to Trauma and Orthopaedic Care
Authors: O. Davies, K. Veravalli, P. Panwalkar, M. Tofighi, P. Butterick, B. Healy, A. Mofidi
Abstract:
Vancomycin-resistant enterococcus (VRE) infection is a condition that usually impacts ICUs, transplant, dialysis, and cancer units, often as a nosocomial infection. After an outbreak in the acute trauma and orthopaedic unit in Morriston Hospital, we aimed to assess the conditions that predispose to VRE infection in our unit. Thirteen cases of VRE infection and five cases of VRE colonisation were identified in patients who were treated for orthopaedic care between 1/1/2020 and 1/11/2021. Cases were reviewed to identify predisposing factors, specifically looking at age, presenting condition and treatment, presence of infection and antibiotic care, active haemo-oncological condition, long-term renal dialysis, previous hospitalisation, VRE predisposition and clearance (PREVENT) scores, and outcome of care. The presenting condition, treatment, presence of postoperative infection, VRE scores, and age were compared between the colonised and infected cohorts. The VRE type in both the colonised and infected groups was Enterococcus faecium in all but one patient. The colonised group had the same age (t=0.6, p>0.05) and sex (χ²=0.115, p=0.74) distribution, and the same presenting condition and treatment, which consisted of peri-femoral fixation or arthroplasty in all patients. The infected group had one case of myelodysplasia and four cases of chronic renal failure requiring dialysis. All of the infected patients had sustained an infective complication of their fracture fixation or arthroplasty requiring reoperation and antibiotics. The infected group had an average VRE predisposition score of 8.5 versus 3 in the colonised group (F=36, p<0.001). The PREVENT score was 7 in the infected group and 2 in the colonised group (F=153, p<0.001). Six patients (55%) succumbed to their infection, and one VRE infection resulted in limb loss. In the orthopaedic cohort, VRE infection is a nosocomial condition that has a peri-femoral predilection and is seen in association with immunosuppression or renal failure.
The VRE infection cohort had been treated for an infective complication of the original surgery in the weeks prior to VRE infection. Based on our findings, we advise avoidance of infective complications, a change of practice in the use of antibiotics, and the use of radical surgery and surveillance for VRE infections beyond infective precautions. The PREVENT score shows that the infected group, unlike the colonised group, is unlikely to clear VRE in the future.
Keywords: surgical site infection, enterococcus, orthopaedic surgery, vancomycin resistance
Procedia PDF Downloads 149
7667 A Modular Solution for Large-Scale Critical Industrial Scheduling Problems with Coupling of Other Optimization Problems
Authors: Ajit Rai, Hamza Deroui, Blandine Vacher, Khwansiri Ninpan, Arthur Aumont, Francesco Vitillo, Robert Plana
Abstract:
Large-scale critical industrial scheduling problems are based on Resource-Constrained Project Scheduling Problems (RCPSP) that necessitate integration with other optimization problems (e.g., vehicle routing, supply chain, or unique industrial ones), thus requiring practical solutions (i.e., modular and computationally efficient, with feasible solutions). To the best of our knowledge, the current industrial state of the art does not address this holistic problem. We propose an original modular solution that answers the issues exhibited by the delivery of complex projects. With three interlinked entities (project, task, resources), each having its own constraints, it uses a greedy heuristic with a dynamic cost function for each task, with a situational assessment at each time step. It handles large-scale data and can be easily integrated with other optimization problems, already existing industrial tools, and unique constraints as required by the use case. The solution has been tested and validated by domain experts on three use cases: outage management in Nuclear Power Plants (NPPs), planning of a future NPP maintenance operation, and an application in the defense industry on supply chain and factory relocation. In the first use case, the solution, in addition to resource availability and the tasks' logical relationships, also integrates several project-specific constraints for outage management, such as handling resource incompatibility, updating task priorities, pausing tasks in specific circumstances, and adjusting dynamic units of resources. With more than 20,000 tasks and multiple constraints, the solution provides a feasible schedule within 10-15 minutes on a standard computer. This time-effective simulation matches the nature of the problem and the requirement of running several scenarios (30-40 simulations) before finalizing the schedules.
The second use case is a factory relocation project where production lines must be moved to a new site while ensuring the continuity of their production. This generates the challenge of merging job shop scheduling and the RCPSP with location constraints. Our solution allows the automation of the production tasks while considering the production rate expectation. The simulation algorithm manages the use and movement of resources and products to respect a given relocation scenario. The last use case addresses a future maintenance operation in an NPP. The project contains complex and hard constraints, such as Finish-Start precedence relationships (i.e., successor tasks have to start immediately after their predecessors while respecting all constraints), shareable coactivity for managing workspaces, and requirements for specific states of "cyclic" resources (they can have multiple possible states, with only one active at a time) to perform tasks (which can require unique combinations of several cyclic resources). Our solution minimizes the state changes of the cyclic resources coupled with makespan minimization. It solves an instance with 80 cyclic resources and 50 incompatibilities between levels in less than a minute. In conclusion, we propose a fast and feasible modular approach to various industrial scheduling problems, validated by domain experts and compatible with existing industrial tools. This approach can be further enhanced by the use of machine learning techniques on historically repeated tasks to gain further insights into delay risk mitigation measures.
Keywords: deterministic scheduling, optimization coupling, modular scheduling, RCPSP
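As a toy sketch of the greedy heuristic this abstract describes, the code below schedules precedence-constrained tasks against a single renewable resource: at every time step, ready tasks are started in priority order whenever capacity allows. The single-resource model and the priority rule standing in for the "dynamic cost function" are illustrative simplifications, not the industrial implementation.

```python
def greedy_schedule(tasks, capacity):
    """Greedy serial scheduling sketch for a single renewable resource.
    tasks: {name: (duration, demand, predecessors, priority)}. At each time
    step, precedence-ready tasks are started in ascending priority order
    (the 'cost function' here, re-evaluated every step) if capacity allows."""
    start, finish = {}, {}
    running = []  # (finish_time, name, demand)
    t = 0
    while len(finish) < len(tasks):
        used = sum(dem for _, _, dem in running)
        # a task is ready if not yet started and all predecessors have finished
        ready = [n for n, (dur, dem, preds, prio) in tasks.items()
                 if n not in start and all(p in finish for p in preds)]
        for n in sorted(ready, key=lambda name: tasks[name][3]):
            dur, dem, _, _ = tasks[n]
            if used + dem <= capacity:
                start[n] = t
                running.append((t + dur, n, dem))
                used += dem
        t += 1
        still_running = []
        for f, n, dem in running:
            if f <= t:
                finish[n] = f  # task n completes and frees its resource
            else:
                still_running.append((f, n, dem))
        running = still_running
    return start, finish

# three tasks, unit resource demands, capacity for two concurrent tasks
start, finish = greedy_schedule(
    {"A": (2, 1, [], 1), "B": (1, 1, [], 2), "C": (2, 1, ["A"], 1)},
    capacity=2)
```

Here A and B run in parallel from time 0, and C must wait for A, giving a makespan of 4.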
Procedia PDF Downloads 198
7666 Preference Aggregation and Mechanism Design in the Smart Grid
Authors: Zaid Jamal Saeed Almahmoud
Abstract:
The Smart Grid is the vision of a future power system that combines advanced monitoring and communication technologies to provide energy in a smart, efficient, and user-friendly manner. This proposal considers a demand response model in the Smart Grid based on utility maximization. Given a set of consumers with conflicting preferences in terms of consumption and a utility company that aims to minimize the peak demand and match demand to supply, we study the problem of aggregating these preferences while modelling the problem as a game. We also investigate whether an equilibrium can be reached that maximizes the social benefit. Based on such an equilibrium, we propose a dynamic pricing heuristic that computes the equilibrium and sets the prices accordingly. The developed approach was analysed theoretically and evaluated experimentally using real appliance data. The results show that our proposed approach achieves a substantial reduction in the overall energy consumption.
Keywords: heuristics, smart grid, aggregation, mechanism design, equilibrium
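As a toy sketch of the game-theoretic idea in this abstract, the code below runs best-response dynamics in a congestion game: each consumer schedules a load into a time slot, a slot's price grows with its total load, and consumers repeatedly move to their cheapest slot until no one can improve, which flattens the peak. The linear price and the unit loads are illustrative assumptions, not the paper's model.

```python
def best_response_equilibrium(demands, n_slots, max_rounds=100):
    """demands[c] is consumer c's load; a slot's price is proportional to its
    total load, so each consumer's best response is the slot that is cheapest
    after accounting for its own load. Iterate best responses until a fixed
    point (a pure Nash equilibrium of this congestion game) is reached."""
    assignment = [0] * len(demands)   # all load initially in slot 0 (the 'peak')
    load = [0.0] * n_slots
    load[0] = float(sum(demands))
    for _ in range(max_rounds):
        moved = False
        for c, s in enumerate(assignment):
            # cost consumer c would face in each slot k if it moved there
            cost = [load[k] + (demands[c] if k != s else 0.0) for k in range(n_slots)]
            best = min(range(n_slots), key=cost.__getitem__)
            if cost[best] < cost[s] - 1e-12:
                load[s] -= demands[c]
                load[best] += demands[c]
                assignment[c] = best
                moved = True
        if not moved:
            break  # equilibrium: no consumer can lower its cost
    return assignment, load

# four unit loads, two slots: the 4-unit peak settles into a 2/2 split
assignment, load = best_response_equilibrium([1.0, 1.0, 1.0, 1.0], n_slots=2)
```

A dynamic pricing heuristic in this spirit would publish the per-slot prices implied by the equilibrium loads.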
Procedia PDF Downloads 112
7665 Identification of Spam Keywords Using Hierarchical Category in C2C E-Commerce
Authors: Shao Bo Cheng, Yong-Jin Han, Se Young Park, Seong-Bae Park
Abstract:
Consumer-to-Consumer (C2C) e-commerce has been growing at a very high speed in recent years. Since identical or nearly identical kinds of products compete with one another by relying on keyword search in C2C e-commerce, some sellers describe their products with spam keywords that are popular but not related to their products. Though such products get more chances to be retrieved and selected by consumers than those without spam keywords, the spam keywords mislead consumers and waste their time. This problem has been reported in many commercial services like eBay and Taobao, but there has been little research to solve it. As a solution, this paper proposes a method to classify whether the keywords of a product are spam or not. The proposed method assumes that a keyword for a given product is more reliable if the keyword is observed commonly in the specifications of products of the same or a similar kind as the given product. This is because the hierarchical category of a product is in general determined precisely by the seller of the product, and so is the product's specification. Since higher layers of the hierarchical category represent more general kinds of products, a reliability degree is determined separately for each layer. Hence, reliability degrees from the different layers of a hierarchical category become features for keywords, and they are used together with features from the specifications alone for classification of the keywords. Support Vector Machines are adopted as the basic classifier using these features, since they are powerful and widely used in many classification tasks. In the experiments, the proposed method is evaluated on a gold-standard dataset from Yi-han-wang, a Chinese C2C e-commerce site, and is compared with a baseline method that does not consider the hierarchical category.
The experimental results show that the proposed method outperforms the baseline in F1-measure, which demonstrates that spam keywords are effectively identified by means of the hierarchical category in C2C e-commerce.
Keywords: spam keyword, e-commerce, keyword features, spam filtering
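The layer-wise reliability features described in this abstract can be sketched as follows: for each layer of a product's category path, compute the fraction of products sharing that category prefix whose specification also uses the keyword. The tiny corpus and category names below are invented for illustration; in the paper, these per-layer degrees, together with specification-only features, feed a Support Vector Machine classifier.

```python
def reliability_features(keyword, category_path, corpus):
    """corpus: list of (category_path, keyword_set) pairs; category_path is a
    tuple ordered from the most general layer to the most specific one.
    Returns one reliability degree per layer: the fraction of products that
    share the category prefix up to that layer and also use the keyword."""
    feats = []
    for depth in range(1, len(category_path) + 1):
        prefix = category_path[:depth]
        same_kind = [kws for path, kws in corpus if path[:depth] == prefix]
        hits = sum(keyword in kws for kws in same_kind)
        feats.append(hits / len(same_kind) if same_kind else 0.0)
    return feats

# invented toy corpus of product listings: (category path, spec keywords)
corpus = [
    (("electronics", "phone"), {"android", "5g", "case"}),
    (("electronics", "phone"), {"android", "charger"}),
    (("electronics", "laptop"), {"ssd", "charger"}),
]
# a keyword common among same-kind products scores high at the specific layer
genuine = reliability_features("android", ("electronics", "phone"), corpus)
# a keyword borrowed from another product kind scores low there (spam-like)
suspect = reliability_features("ssd", ("electronics", "phone"), corpus)
```

Deeper layers correspond to more specific product kinds, so a popular-but-unrelated keyword stands out by scoring low exactly where genuine keywords score high.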
Procedia PDF Downloads 294
7664 Survival Analysis after a First Ischaemic Stroke Event: A Case-Control Study in the Adult Population of England
Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski
Abstract:
Stroke is associated with a significant risk of morbidity and mortality. There is a scarcity of research on long-term survival after first-ever ischaemic stroke (IS) events in England with regard to the effects of different medical therapies and comorbidities. The objective of this study was to model all-cause mortality after an IS diagnosis in the adult population of England. Using a retrospective case-control design, we extracted the electronic medical records of patients born in or before 1960 in England with a first-ever ischaemic stroke diagnosis from January 1986 to January 2017 within The Health Improvement Network (THIN) database. Participants with a history of ischaemic stroke were matched to 3 controls by sex, age at diagnosis, and general practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a Weibull-Cox survival model, which included both scale and shape effects and a shared random effect of general practice. The model included sex, birth cohort, socio-economic status, comorbidities, and medical therapies. 20,250 patients with a history of IS (cases) and 55,519 controls were followed up for up to 30 years. From 2008 to 2015, the one-year all-cause mortality for the IS patients declined, with an absolute change of -0.5%. Preventive treatments for cases increased considerably over time; these included prescriptions of statins and antihypertensives. However, prescriptions for antiplatelet drugs decreased in routine general practice from 2010. The survival model revealed a survival benefit of antiplatelet treatment for stroke survivors, with a hazard ratio (HR) of 0.92 (0.90-0.94). IS diagnosis had significant interactions with gender, age at entry, and hypertension diagnosis. IS diagnosis was associated with a high risk of all-cause mortality, with HR = 3.39 (3.05-3.72) for cases compared to controls.
Hypertension was associated with poor survival, with HR = 4.79 (4.49-5.09) for hypertensive cases relative to non-hypertensive controls, though the detrimental effect of hypertension did not reach significance for hypertensive controls, HR = 1.19 (0.82-1.56). This study of English primary care data showed that between 2008 and 2015, the rates of prescriptions of stroke preventive treatments increased, and short-term all-cause mortality after IS declined. However, stroke resulted in poor long-term survival. Hypertension, a modifiable risk factor, was found to be associated with poor survival outcomes in IS patients. Antiplatelet drugs were found to be protective of survival. Better efforts are required to reduce the burden of stroke through health service development and primary prevention.
Keywords: general practice, hazard ratio, The Health Improvement Network (THIN), ischaemic stroke, multiple imputation, Weibull-Cox model
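As a purely illustrative sketch of how the reported hazard ratios act in a Weibull proportional-hazards setting, the code below scales a baseline Weibull hazard by the product of covariate hazard ratios. The shape and scale values are invented, and the paper's actual model additionally includes shape effects and a shared random effect per general practice, which are omitted here.

```python
def weibull_hazard(t, shape, scale):
    """Baseline Weibull hazard: h0(t) = (k / lam) * (t / lam)**(k - 1),
    with shape k and scale lam; shape 1 gives a constant (exponential) hazard."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def adjusted_hazard(t, shape, scale, hazard_ratios):
    """Proportional-hazards sketch: multiply the baseline hazard by the
    covariate hazard ratios (e.g., 3.39 for an IS case, 0.92 for
    antiplatelet treatment, both taken from the abstract)."""
    hr = 1.0
    for r in hazard_ratios:
        hr *= r
    return weibull_hazard(t, shape, scale) * hr

# invented baseline parameters: shape 1.2, scale 20 years, evaluated at t = 5
h_control = weibull_hazard(5.0, 1.2, 20.0)
h_case_on_antiplatelet = adjusted_hazard(5.0, 1.2, 20.0, [3.39, 0.92])
```

Under proportional hazards, the combined multiplier 3.39 × 0.92 applies uniformly at every time point, which is what makes a single HR a meaningful summary.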
Procedia PDF Downloads 186
7663 Study on Runoff Allocation Responsibilities of Different Land Uses in a Single Catchment Area
Authors: Chuan-Ming Tung, Jin-Cheng Fu, Chia-En Feng
Abstract:
In recent years, the rapid development of urban land in Taiwan has led to a constant increase in impervious surface area, which has increased the risk of waterlogging during heavy rainfall. Promoting runoff allocation responsibilities has therefore often been used as a means of reducing regional flooding. In this study, a single catchment area covering both urban and rural land is taken as the study area. Based on the Storm Water Management Model (SWMM), runoff allocation responsibilities for urban and rural land in the catchment are developed according to their respective land-use control regulations. The impacts of runoff increments and reductions in the sub-catchment areas were studied to understand the effect of highly developed urban land on the flood risk of the rural land at the back end. The analysis used 1-hour design rainfall with return periods of 2, 5, 10, and 25 years. If the study area were fully developed without runoff allocation responsibilities, the peak discharge at the outlet would increase by 22.97%-24.46%, and the front-end urban land would increase the runoff reaching the back-end rural land by 46.51%-76.19%. However, if runoff allocation responsibilities were carried out in the study area, the peak discharge could be reduced by 58.38%-63.08%, allowing the front end to reduce the peak flow delivered to the back end by 23.81%-54.05%. In addition, the researchers found that, from the perspective of runoff allocation responsibilities per unit area, residential areas on urban land benefit from the relevant laws and regulations of the urban system, giving a better flood-reduction effect than residential land on rural land. For rural land, the development scale of residential land is generally small, which makes its flood-reduction effect better than that of industrial land.
Agricultural land requires a large area of land, resulting in the lowest share of the flow per unit area. From the point of the planners, this study suggests that for the rural land around the city, its responsibility should be assigned to share the runoff. And setting up rain water storage facilities in the same way as urban land, can also take stock of agricultural land resources to increase the ridge of field for flood storage, in order to improve regional disaster reduction capacity and resilience.Keywords: runoff allocation responsibilities, land use, flood mitigation, SWMM
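The peak-discharge comparisons reported above can be sketched with a minimal calculation; the hydrograph values and scenario names below are hypothetical illustrations, not the study's SWMM outputs.

```python
# Illustrative sketch: comparing peak discharge at a catchment outlet
# under different development scenarios. All numbers are invented.

def peak_change_percent(baseline, scenario):
    """Percent change of peak discharge relative to the baseline hydrograph."""
    qp_base = max(baseline)
    qp_scen = max(scenario)
    return (qp_scen - qp_base) / qp_base * 100.0

# Hypothetical outlet hydrographs (m^3/s) for one design storm
predevelopment  = [2.0, 5.5, 9.8, 7.1, 3.4]
fully_developed = [2.5, 7.9, 12.2, 8.8, 4.0]   # no allocation responsibilities
with_allocation = [2.2, 5.1, 6.9, 6.0, 3.1]    # on-site detention enforced

print(f"increase without allocation: {peak_change_percent(predevelopment, fully_developed):+.2f}%")
print(f"change with allocation vs. developed: {peak_change_percent(fully_developed, with_allocation):+.2f}%")
```

In practice these percentages would come from SWMM simulations of each land-use scenario rather than hand-entered hydrographs.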
Procedia PDF Downloads 1047662 Relay Node Selection Algorithm for Cooperative Communications in Wireless Networks
Authors: Sunmyeng Kim
Abstract:
IEEE 802.11a/b/g standards support multiple transmission rates. Even though the use of multiple transmission rates increases WLAN capacity, this feature leads to the performance anomaly problem. Cooperative communication was introduced to relieve the performance anomaly problem: data packets are delivered to the destination much faster through a high-rate relay node than through direct transmission to the destination at a low rate. In legacy cooperative protocols, a source node chooses a relay node based only on the transmission rate. They are therefore not well suited to multi-flow environments, since they do not consider the effect of other flows. To alleviate this effect, we propose a new relay node selection algorithm based on both the transmission rate and the channel contention level. Performance evaluation is conducted using simulation and shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and delay.Keywords: cooperative communications, MAC protocol, relay node, WLAN
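One way to realize rate-plus-contention relay selection is sketched below. The effective-rate formula, the contention model, and the candidate relays are our illustrative assumptions, not the paper's exact metric.

```python
# Hedged sketch: choose a relay by effective two-hop throughput discounted
# by channel contention, rather than by transmission rate alone.

def effective_rate(r_sr, r_rd, contention):
    # A two-hop transfer needs 1/r_sr + 1/r_rd channel time per unit of data;
    # contention in [0, 1) scales down the usable channel share.
    two_hop = 1.0 / (1.0 / r_sr + 1.0 / r_rd)
    return two_hop * (1.0 - contention)

def select_relay(candidates, direct_rate):
    # candidates: (name, rate source->relay, rate relay->dest, contention level)
    best = max(candidates, key=lambda c: effective_rate(c[1], c[2], c[3]))
    # Relay only if it beats sending directly to the destination
    return best if effective_rate(best[1], best[2], best[3]) > direct_rate else None

relays = [("A", 54.0, 54.0, 0.6),   # fast links, but a congested channel
          ("B", 36.0, 48.0, 0.1)]   # slower links on an idle channel
print(select_relay(relays, direct_rate=11.0))
```

A rate-only selector would pick relay A here; accounting for contention, B delivers more useful throughput.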
Procedia PDF Downloads 3327661 Investigating the Dynamics of Knowledge Acquisition in Undergraduate Mathematics Students Using Differential Equations
Authors: Gilbert Makanda
Abstract:
The problem of the teaching of mathematics is studied using differential equations, and a mathematical model for knowledge acquisition in mathematics is developed. In this study, we adapt a mathematical model normally used for disease modelling to the teaching of mathematics. It is assumed that teaching 'infects' students with knowledge, and that students who gain this knowledge spread it to other students, making a disease model appropriate for this problem. The results of this study show that increasing recruitment rates, learning contact with teachers, and learning materials improves the number of knowledgeable students. High dropout rates and forgetting of taught concepts negatively affect the number of knowledgeable students. The developed model is solved using MATLAB ode45, and lsqnonlin is used to estimate parameters from the actual data.Keywords: differential equations, knowledge acquisition, least squares, dynamical systems
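An SIR-style knowledge-acquisition model of this kind can be sketched as below. The compartments, rate constants, and forward-Euler integration are illustrative assumptions; the paper itself solves its model with MATLAB ode45 and fits parameters with lsqnonlin.

```python
# Illustrative sketch of a disease-style knowledge model.
# S: students not yet knowledgeable, K: knowledgeable students.
# r: recruitment rate, beta: teaching/learning contact rate,
# f: forgetting rate, d: dropout rate. All values are hypothetical.

def simulate(S0=90.0, K0=10.0, r=2.0, beta=0.004, f=0.05, d=0.02,
             dt=0.1, steps=1000):
    S, K = S0, K0
    for _ in range(steps):
        teach = beta * S * K            # knowledge 'infection' via contact
        dS = r - teach + f * K - d * S  # recruits join; forgetters return
        dK = teach - f * K - d * K      # learners gained, minus losses
        S, K = S + dt * dS, K + dt * dK
    return S, K

S_end, K_end = simulate()
print(f"knowledgeable students after simulation: {K_end:.1f}")
```

With these rates the knowledgeable group grows toward an endemic-style equilibrium, mirroring the abstract's finding that more contact and recruitment raise, while forgetting and dropout lower, the number of knowledgeable students.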
Procedia PDF Downloads 4237660 Introduction to Multi-Agent Deep Deterministic Policy Gradient
Authors: Xu Jie
Abstract:
As a key network security method, cryptographic services must cope with problems such as the wide variety of cryptographic algorithms, high concurrency requirements, random job crossovers, and instantaneous surges in workloads. Their complexity and dynamics also make it difficult for traditional static security policies to cope with the ever-changing cyber threat environment. Traditional resource scheduling algorithms are inadequate when facing complex decision-making problems in dynamic environments. A network cryptographic resource allocation algorithm based on reinforcement learning is proposed, aiming to optimize task energy consumption, migration cost, and the fitness of differentiated services (including user, data, and task security). By modeling the multi-job collaborative cryptographic service scheduling problem as a multi-objective optimized job flow scheduling problem and applying a multi-agent reinforcement learning method, efficient scheduling and optimal configuration of cryptographic service resources are achieved. By introducing reinforcement learning, resource allocation strategies can be adjusted in real time in a dynamic environment, improving resource utilization and achieving load balancing. Experimental results show that this algorithm has significant advantages in path planning length, system delay, and network load balancing, and effectively solves the problem of complex resource scheduling in cryptographic services.Keywords: multi-agent reinforcement learning, non-stationary dynamics, multi-agent systems, cooperative and competitive agents
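The full MADDPG machinery of the title is beyond a short sketch, but the load-balancing idea can be illustrated with two independent epsilon-greedy Q-learners sharing servers. This is a deliberately simplified stand-in, not the paper's algorithm; every number below is invented.

```python
import random

# Toy sketch: two agents repeatedly pick a server for their cryptographic
# jobs; a reward that shrinks when agents collide on the same server
# nudges them, through reinforcement learning, toward load balancing.

random.seed(0)
N_SERVERS, EPS, ALPHA = 2, 0.1, 0.2
q = [[0.0] * N_SERVERS for _ in range(2)]   # one Q-row per agent

def act(agent):
    if random.random() < EPS:                # occasional exploration
        return random.randrange(N_SERVERS)
    row = q[agent]
    return row.index(max(row))               # otherwise exploit

for _ in range(2000):
    choices = [act(0), act(1)]
    for agent, server in enumerate(choices):
        load = choices.count(server)         # 1 if alone, 2 on collision
        reward = 1.0 / load                  # colliding halves the reward
        q[agent][server] += ALPHA * (reward - q[agent][server])

print("Q-tables:", q)
print("greedy choices:", [act(0), act(1)])   # agents often learn to spread out
```

A DDPG-style version would replace the Q-tables with actor and critic networks over continuous allocations, which is where the non-stationarity issues named in the keywords arise.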
Procedia PDF Downloads 237659 The Education Quality Management by the Participation of the Community in Northern Part of Thailand
Authors: Preecha Pongpeng
Abstract:
This research aims to study education quality management to solve the problem of teacher shortage through community participation. It is action research using a questionnaire to collect data from students and community representatives, followed by interviews gathering the opinions of people in the community on helping and supporting instruction. Results found that people in the community are aware of the teacher shortage and work together to address it through collaboration between school personnel and community members: finding knowledgeable people, organizing local wisdom in the community, pooling donated money, and hiring people in the community to teach alongside classroom teachers. In addition, the researcher discovered that this project contributes to cooperation between the school and the community, although problems remained, including administrative expenses and the school's academic quality management.Keywords: education quality management, local wisdom, northern part of Thailand, participation of the community
Procedia PDF Downloads 2937658 On the Performance of Improvised Generalized M-Estimator in the Presence of High Leverage Collinearity Enhancing Observations
Authors: Habshah Midi, Mohammed A. Mohammed, Sohel Rana
Abstract:
Multicollinearity occurs when two or more independent variables in a multiple linear regression model are highly correlated. Ridge regression is the method commonly used to rectify this problem. However, ridge regression cannot handle multicollinearity caused by a high leverage collinearity enhancing observation (HLCEO). Since high leverage points (HLPs) are responsible for inducing multicollinearity, their effect needs to be reduced by using a Generalized M (GM) estimator. The existing GM6 estimator is based on the Minimum Volume Ellipsoid (MVE), which tends to swamp some low leverage points. Hence an improvised GM (MGM) estimator is presented to improve the precision of the GM6 estimator. A numerical example and a simulation study are presented to show how HLPs can cause multicollinearity. The numerical results show that our MGM estimator is the most efficient method compared to some existing methods.Keywords: identification, high leverage points, multicollinearity, GM-estimator, DRGP, DFFITS
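How a single HLCEO manufactures multicollinearity can be demonstrated numerically. The sketch below diagnoses it with variance inflation factors (VIF) on simulated data; the GM6/MGM estimators themselves are not reproduced, and the data are invented.

```python
import numpy as np

# Hedged illustration: one extreme observation placed along the direction
# x1 = x2 makes two initially uncorrelated predictors look collinear.

rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                # independent of x1 by construction

def vif(a, b):
    """Variance inflation factor for two predictors: 1 / (1 - r^2)."""
    r = np.corrcoef(a, b)[0, 1]
    return 1.0 / (1.0 - r ** 2)

print(f"VIF before: {vif(x1, x2):.2f}")

# Append one high leverage collinearity enhancing observation (HLCEO)
x1_c = np.append(x1, 40.0)
x2_c = np.append(x2, 40.0)
print(f"VIF after one HLCEO: {vif(x1_c, x2_c):.2f}")
```

The single leverage point dominates the correlation, inflating the VIF far past the usual rule-of-thumb threshold of 10, which is precisely why leverage-downweighting GM estimators are needed.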
Procedia PDF Downloads 2627657 Challenge Response-Based Authentication for a Mobile Voting System
Authors: Tohari Ahmad, Hudan Studiawan, Iwang Aryadinata, Royyana M. Ijtihadie, Waskitho Wibisono
Abstract:
Manual voting systems have been implemented worldwide. They have weaknesses which may decrease the legitimacy of the voting result. Electronic voting systems were introduced to minimize these weaknesses and have provided better results in terms of the total time taken in the voting process and accuracy. Nevertheless, people may be reluctant to go to the polling location for reasons such as distance and time. To solve this problem, mobile voting is implemented by utilizing mobile devices, and many mobile voting architectures are available. The authenticity of users is the problem common to all voting systems: there must be a mechanism which verifies users' authenticity such that only verified users can vote, and can vote only once. In this paper, a challenge response-based authentication is proposed by utilizing properties of the users, for example, something they have and something they know. In terms of speed, the proposed system provides good results, in addition to other capabilities offered by the system.Keywords: authentication, data protection, mobile voting, security
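A challenge-response exchange of the general kind described can be sketched with an HMAC over a fresh nonce. This is a minimal illustration of the pattern, not the paper's protocol: the secret here stands in for "something the voter knows", and the device-bound "something they have" factor is omitted.

```python
import hmac, hashlib, secrets

# Minimal challenge-response sketch: the server issues a fresh random
# challenge; the client proves knowledge of the shared secret by returning
# an HMAC of the challenge, so the secret never crosses the network.

SECRET = b"voter-registered-secret"        # hypothetical, set at registration

def make_challenge():
    return secrets.token_bytes(16)         # fresh nonce per login attempt

def respond(challenge, secret):
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verify(challenge, response, secret):
    expected = respond(challenge, secret)
    return hmac.compare_digest(expected, response)   # constant-time compare

ch = make_challenge()
resp = respond(ch, SECRET)
print("verified:", verify(ch, resp, SECRET))                 # True
print("replayed:", verify(make_challenge(), resp, SECRET))   # False: new nonce
```

Because each challenge is random, a captured response cannot be replayed against a later challenge, which is what lets the system enforce one authenticated vote per verified user.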
Procedia PDF Downloads 4197656 Integrating Explicit Instruction and Problem-Solving Approaches for Efficient Learning
Authors: Slava Kalyuga
Abstract:
There are two major opposing points of view on the optimal degree of initial instructional guidance, usually discussed in the literature by the advocates of the corresponding learning approaches. Using unguided or minimally guided problem-solving tasks prior to explicit instruction has been suggested by productive failure and several other instructional theories, whereas an alternative approach - using fully guided worked examples followed by problem solving - has been demonstrated to be the most effective strategy within the framework of cognitive load theory. An integrated approach discussed in this paper could combine the above frameworks within a broader theoretical perspective, bringing together their best features and advantages in the design of learning tasks for STEM education. This paper presents a systematic review of the available empirical studies comparing the above alternative sequences of instructional methods, exploring the effects of several possible moderating factors. The paper concludes that different approaches and instructional sequences should coexist within complex learning environments. Selecting optimal sequences depends on such factors as the specific goals of learner activities, the types of knowledge to learn, levels of element interactivity (task complexity), and levels of learner prior knowledge. This paper offers an outline of a theoretical framework for the design of complex learning tasks in STEM education that would integrate explicit instruction and inquiry (exploratory, discovery) learning approaches in ways that depend on a set of defined specific factors.Keywords: cognitive load, explicit instruction, exploratory learning, worked examples
Procedia PDF Downloads 1257655 Informed Urban Design: Minimizing Urban Heat Island Intensity via Stochastic Optimization
Authors: Luis Guilherme Resende Santos, Ido Nevat, Leslie Norford
Abstract:
The Urban Heat Island (UHI) is characterized by increased air temperatures in urban areas compared to undeveloped rural surrounding environments. With urbanization and densification, the intensity of the UHI increases, bringing negative impacts on livability, health and economy. In order to reduce those effects, design factors must be taken into consideration when planning future developments. Given design constraints such as population size and availability of area for development, non-trivial decisions regarding the buildings’ dimensions and their spatial distribution are required. We develop a framework for optimization of urban design in order to jointly minimize UHI intensity and buildings’ energy consumption. First, the design constraints are defined according to spatial and population limits in order to establish realistic boundaries that would be applicable in real-life decisions. Second, the tools Urban Weather Generator (UWG) and EnergyPlus are used to generate outputs of UHI intensity and total buildings’ energy consumption, respectively. Those outputs are changed based on a set of variable inputs related to urban morphology aspects, such as building height, urban canyon width and population density. Lastly, an optimization problem is cast in which the utility function quantifies the performance of each design candidate (e.g. minimizing a linear combination of UHI and energy consumption), and a set of constraints to be met is defined. Solving this optimization problem is difficult, since there is no simple analytic form which represents the UWG and EnergyPlus models. We therefore cannot use any direct optimization techniques, but instead develop an indirect “black box” optimization algorithm. To this end we develop a solution based on a stochastic optimization method known as the Cross Entropy method (CEM).
The CEM translates the deterministic optimization problem into an associated stochastic optimization problem which is simple to solve analytically. We illustrate our model on a typical residential area in Singapore. Due to fast population growth, expanding built area, and land availability generated by land reclamation, urban planning decisions are of the utmost importance for the country. Furthermore, the hot and humid climate in the country raises the concern for the impact of the UHI. The problem presented is highly relevant to early urban design stages, and the objective of such a framework is to guide decision makers and assist them to include and evaluate urban microclimate and energy aspects in the process of urban planning.Keywords: building energy consumption, stochastic optimization, urban design, urban heat island, urban weather generator
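The black-box character of the CEM can be shown on a toy objective. The one-dimensional quadratic below is a stand-in for the UWG/EnergyPlus pipeline (which has no analytic form); the sample sizes, elite fraction, and iteration count are illustrative choices.

```python
import random, statistics

# Toy Cross-Entropy method: sample candidates from a Gaussian, keep the
# elite fraction under the black-box objective, refit the Gaussian to the
# elites, and repeat until the distribution concentrates on the optimum.

def objective(x):                 # pretend black box: minimum at x = 3
    return (x - 3.0) ** 2

random.seed(0)
mu, sigma = 0.0, 5.0              # broad initial sampling distribution
for _ in range(30):
    samples = [random.gauss(mu, sigma) for _ in range(100)]
    elites = sorted(samples, key=objective)[:10]    # best 10%
    mu = statistics.mean(elites)                    # refit mean
    sigma = statistics.stdev(elites) + 1e-6         # refit spread (kept > 0)

print(f"estimated minimizer: {mu:.3f}")
```

In the paper's setting, `objective` would instead run a UWG/EnergyPlus simulation of a candidate urban morphology and return the weighted UHI-plus-energy score, with `mu` and `sigma` replaced by a distribution over design variables.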
Procedia PDF Downloads 1317654 Cluster Analysis and Benchmarking for Performance Optimization of a Pyrochlore Processing Unit
Authors: Ana C. R. P. Ferreira, Adriano H. P. Pereira
Abstract:
Given the frequent variation of mineral properties throughout the Araxá pyrochlore deposit, operation with highly variable quality and performance is expected even when good homogenization work has been carried out before feeding the processing plants. These results could be improved and standardized by determining the blend composition parameters that most influence the processing route, grouping the types of raw materials by those parameters, and finally establishing reference operational settings for each group. Associating the physical and chemical parameters of a unit operation through benchmarking, or even an optimal reference of metallurgical recovery and product quality, reduces production costs, optimizes the mineral resource, and guarantees greater stability in the subsequent processes of the production chain that use the mineral of interest. Conducting a comprehensive exploratory data analysis to identify which characteristics of the ore are most relevant to the process route, combined with Machine Learning algorithms for grouping the raw material (ore) and associating the groups with reference variables in the process benchmark, is a reasonable approach to the standardization and improvement of mineral processing units. Clustering methods based on Decision Trees and K-Means were employed, together with algorithms based on benchmarking theory, with criteria defined by the process team in order to reference the best adjustments for processing the ore piles of each cluster. A clean user interface was created to present the outputs of the algorithm. The results were measured through the average time needed to adjust and stabilize the process after a new pile of homogenized ore enters the plant, as well as the average time needed to achieve the best processing result. Direct gains in the metallurgical recovery of the process were also measured.
The results were promising, with a reduction in the time needed to adjust and stabilize processing when a new ore pile is started, and with the benchmark being reached. Also noteworthy are the gains in metallurgical recovery, which reflect significant savings in ore consumption and a consequent reduction in production costs, hence a more rational use of the tailings dams and life optimization of the mineral deposit.Keywords: mineral clustering, machine learning, process optimization, pyrochlore processing
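The grouping step can be sketched with a small K-Means implementation. The feature names and values below are invented for illustration (two hypothetical assay variables per ore blend); the paper's actual feature set and Decision Tree stage are not reproduced.

```python
import numpy as np

# Hedged sketch: cluster hypothetical ore-blend analyses with k-means so
# that each cluster can carry its own benchmark operating settings.

rng = np.random.default_rng(42)
low_grade  = rng.normal([0.6, 2.0], 0.05, size=(20, 2))  # e.g. %Nb2O5, %P2O5
high_grade = rng.normal([1.2, 0.8], 0.05, size=(20, 2))  # hypothetical assays
X = np.vstack([low_grade, high_grade])

def kmeans(X, k=2, iters=20):
    centers = X[rng.choice(len(X), k, replace=False)]     # init from data
    for _ in range(iters):
        # assign each pile to its nearest center, then recompute centers
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels, centers

labels, centers = kmeans(X)
print("cluster sizes:", np.bincount(labels))
```

Once piles are grouped, the benchmark for each cluster could be taken as, say, the best historical recovery among its members, giving operators a target setting whenever a new pile lands in that cluster.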
Procedia PDF Downloads 1437653 Developing a Health Promotion Program to Prevent and Solve Problem of the Frailty Elderly in the Community
Authors: Kunthida Kulprateepunya, Napat Boontiam, Bunthita Phuasa, Chatsuda Kankayant, Bantoeng Polsawat, Sumran Poontong
Abstract:
Frailty is the thin line between good health and illness. The syndrome is most common in the elderly who transition from strong to weak (vulnerable). With prevention and health promotion, frailty can be reversed before it progresses to disability. This research and development aimed to analyze the frailty situation of the elderly, develop a program, and evaluate the effect of a health promotion program to prevent and solve the problem of frailty among the elderly. The research consisted of 3 phases: 1) analysis of the frailty situation, 2) development of a model, and 3) evaluation of the effectiveness of the model. The samples were 328 and 122 elderly persons, selected using the multi-stage random sampling method. The research instrument was a frailty questionnaire assessing five symptoms: muscle weakness, slow walking, low physical activity, fatigue, and unintentional weight loss; respondents with three or more symptoms were classified as frail. Data were analyzed by descriptive statistics and the dependent t-test. The findings comprised three parts. First, 23.05% of the elderly were frail and 56.70% pre-frail. Second, a health promotion program to prevent and solve the problem of frailty in the elderly was developed, combining the Nine-Square Exercise, Elastic Band Exercise, and Elastic Coconut Shell. Third, in the evaluation of the model's effectiveness by comparison of the elderly's get-up-and-go test, the average time was 14.42 before using the program and 8.57 after, statistically significant at the .05 level. In conclusion, the findings can be used to develop guidelines to promote the health of the frail elderly.Keywords: elderly, fragile, nine-square exercise, elastic coconut shell
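The screening rule described (five symptoms, three or more indicating frailty) can be written as a small classification function. Treating one or two symptoms as "pre-frail" and none as "robust" is our assumption, consistent with the pre-frailty group the abstract reports but not stated there explicitly.

```python
# Minimal sketch of the abstract's frailty screening rule.
# The "pre-frail"/"robust" labels for 1-2 and 0 symptoms are assumptions.

SYMPTOMS = ("muscle weakness", "slow walking", "low physical activity",
            "fatigue", "unintentional weight loss")

def classify(present):
    """Classify a respondent from the set of symptoms they report."""
    count = sum(1 for s in SYMPTOMS if s in present)
    if count >= 3:
        return "frail"
    return "pre-frail" if count >= 1 else "robust"

print(classify({"fatigue", "slow walking", "muscle weakness"}))  # frail
print(classify({"fatigue"}))                                     # pre-frail
```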
Procedia PDF Downloads 1057652 A Multi Cordic Architecture on FPGA Platform
Authors: Ahmed Madian, Muaz Aljarhi
Abstract:
Coordinate Rotation Digital Computer (CORDIC) is a unique digital computing unit intended for the computation of mathematical operations and functions. This paper presents a multi-CORDIC processor that integrates different CORDIC architectures on a single FPGA chip and allows the user to select the CORDIC architecture to use based on the desired computation and requirements. Synthesis results show that radix-2 CORDIC has the lowest clock delay, radix-8 CORDIC has the highest LUT usage and the lowest register usage, and hybrid radix-4 CORDIC has the highest clock delay.Keywords: multi, CORDIC, FPGA, processor
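The radix-2 architecture mentioned above is the classic shift-and-add CORDIC; a floating-point reference model of its rotation mode is sketched below. A hardware version would use fixed-point arithmetic and barrel shifts, which this Python model only emulates.

```python
import math

# Reference model of radix-2 CORDIC in rotation mode: rotate the vector
# (K, 0) through theta using only additions and halvings (2**-i), so that
# the final (x, y) give cos(theta) and sin(theta).

ITER = 24
ANGLES = [math.atan(2.0 ** -i) for i in range(ITER)]
K = 1.0
for i in range(ITER):
    K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))   # pre-applied gain correction

def cordic_sin_cos(theta):
    """Valid for |theta| within the CORDIC convergence range (~1.74 rad)."""
    x, y, z = K, 0.0, theta
    for i in range(ITER):
        d = 1.0 if z >= 0 else -1.0               # rotate toward z = 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * ANGLES[i]
    return y, x                                    # (sin, cos)

s, c = cordic_sin_cos(math.pi / 6)
print(f"sin 30 deg ~= {s:.6f}, cos 30 deg ~= {c:.6f}")
```

Radix-4 and radix-8 variants choose among more rotation directions per iteration, trading extra LUTs per stage for fewer stages, which matches the synthesis trade-offs the abstract reports.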
Procedia PDF Downloads 4707651 An Improved Total Variation Regularization Method for Denoising Magnetocardiography
Authors: Yanping Liao, Congcong He, Ruigang Zhao
Abstract:
The application of magnetocardiography signals to detect cardiac electrical function is a technology developed in recent years. The magnetocardiography signal is detected with Superconducting Quantum Interference Devices (SQUIDs) and has considerable advantages over electrocardiography (ECG). Extracting the Magnetocardiography (MCG) signal, which is buried in noise, is difficult and is a critical issue to be resolved in cardiac monitoring systems and MCG applications. In order to remove the severe background noise, the Total Variation (TV) regularization method is proposed to denoise the MCG signal. The approach transforms the denoising problem into a minimization problem, and the Majorization-Minimization algorithm is applied to solve it iteratively. However, the traditional TV regularization method tends to cause a staircase effect and lacks constraint adaptability. In this paper, an improved TV regularization method for denoising the MCG signal is proposed to improve the denoising precision. The improvement consists of three parts. First, high-order TV is applied to reduce the staircase effect, with the corresponding second-derivative matrix substituted for the first-order one. Then, the positions of the non-zero elements in the second-order derivative matrix are determined based on the peak positions detected by the detection window. Finally, adaptive constraint parameters are defined to eliminate noise while preserving signal peak characteristics. Theoretical analysis and experimental results show that this algorithm effectively improves the output signal-to-noise ratio and has superior performance.Keywords: constraint parameters, derivative matrix, magnetocardiography, regular term, total variation
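The basic TV denoising problem underlying the paper can be illustrated on a 1-D signal. The sketch below minimizes a smoothed first-order TV objective by plain gradient descent; it is a simplified stand-in for, not a reproduction of, the paper's high-order adaptive MM algorithm, and the test signal is synthetic.

```python
import numpy as np

# Simplified TV denoising sketch: minimize
#   0.5*||x - y||^2 + lam * sum_i sqrt((x[i+1]-x[i])^2 + eps)
# where eps smooths the absolute value so gradient descent applies.

def tv_denoise(y, lam=1.0, eps=1e-2, step=0.01, iters=5000):
    x = y.copy()
    for _ in range(iters):
        dx = np.diff(x)
        w = dx / np.sqrt(dx ** 2 + eps)       # smoothed sign of differences
        # gradient of the TV term w.r.t. each sample
        grad_tv = np.concatenate(([-w[0]], w[:-1] - w[1:], [w[-1]]))
        x = x - step * ((x - y) + lam * grad_tv)
    return x

rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])   # piecewise-constant
noisy = clean + 0.3 * rng.normal(size=100)
denoised = tv_denoise(noisy)
print(f"RMSE noisy {np.sqrt(np.mean((noisy - clean) ** 2)):.3f} -> "
      f"denoised {np.sqrt(np.mean((denoised - clean) ** 2)):.3f}")
```

First-order TV favors piecewise-constant output, which is exactly the staircase effect on smooth signals; replacing the first-order difference matrix with a second-order one, as the paper does, penalizes curvature instead and avoids this.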
Procedia PDF Downloads 1537650 Motivational Orientation of the Methodical System of Teaching Mathematics in Secondary Schools
Authors: M. Rodionov, Z. Dedovets
Abstract:
The article analyses the composition and structure of a motivationally oriented methodological system of teaching mathematics (purpose, content, methods, forms, and means of teaching), viewed through the prism of the student as the subject of the learning process. Particular attention is paid to the problem of methods of teaching mathematics, which are represented in the form of an ordered triad of attributes corresponding to the selected characteristics. A systematic analysis of possible options and their methodological interpretation enriched existing ideas about known methods and technologies of training and significantly expanded their nomenclature by including previously unstudied combinations of characteristics. In addition, the examples outlined in this article illustrate the possibilities of enhancing the motivational capacity of a particular method or technology in the real practice of teaching mathematics through freer goal-setting and by varying the conditions of problem situations. The authors recommend the implementation of different strategies according to their characteristics in teaching and learning mathematics in secondary schools.Keywords: education, methodological system, the teaching of mathematics, student motivation
Procedia PDF Downloads 3547649 Presenting a Job Scheduling Algorithm Based on Learning Automata in Computational Grid
Authors: Roshanak Khodabakhsh Jolfaei, Javad Akbari Torkestani
Abstract:
As a cooperative environment for problem-solving, grids must develop efficient job scheduling patterns with regard to their goals, domains, and structure. Since Grid environments facilitate distributed calculations, job scheduling is a critical problem for the management of Grid resources that severely influences the efficiency of the whole Grid environment. Due to characteristics such as resource dynamicity and network conditions in the Grid, scheduling algorithms should be adjustable and scalable as the network grows. For this purpose, this paper presents a job scheduling algorithm based on learning automata in the computational Grid, whose performance was compared with the FPSO algorithm (Fuzzy Particle Swarm Optimization) and the GJS algorithm (Grid Job Scheduling). The obtained numerical results indicated the superiority of the suggested algorithm over FPSO and GJS, ranking FPSO and GJS second and third, respectively.Keywords: computational grid, job scheduling, learning automata, dynamic scheduling
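The core learning-automaton mechanism can be sketched with a linear reward-inaction (L_RI) scheme choosing among grid nodes. The environment model, node names, and success probabilities below are illustrative assumptions, not the paper's scheduler.

```python
import random

# Sketch of a linear reward-inaction (L_RI) learning automaton: keep a
# probability vector over actions, pick an action by sampling it, and on a
# reward (job finished in time) shift probability toward that action;
# on a penalty, leave the probabilities unchanged.

random.seed(3)
ACTIONS = ["node-A", "node-B", "node-C"]
SUCCESS = {"node-A": 0.3, "node-B": 0.9, "node-C": 0.5}  # hypothetical odds
p = [1.0 / len(ACTIONS)] * len(ACTIONS)
A = 0.05                                   # reward (learning) parameter

def choose():
    r, acc = random.random(), 0.0
    for i, pi in enumerate(p):
        acc += pi
        if r <= acc:
            return i
    return len(p) - 1

for _ in range(3000):
    i = choose()
    if random.random() < SUCCESS[ACTIONS[i]]:       # environment feedback
        for j in range(len(p)):                     # L_RI reward update
            p[j] = p[j] + A * (1.0 - p[j]) if j == i else p[j] * (1.0 - A)

print("action probabilities:", [round(x, 3) for x in p])
```

Because the update only reacts to rewards, the automaton adapts if node success rates drift, which is the adjustability property the abstract asks of Grid schedulers.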
Procedia PDF Downloads 3437648 Efficient Ground Targets Detection Using Compressive Sensing in Ground-Based Synthetic-Aperture Radar (SAR) Images
Authors: Gherbi Nabil
Abstract:
Detection of ground targets in SAR radar images is an important area of radar information processing. Various algorithms have been discussed in the literature in this context; however, most of them have low robustness and accuracy. To this end, we discuss target detection in SAR images based on compressive sensing. First, traditional SAR image target detection algorithms are discussed and their limitations are highlighted. Second, a compressive sensing method is proposed based on the sparsity of SAR images. Next, the detection problem is solved using a Multiple Measurement Vectors (MMV) configuration. Furthermore, a robust Alternating Direction Method of Multipliers (ADMM) is developed to solve the optimization problem. Finally, the detection results obtained using raw complex data are presented. Experimental results on real SAR images have verified the effectiveness of the proposed algorithm.Keywords: compressive sensing, raw complex data, synthetic aperture radar, ADMM
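The sparse-recovery core of such a pipeline can be sketched with a standard ADMM solver for the l1-regularized least-squares problem. This is a hedged single-measurement, real-valued simplification: the paper works on complex raw SAR data in an MMV configuration with its own robust ADMM variant, and all matrices below are simulated.

```python
import numpy as np

# Scaled-dual ADMM sketch for min 0.5*||A x - b||^2 + lam*||x||_1:
# alternate a quadratic x-update, a soft-threshold z-update, and a dual step.

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=200):
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    M = np.linalg.inv(AtA + rho * np.eye(n))          # cached x-update solve
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))                 # quadratic subproblem
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # shrinkage
        u = u + x - z                                 # scaled dual ascent
    return z

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 60))                 # underdetermined sensing matrix
x_true = np.zeros(60)
x_true[[5, 17, 42]] = [1.5, -2.0, 1.0]        # three sparse "targets"
b = A @ x_true + 0.01 * rng.normal(size=30)
x_hat = admm_lasso(A, b)
print("recovered support:", np.nonzero(np.abs(x_hat) > 0.5)[0].tolist())
```

The sparsity of the scene is what lets 30 measurements locate targets in a 60-dimensional grid; the MMV extension stacks several such vectors sharing one support, which further stabilizes detection.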
Procedia PDF Downloads 18