Search results for: crow search algorithm
707 A Deep Learning Approach to Detect Complete Safety Equipment for Construction Workers Based on YOLOv7
Authors: Shariful Islam, Sharun Akter Khushbu, S. M. Shaqib, Shahriar Sultan Ramit
Abstract:
In the construction sector, ensuring worker safety is of the utmost significance. In this study, a deep learning-based technique is presented for identifying safety gear worn by construction workers, such as helmets, goggles, jackets, gloves, and footwear. The suggested method precisely locates these safety items by using the YOLOv7 (You Only Look Once) object detection algorithm. The dataset utilized in this work consists of labeled images split into training, testing, and validation sets. Each image has bounding box labels that indicate where the safety equipment is located within the image. The model is trained to identify and categorize the safety equipment based on the labeled dataset through an iterative training approach. We used a custom dataset to train this model. Our trained model performed admirably well, with good precision, recall, and F1-score for safety equipment recognition. The model's evaluation also produced encouraging results, with a mAP@0.5 score of 87.7%. The model performs effectively, making it possible to quickly identify safety equipment violations on building sites. A thorough evaluation of the outcomes reveals the model's advantages and points out potential areas for development. By offering an automatic and trustworthy method for safety equipment detection, this research contributes to the fields of computer vision and workplace safety. The proposed deep learning-based approach will increase safety compliance and reduce the risk of accidents in the construction industry. Keywords: deep learning, safety equipment detection, YOLOv7, computer vision, workplace safety
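As an aside on the evaluation metric: the mAP@0.5 score reported above counts a predicted box as correct when its intersection-over-union (IoU) with a ground-truth box is at least 0.5. A minimal IoU computation (illustrative only, not the authors' code) can be sketched as:

```python
def iou(box_a, box_b):
    # Boxes as (x1, y1, x2, y2) corner coordinates
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A prediction counts as a true positive at mAP@0.5 when IoU >= 0.5
print(round(iou((0, 0, 10, 10), (5, 0, 15, 10)), 3))  # 0.333
```

Averaging precision over recall levels for each class at this IoU cutoff, then over classes, yields the reported mAP@0.5.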
Procedia PDF Downloads 68
706 An Audit on the Role of Sentinel Node Biopsy in High-Risk Ductal Carcinoma in Situ and Intracystic Papillary Carcinoma
Authors: M. Sulieman, H. Arabiyat, H. Ali, K. Potiszil, I. Abbas, R. English, P. King, I. Brown, P. Drew
Abstract:
Introduction: The incidence of breast ductal carcinoma in situ (DCIS) has been increasing; it currently represents up to 20-25% of all breast carcinomas. Some aspects of DCIS management are still controversial, mainly due to the heterogeneity of its clinical presentation and of its biological and pathological characteristics. In DCIS, a histological diagnosis obtained preoperatively carries the risk of sampling error if invasive cancer is subsequently diagnosed. Mammographic extent over 4–5 cm and the presence of architectural distortion, focal asymmetric density, or a mass on mammography are proven important risk factors for preoperative histological understaging. Intracystic papillary cancer (IPC) is a rare form of breast carcinoma. Despite being previously compared to DCIS, it has been shown to present histologically with invasion of the basement membrane and even metastasis. SLNB carries a risk of associated comorbidity that should be considered when planning surgery for DCIS and IPC. Objectives: The aim of this audit was to better define a 'high risk' group of patients with a preoperative diagnosis of non-invasive cancer undergoing breast-conserving surgery who would benefit from sentinel node biopsy. Method: Retrospective data collection of all patients with ductal carcinoma in situ over 5 years. 636 patients were identified; after exclusion criteria were applied, 394 patients were included. High risk was defined as extensive micro-calcification >40mm or any mass-forming DCIS. IPC: a Winpath search for the term 'papillary carcinoma' in any breast specimen over a 5-year period; 29 patients were included in this group. Results: DCIS: 188 were deemed high risk due to >40mm calcification or a mass (radiological or palpable); 61% of those had a mastectomy and 32% BCS. Overall, the number with invasive disease in this high-risk group was 38%. Of those high-risk DCIS patients, 85% had a SLNB: 80% at the time of surgery and 5% at a second operation.
For the BCS patients, 42% had SLNB at the time of surgery and 13% (8 patients) at a second operation. 15 (7.9%) patients in the high-risk group had a positive SLNB, 11 having a mastectomy and 4 having BCS. IPC: The provisional diagnosis of encysted papillary carcinoma was upgraded to an invasive carcinoma on final histology in around a third of cases. This may have implications when deciding whether to offer sentinel node removal at the time of therapeutic surgery. Conclusions: We have defined a 'high risk' group of patients with a preoperative diagnosis of non-invasive cancer undergoing BCS who would benefit from SLNB at the time of surgery. In patients with high-risk features, the risk of invasive disease is up to 40%, but the risk of nodal involvement is approximately 8%. The risk of morbidity from SLNB is up to about 5%, especially the risk of lymphedema. Keywords: breast ductal carcinoma in situ (DCIS), intracystic papillary carcinoma (IPC), sentinel node biopsy (SLNB), high-risk, non-invasive, cancer disease
Procedia PDF Downloads 111
705 Analysis of Network Connectivity for Ship-To-Ship Maritime Communication Using IEEE 802.11 on Maritime Environment of Tanjung Perak, Indonesia
Authors: Ahmad Fauzi Makarim, Okkie Puspitorini, Hani'ah Mahmudah, Nur Adi Siswandari, Ari Wijayanti
Abstract:
As a maritime country, Indonesia needs a solution for maritime connectivity that can support the maritime communication system, including communication from harbor to ship and from ship to ship. Many application services for maritime communication, from safety purposes to voyage services that assist voyage activity, need a connection with high bandwidth. To support the government's efforts in handling this kind of problem, research was conducted on the maritime communication issue by applying a newly deployed technology in Indonesia, namely IEEE 802.11. In this research, three outdoor Wi-Fi devices operating at a frequency of 5.8 GHz are used. The maritime area from Tanjung Perak harbor in Surabaya to Karang Jamuang Island is used as the research location, with ship node distribution permitted by Navigation District Class 1. That maritime area is formed by the state 1 and state 2 areas, which are narrow areas with an average wave height of 0.7 meters based on data from BMKG Surabaya. The wave height is then used as one of the parameters in analyzing the characteristics of signal propagation at the sea surface, so the coverage area of the transmitter system can be determined. For the three outdoor Wi-Fi devices tested, the coverage of device A was determined to be about 2256 meters, device B 4000 meters, and device C 1174 meters. To analyze ship-to-ship network connectivity, the AODV routing algorithm is used, based on selecting the smallest transmit power among all nodes within the transmitter coverage. Keywords: maritime of Indonesia, maritime communications, outdoor wifi, coverage, AODV
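The coverage distances reported for the three devices follow from the link budget and the propagation loss over the sea surface. As a simplified illustration, free-space loss only, ignoring the wave-height effects the authors model, and with power, gain, and sensitivity figures that are assumptions rather than values from the paper:

```python
import math

def fspl_db(distance_m, freq_hz):
    # Free-space path loss in dB
    c = 3e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def max_range(tx_power_dbm, tx_gain_dbi, rx_gain_dbi, rx_sensitivity_dbm, freq_hz):
    # Largest distance at which received power still meets the sensitivity:
    # invert FSPL = link budget for the distance
    budget_db = tx_power_dbm + tx_gain_dbi + rx_gain_dbi - rx_sensitivity_dbm
    c = 3e8
    return 10 ** (budget_db / 20) * c / (4 * math.pi * freq_hz)

# Assumed example figures: 20 dBm TX, 10 dBi antennas, -85 dBm sensitivity, 5.8 GHz
print(max_range(20, 10, 10, -85, 5.8e9))
```

Real over-sea coverage is shorter than the free-space figure, which is why sea-state-dependent propagation models are needed.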
Procedia PDF Downloads 351
704 The Hypoglycaemic and Antioxidant Effects of Ethanolic Extract of Curcuma Longa Rhizomes Alone and with Two Pepper Adjuvants in Alloxan-Induced Diabetic Rats
Authors: J. O. Ezekwesili-Ofili, L. I. Okorafor, S. C. Nsofor
Abstract:
Diabetes mellitus is a carbohydrate metabolism disorder due to an absolute or relative deficiency of insulin secretion, action, or both. Many hypoglycaemic drugs are known to produce serious side effects; hence, the search for safer and more effective agents has shifted to plant products, including foods and spices. One such product is the rhizome of Curcuma longa, or turmeric, a spice with high medicinal value. A drawback in the use of C. longa is the poor bioavailability of curcumin, the active ingredient. It has been reported that piperine, an alkaloid present in peppers, increases the bioavailability of curcumin. This work therefore investigated the hypoglycaemic and antioxidant effects of ethanolic extract of C. longa rhizomes, alone and with two pepper adjuvants, in alloxan-induced diabetic rats. A total of 48 rats were divided into 6 groups of 8 rats each. Diabetes was induced in groups A–E using 150 mg/kg body weight of alloxan monohydrate, while group F was normoglycaemic. Group A: diabetic, fed with 400 mg/kg body weight of turmeric extract; group B: diabetic, fed with 400 mg/kg b.w. of extract and 200 mg/kg b.w. of ethanolic extract of seeds of Piper guineense; group C: diabetic, fed with 400 mg/kg b.w. of extract and 200 mg/kg b.w. of ethanolic extract of seeds of Capsicum annuum var. cameroun; group D: diabetic, treated with the standard drug glibenclamide (0.3 mg/kg body weight); group E: diabetic, no treatment (positive control); group F: non-diabetic, no treatment (negative control). Blood glucose levels were monitored for 14 days using a glucometer. The levels of the antioxidant enzymes glutathione peroxidase, catalase, and superoxide dismutase were also assayed in serum. The ethanolic extract of C. longa rhizomes at the dose given (400 mg/kg b.w.) significantly reduced the blood glucose levels of the diabetic rats (p<0.05), comparable to the standard drug.
Co-administration of the pepper extracts did not significantly increase the efficiency of the extract, although C. annuum var. cameroun showed a greater, though not significant, effect. The antioxidant effect of the extract was significant in diabetic rats, and the use of piperine-containing peppers enhanced it. Phytochemical analyses of the ethanolic extract of C. longa showed the presence of alkaloids, flavonoids, steroids, saponins, tannins, glycosides, and terpenoids. These results suggest that the ethanolic extract of C. longa has antidiabetic and antioxidant effects and could thus be of benefit in the treatment and management of diabetes, as well as ameliorate pro-oxidant effects that may lead to diabetic complications. In summary, while the addition of piperine did not affect the antidiabetic effect of C. longa, the antioxidant effect was greatly enhanced. Keywords: antioxidant, Curcuma longa rhizome, hypoglycaemic, pepper adjuvants, piperine
Procedia PDF Downloads 236
703 Design of Robust and Intelligent Controller for Active Removal of Space Debris
Authors: Shabadini Sampath, Jinglang Feng
Abstract:
With huge kinetic energy, space debris poses a major threat to astronauts' space activities and to spacecraft in orbit if a collision happens. Active removal of space debris is required in order to avoid the frequent collisions that would otherwise occur; in addition, the amount of space debris would increase uncontrollably, posing a threat to the safety of the entire space system. But the safe and reliable removal of large-scale space debris has been a huge challenge to date. While capturing and deorbiting space debris, the space manipulator has to achieve high control precision. However, due to uncertainties and unknown disturbances, there is difficulty in coordinating the control of the space manipulator. To address this challenge, this paper focuses on developing a robust and intelligent control algorithm that controls joint movement and restricts it to the sliding manifold by reducing uncertainties. A neural network adaptive sliding mode controller (NNASMC) is applied with the objective of finding the control law such that the joint motions of the space manipulator follow the given trajectory. Computed torque control (CTC), an effective motion control strategy, is used in this paper for computing the space manipulator arm torque required to generate the desired motion. Based on the Lyapunov stability theorem, the proposed intelligent controller NNASMC and the CTC guarantee the robustness and global asymptotic stability of the closed-loop control system. Finally, the controllers used in the paper are modeled and simulated using MATLAB Simulink. The results are presented to prove the effectiveness of the proposed controller approach. Keywords: GNC, active removal of space debris, AI controllers, MATLAB Simulink
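The idea of restricting joint motion to a sliding manifold can be illustrated with a scalar sliding-mode controller on a double-integrator joint-error model. This is a generic textbook sketch with assumed gains (`lam`, `k`, `phi`), not the authors' NNASMC:

```python
def smc_step(e, e_dot, lam=2.0, k=5.0, phi=0.1):
    # Sliding surface s = e_dot + lam * e; a boundary-layer saturation
    # replaces sign(s) to reduce chattering near the manifold.
    s = e_dot + lam * e
    sat = max(-1.0, min(1.0, s / phi))
    return -k * sat  # commanded acceleration

# Drive the tracking error of a double-integrator joint to zero (illustrative only)
e, e_dot, dt = 1.0, 0.0, 0.001
for _ in range(5000):
    u = smc_step(e, e_dot)
    e_dot += u * dt
    e += e_dot * dt
print(abs(e) < 0.05)
```

Once the state reaches the manifold s = 0, the error decays as e(t) ∝ exp(-lam·t) regardless of matched disturbances, which is the robustness property the abstract appeals to.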
Procedia PDF Downloads 132
702 Trading off Accuracy for Speed in PowerDrill
Authors: Filip Buruiana, Alexander Hall, Reimar Hofmann, Thomas Hofmann, Silviu Ganceanu, Alexandru Tudorica
Abstract:
In-memory column-stores make interactive analysis feasible for many big data scenarios. PowerDrill is a system used internally at Google for exploration in logs data. Even though it is a highly parallelized column-store and uses in-memory caching, interactive response times cannot be achieved for all datasets (note that it is common to analyze data with 50 billion records in PowerDrill). In this paper, we investigate two orthogonal approaches to optimize performance at the expense of an acceptable loss of accuracy. Both approaches can be implemented as outer wrappers around existing database engines, so they should be easily applicable to other systems. For the first optimization we show that memory is the limiting factor in executing queries at speed and therefore explore possibilities to improve memory efficiency. We adapt some of the theory behind data sketches to reduce the size of particularly expensive fields in our largest tables by a factor of 4.5 when compared to a standard compression algorithm. This saves 37% of the overall memory in PowerDrill and introduces a 0.4% relative error in the 90th percentile for results of queries with the expensive fields. We additionally evaluate the effects of using sampling on accuracy and propose a simple heuristic for annotating individual result values as accurate (or not). Based on measurements of user behavior in our real production system, we show that these estimates are essential for interpreting intermediate results before final results are available. For a large set of queries this effectively brings down the 95th latency percentile from 30 to 4 seconds. Keywords: big data, in-memory column-store, high-performance SQL queries, approximate SQL queries
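The flavor of annotating sampled results as accurate (or not) can be illustrated in spirit: estimate a count from a uniform p-sample and flag it as accurate when its relative standard error falls below a threshold. The 5% threshold and the estimator below are illustrative assumptions, not the paper's actual rule:

```python
import random

def estimate_count(n_sampled, p):
    # Scale the sampled count up by 1/p; attach a binomial standard error
    est = n_sampled / p
    se = ((n_sampled * (1 - p)) ** 0.5) / p
    rel_err = se / est if est else float("inf")
    return est, rel_err < 0.05  # annotate as "accurate" below 5% relative error

random.seed(0)
true_n = 100_000
p = 0.01
sampled = sum(1 for _ in range(true_n) if random.random() < p)
est, accurate = estimate_count(sampled, p)
print(est, accurate)
```

Such annotations let a user decide whether an intermediate, sample-based result is already trustworthy before the exact answer arrives.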
Procedia PDF Downloads 259
701 Layout Optimization of a Start-up COVID-19 Testing Kit Manufacturing Facility
Authors: Poojan Vora, Hardik Pancholi, Sanket Tajane, Harsh Shah, Elias Keedy
Abstract:
The global COVID-19 pandemic has affected industry drastically in many ways. Even though the vaccine is being distributed quickly, and despite the decreasing number of positive cases, testing is projected to remain a key aspect of the 'new normal'. Improving the existing plant layout and improving safety within the facility are of great importance in today's industries because of the need to ensure productivity optimization and reduce safety risks. In practice, it is essential for any manufacturing plant to reduce non-value-adding steps, such as the movement of materials, and to rearrange similar processes. In the current pandemic situation, optimized layouts will not only increase safety measures but also decrease the fixed cost per unit manufactured. In our case study, we carefully studied the existing layout and the manufacturing steps of a new Texas start-up company that manufactures COVID-19 testing kits. The effects of production rate are incorporated with the computerized relative allocation of facilities technique (CRAFT) algorithm to improve the plant layout and estimate the optimization parameters. Our work reduces the company's material handling time and increases their daily production. Real data from the company are used in the case study to highlight the importance of colleges in fostering small business needs and improving collaboration between college researchers and industry by using existing models to advance best practices. Keywords: computerized relative allocation of facilities technique, facilities planning, optimization, start-up business
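CRAFT-style layout improvement evaluates the material-handling cost of a layout (flow volume times rectilinear distance, summed over department pairs) and greedily accepts pairwise department swaps that reduce it. A toy sketch with assumed flows and locations, not the company's data:

```python
from itertools import combinations

def handling_cost(layout, flows, coords):
    # layout: dept -> location label; coords: location -> (x, y)
    cost = 0.0
    for (a, b), f in flows.items():
        xa, ya = coords[layout[a]]
        xb, yb = coords[layout[b]]
        cost += f * (abs(xa - xb) + abs(ya - yb))  # rectilinear distance
    return cost

def improve(layout, flows, coords):
    # One CRAFT-style pass: try each pairwise swap, keep any improvement
    best = dict(layout)
    best_cost = handling_cost(best, flows, coords)
    for a, b in combinations(layout, 2):
        trial = dict(best)
        trial[a], trial[b] = trial[b], trial[a]
        c = handling_cost(trial, flows, coords)
        if c < best_cost:
            best, best_cost = trial, c
    return best, best_cost

coords = {"L1": (0, 0), "L2": (20, 0), "L3": (5, 0)}
flows = {("assembly", "packing"): 50, ("assembly", "storage"): 5}
layout = {"assembly": "L1", "packing": "L2", "storage": "L3"}
best_layout, best_cost = improve(layout, flows, coords)
print(best_cost)  # 325.0, down from 1025.0
```

The full algorithm iterates such passes, and exchanges departments of equal (or adjacent) area, until no swap lowers the cost.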
Procedia PDF Downloads 138
700 Valorization of Lignocellulosic Wastes – Evaluation of Its Toxicity When Used in Adsorption Systems
Authors: Isabel Brás, Artur Figueirinha, Bruno Esteves, Luísa P. Cruz-Lopes
Abstract:
Agricultural lignocellulosic by-products are receiving increased attention, namely in the search for filter materials that retain contaminants from water. These by-products, specifically almond and hazelnut shells, are abundant in Portugal, since almond and hazelnut production is an important local activity. Hazelnut and almond shells have as main constituents lignin, cellulose and hemicelluloses, water-soluble extractives, and tannins. During the adsorption of heavy metals from contaminated waters, water-soluble compounds can leach from the shells and have a negative impact on the environment. Usually, the chemical characterization of treated water by itself may not show the environmental impact caused by the discharges when the parameters obey legal quality standards for water; only biological systems can detect the toxic effects of the water constituents. Therefore, the evaluation of toxicity by biological tests is very important when deciding on suitability for safe water discharge or for irrigation applications. The main purpose of the present work was to assess, with short-term acute toxicity tests, the potential impacts of waters after being treated for heavy metal removal by hazelnut and almond shell adsorption systems. To conduct the study, water at pH 6 with 25 mg.L-1 of lead was treated with 10 g of shell per litre of wastewater for 24 hours. This procedure was followed for each shell type. Afterwards, the water was collected for toxicological assays, namely bacterial resistance, seed germination, the Lemna minor L. test, and plant growth. The effect on isolated bacterial strains was determined by the disc diffusion method, and the germination index of seeds was evaluated using lettuce, with temperature and humidity control, for 7 days. For an aquatic higher organism, Lemna plants were used, with a 4-day contact time with the shell solutions under controlled light and temperature.
For terrestrial higher plants, biomass production was evaluated after 14 days of tomato germination in soil, with controlled humidity, light, and temperature. Toxicity tests of the water treated with shells revealed only limited effects in the tested organisms, with the test assays showing behaviour close to that of the control, leading to the conclusion that its further utilization may not be considered to create a serious risk to the environment. Keywords: lignocellulosic wastes, adsorption, acute toxicity tests, risk assessment
Procedia PDF Downloads 366
699 Laser Registration and Supervisory Control of neuroArm Robotic Surgical System
Authors: Hamidreza Hoshyarmanesh, Hosein Madieh, Sanju Lama, Yaser Maddahi, Garnette R. Sutherland, Kourosh Zareinia
Abstract:
This paper illustrates the concept of an algorithm to register specified markers on the neuroArm surgical manipulators, an image-guided MR-compatible tele-operated robot for microsurgery and stereotaxy. Two range-finding approaches, namely time-of-flight and phase-shift, are evaluated for registration and supervisory control. The time-of-flight approach is implemented in a semi-field experiment to determine the precise position of a tiny retro-reflective moving object. The moving object simulates a surgical tool tip; the tool is a target that would be connected to the neuroArm end-effector during surgery inside the magnet bore of the MR imaging system. To apply the time-of-flight approach, a 905-nm pulsed laser diode and an avalanche photodiode are utilized as the transmitter and receiver, respectively. For the experiment, a high-frequency time-to-digital converter was designed using a field-programmable gate array. In the phase-shift approach, a continuous green laser beam with a wavelength of 530 nm was used as the transmitter. Results showed that a positioning error of 0.1 mm occurred when the scanner-target distance was set in the range of 2.5 to 3 meters. The effectiveness of this non-contact approach showed that the method could be employed as an alternative to a conventional mechanical registration arm; furthermore, the approach is not limited by physical contact or the extension of joint angles. Keywords: 3D laser scanner, intraoperative MR imaging, neuroArm, real time registration, robot-assisted surgery, supervisory control
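The two range-finding principles reduce to simple formulas: time-of-flight converts the round-trip pulse time into distance, and phase-shift converts the measured phase of a modulated continuous beam into distance within one ambiguity interval. A sketch (standard formulas; the example numbers are illustrative, not the paper's):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s):
    # Time-of-flight: the pulse travels to the target and back, hence /2
    return C * round_trip_s / 2

def phase_shift_distance(phase_rad, mod_freq_hz):
    # Phase-shift: distance within one ambiguity interval of the modulation
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

print(tof_distance(20e-9))  # a 20 ns round trip is ~3 m
```

The /2 in the time-of-flight formula also explains the demanding electronics: a 0.1 mm error corresponds to roughly 0.67 ps of round-trip timing resolution, hence the custom FPGA time-to-digital converter.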
Procedia PDF Downloads 286
698 Aerodynamic Modeling Using Flight Data at High Angle of Attack
Authors: Rakesh Kumar, A. K. Ghosh
Abstract:
The paper presents the modeling of linear and nonlinear longitudinal aerodynamics using real flight data of the Hansa-3 aircraft gathered at low and high angles of attack. The Neural-Gauss-Newton (NGN) method has been applied to model the linear and nonlinear longitudinal dynamics and estimate parameters from flight data. Unsteady aerodynamics due to flow separation at high angles of attack near stall has been included in the aerodynamic model using Kirchhoff's quasi-steady stall model. The NGN method is an algorithm that utilizes a Feed Forward Neural Network (FFNN) and Gauss-Newton optimization to estimate the parameters; it does not require any a priori postulation of a mathematical model or solving of the equations of motion. The NGN method was validated on real flight data generated at moderate angles of attack before application to the data at high angles of attack. The estimates obtained from compatible flight data using the NGN method were validated by comparing them with wind tunnel values and with maximum likelihood estimates. Validation was also carried out by comparing the response of measured motion variables with the response generated by using the estimates with a different control input. Next, the NGN method was applied to real flight data generated by executing a well-designed quasi-steady stall maneuver. The results obtained in terms of stall characteristics and aerodynamic parameters were encouraging and reasonably accurate, establishing NGN as a method for modeling nonlinear aerodynamics from real flight data at high angles of attack. Keywords: parameter estimation, NGN method, linear and nonlinear, aerodynamic modeling
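Kirchhoff's quasi-steady stall model expresses lift through a flow-separation point X in [0, 1], with CL = CLα((1 + √X)/2)²α in its common steady form. The sketch below uses assumed values for the stall parameters (a1, α*) and the lift-curve slope, purely for illustration; they are among the quantities an estimator like NGN would identify from flight data:

```python
import math

def separation_point(alpha, a1=20.0, alpha_star=0.25):
    # Kirchhoff steady separation point X in [0, 1]; a1 and alpha_star
    # (radians) are aircraft-specific stall parameters (values assumed here)
    return 0.5 * (1.0 - math.tanh(a1 * (alpha - alpha_star)))

def lift_coefficient(alpha, cl_alpha=5.0):
    # CL = CLalpha * ((1 + sqrt(X)) / 2)^2 * alpha
    x = separation_point(alpha)
    return cl_alpha * ((1.0 + math.sqrt(x)) / 2.0) ** 2 * alpha

# Attached flow (small alpha, X ~ 1) recovers the linear model CL = CLalpha * alpha
print(round(lift_coefficient(0.05), 3))  # 0.25
```

Past stall, X drops toward 0 and the same formula reproduces the loss of lift, which is how the one model covers both the linear and the nonlinear regime.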
Procedia PDF Downloads 445
697 Optimization and Energy Management of Hybrid Standalone Energy System
Authors: T. M. Tawfik, M. A. Badr, E. Y. El-Kady, O. E. Abdellatif
Abstract:
Electric power shortage is a serious problem in remote rural communities in Egypt. Over the past few years, the electrification of remote communities, including efficient on-site energy resource utilization, has achieved great progress. Remote communities are usually fed from diesel generator (DG) networks because they need reliable energy and cheap fresh water. The main objective of this paper is to design an optimal and economical power supply from a hybrid standalone energy system (HSES) as an alternative energy source. It covers the energy requirements of a reverse osmosis desalination unit (DU) located at the National Research Centre farm in Noubarya, Egypt. The proposed system consists of PV panels, wind turbines (WT), batteries, and a DG as a backup for supplying the DU load of 105.6 kWh/day, with a 6.6 kW peak load operating 16 hours a day. The optimization objective for the HSES is to select the suitable size of each of the system components and the control strategy that provides a reliable, efficient, and cost-effective system, using net present cost (NPC) as the criterion. The harmonization of different energy sources, energy storage, and load requirements is a difficult and challenging task. Thus, the performance of the various available configurations is investigated economically and technically using iHOGA software, which is based on a genetic algorithm (GA). The achieved optimum configuration is further modified by optimizing the energy extracted from renewable sources. Effective minimization of the energy charging the battery ensures that most of the generated energy directly supplies the demand, increasing the utilization of the generated energy. Keywords: energy management, hybrid system, renewable energy, remote area, optimization
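Net present cost, the selection criterion above, adds the capital cost to each year's operating cost discounted back to the present, minus any discounted salvage value. A minimal sketch with assumed example figures, not the study's data:

```python
def net_present_cost(capital, annual_cost, salvage, rate, years):
    # NPC = capital + discounted annual O&M/fuel costs - discounted salvage
    npc = capital
    for t in range(1, years + 1):
        npc += annual_cost / (1 + rate) ** t
    npc -= salvage / (1 + rate) ** years
    return npc

# Compare two hypothetical configurations over a 20-year project at 6% discount
diesel_only = net_present_cost(capital=5_000, annual_cost=9_000, salvage=0,
                               rate=0.06, years=20)
hybrid = net_present_cost(capital=60_000, annual_cost=2_000, salvage=8_000,
                          rate=0.06, years=20)
print(diesel_only > hybrid)  # high fuel cost can outweigh the hybrid's capital cost
```

Tools like iHOGA search over component sizes and control strategies for the configuration minimizing exactly this kind of lifetime figure.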
Procedia PDF Downloads 199
696 Simulation, Optimization, and Analysis Approach of Microgrid Systems
Authors: Saqib Ali
Abstract:
Energy sources are classified into two types depending upon whether they can be replenished. Those sources which cannot be restored to their original form once they are consumed are considered nonrenewable energy resources (e.g., coal, fuel), whereas those which are replenished even after being consumed are known as renewable energy resources (e.g., wind, solar, hydel). Renewable energy is a cost-effective way to generate clean and green electrical energy, and nowadays the majority of countries are paying heed to energy generation from RES. Pakistan mostly relies on conventional energy resources, which are largely nonrenewable in nature; coal and fuel are among the major resources, and with time their prices are increasing. On the other hand, RES have great potential in the country, and with the deployment of RES, greater reliability and a more effective power system can be obtained. In this thesis, a similar concept is used, and a hybrid power system is proposed which is composed of a mix of renewable and nonrenewable sources. The source side is composed of solar, wind, and fuel cells, which will be used in an optimal manner to serve the load. The goal is to provide an economical, reliable, uninterruptible power supply. This is achieved by an optimal controller (PI, PD, PID, FOPID). Optimization techniques are applied to the controllers to achieve the desired results; advanced algorithms (particle swarm optimization, the flower pollination algorithm) will be used to extract the desired output from the controller. A detailed comparison in the form of tables and results will be provided, which will highlight the efficiency of the proposed system. Keywords: distributed generation, demand-side management, hybrid power system, micro grid, renewable energy resources, supply-side management
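Particle swarm optimization, one of the algorithms mentioned for tuning the controllers, can be sketched in a few lines. The objective below is a stand-in for a real controller cost (e.g., integral squared error of the closed loop); the optimum at (2, 0.5) and all gains are arbitrary assumptions for illustration:

```python
import random

def pso(objective, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    # Minimal particle swarm optimizer: each particle tracks its personal
    # best, and the swarm shares a global best that attracts all particles.
    random.seed(1)
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for a controller-tuning cost over gains (Kp, Ki)
cost = lambda g: (g[0] - 2.0) ** 2 + (g[1] - 0.5) ** 2
gains, best = pso(cost, dim=2)
print(best < 1e-2)
```

In controller tuning, `objective` would simulate the microgrid with the candidate gains and return a performance index, so no gradient of the plant model is required.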
Procedia PDF Downloads 97
695 Multi-Stage Classification for Lung Lesion Detection on CT Scan Images Applying Medical Image Processing Technique
Authors: Behnaz Sohani, Sahand Shahalinezhad, Amir Rahmani, Aliyu Aliyu
Abstract:
Recently, medical imaging, and specifically medical image processing, is becoming one of the most dynamically developing areas of medical science. It has led to the emergence of new approaches to the prevention, diagnosis, and treatment of various diseases. In the process of diagnosing lung cancer, medical professionals rely on computed tomography (CT) scans, in which failure to correctly identify masses can lead to incorrect diagnosis or sampling of lung tissue. The identification and demarcation of masses in terms of detecting cancer within lung tissue are critical challenges in diagnosis. In this work, a segmentation system based on image processing techniques has been applied for detection purposes. In particular, the use and validation of a novel lung cancer detection algorithm have been presented through simulation. This has been performed on CT images using multilevel thresholding. The proposed technique consists of segmentation, feature extraction, feature selection, and classification; the features carrying useful information are selected after feature extraction. Eventually, the output image of lung cancer is obtained with accuracies of 96.3% and 87.25%. The purpose of feature extraction in the proposed approach is to transform the raw data into a more usable form for subsequent statistical processing. Future steps will involve employing the current feature extraction method to achieve more accurate resulting images, including further details available to machine vision systems to recognise objects in lung CT scan images. Keywords: lung cancer detection, image segmentation, lung computed tomography (CT) images, medical image processing
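Multilevel thresholding generalizes Otsu's classic method, which picks the gray level that maximizes the between-class variance of the image histogram. A single-threshold sketch on a toy bimodal histogram (illustrative only, not the authors' pipeline):

```python
def otsu_threshold(hist):
    # Otsu's method: choose the threshold t that maximizes the
    # between-class variance w0 * w1 * (m0 - m1)^2 of the histogram.
    total = sum(hist)
    grand_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = cum = 0.0
    for t in range(len(hist) - 1):
        w0 += hist[t]          # pixels at or below t
        cum += t * hist[t]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = cum / w0, (grand_sum - cum) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Toy histogram: dark background around level 2, brighter tissue around level 12
hist = [0, 5, 20, 5, 0, 0, 0, 0, 0, 0, 2, 8, 15, 8, 2, 0]
print(otsu_threshold(hist))  # 3
```

Multilevel variants search for several thresholds jointly under the same variance criterion, separating background, lung parenchyma, and candidate lesions.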
Procedia PDF Downloads 101
694 Evidence Theory Based Emergency Multi-Attribute Group Decision-Making: Application in Facility Location Problem
Authors: Bidzina Matsaberidze
Abstract:
It is known that, in emergency situations, multi-attribute group decision-making (MAGDM) models are characterized by insufficient objective data and a lack of time to respond to the task. Evidence theory is an effective tool for describing such incomplete information in decision-making models when the expert and his knowledge are involved in the estimation of the MAGDM parameters. We consider an emergency decision-making model where expert assessments of humanitarian aid from distribution centers (HADC) are represented as q-rung orthopair fuzzy numbers, and the data structure is described within the theory of bodies of evidence. Based on focal probability construction and the experts' evaluations, an objective function, a distribution centers' selection ranking index, is constructed. Our approach to solving the constructed bicriteria partitioning problem consists of two phases. In the first phase, based on the covering matrix, we generate a matrix whose columns allow us to find all possible partitionings of the HADCs among the service centers; some constraints are also taken into consideration while generating the matrix. In the second phase, based on this matrix and using our exact algorithm, we find the partitionings (allocations of the HADCs to the centers) that correspond to the Pareto-optimal solutions. For an illustration of the obtained results, a numerical example is given for the facility location-selection problem. Keywords: emergency MAGDM, q-rung orthopair fuzzy sets, evidence theory, HADC, facility location problem, multi-objective combinatorial optimization problem, Pareto-optimal solutions
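The Pareto-optimal partitionings of the second phase are, by definition, those not dominated in both criteria simultaneously. A minimal non-dominance filter for a (minimize, minimize) bicriteria problem, with made-up candidate objective values standing in for partitionings:

```python
def pareto_front(solutions):
    # Keep a solution unless some other solution is at least as good in
    # both objectives (minimization) and differs from it.
    front = []
    for s in solutions:
        dominated = any(o != s and o[0] <= s[0] and o[1] <= s[1]
                        for o in solutions)
        if not dominated:
            front.append(s)
    return front

# Hypothetical (cost, risk) values of candidate partitionings
candidates = [(3, 5), (4, 4), (5, 3), (4, 6), (6, 6)]
print(pareto_front(candidates))  # [(3, 5), (4, 4), (5, 3)]
```

The exact algorithm in the paper enumerates feasible partitionings via the generated matrix; a filter of this kind then keeps only the non-dominated ones for the decision-maker.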
Procedia PDF Downloads 92
693 Sustainability from Ecocity to Ecocampus: An Exploratory Study on Spanish Universities' Water Management
Authors: Leyla A. Sandoval Hamón, Fernando Casani
Abstract:
Sustainability has been integrated into cities' agendas due to the impact that cities generate. The dimensions of sustainability most commonly taken as a reference are economic, social, and environmental; thus, management decisions in sustainable cities seek a balance between these dimensions in order to provide environment-friendly alternatives. In this context, urban models that have emerged in harmony with the environment (managing water consumption, energy consumption, and waste production, among others) are known as Ecocities. A similar model, but on a smaller scale, is the 'Ecocampus', developed in universities (considered 'small cities' due to their complex structure). Sustainable practices are thus being implemented in the management of university campus activities, following different relevant lines of work. Universities have a strategic role in society, and their activities can strengthen policies, strategies, and measures of sustainability, both internal and external to the organization. Because of their mission in knowledge creation and transfer, these institutions can promote and disseminate more advanced activities in sustainability. Replicating this model also implies challenges in the sustainable management of water, energy, waste, and transportation, among others, inside the campus. The challenge this paper focuses on is water management, taking into account that universities consume large amounts of this resource. The purpose of this paper is to analyze the sustainability experience, with emphasis on water management, of two different campuses belonging to two different Spanish universities: one urban campus in a historic city and one suburban campus on the outskirts of a large city. Both universities are in the top hundred of international rankings of sustainable universities.
The methodology adopts a qualitative method based on the technique of in-depth interviews and focus-group discussions with administrative and academic staff of the 'Ecocampus' offices, the organizational units for sustainability management, at the two Spanish universities. The hypotheses indicate that sustainability policies in terms of water management work best on campuses without large green spaces and where the buildings are built or rebuilt in a modern style. The sustainability efforts of the university are independent of the kind of campus (urban or suburban), but an important aspect to improve is the degree of awareness of the university community about water scarcity. In general, the paper suggests that higher education institutions adapt their sustainability policies depending on the location and features of the campus and their engagement with water conservation. Many Spanish universities have proposed policies, good practices, and measures of sustainability; in fact, several Ecocampus offices or centers have been founded. The originality of this study is to learn from the different sustainability policy experiences of universities. Keywords: ecocampus, ecocity, sustainability, water management
Procedia PDF Downloads 221
692 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria
Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov
Abstract:
This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model
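The final prediction step — classifying admission status from the SCM-validated features — can be illustrated with a minimal k-nearest neighbors classifier. This is a hedged sketch on made-up feature vectors, not the authors' pipeline: the feature names and values below are hypothetical.

```python
from collections import Counter
import math

def knn_predict(train, labels, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    nearest = sorted(range(len(train)),
                     key=lambda i: math.dist(train[i], x))[:k]
    votes = Counter(labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical admission features: (screening score, O-level aggregate)
train = [(45, 50), (48, 55), (70, 80), (75, 85), (40, 42), (72, 78)]
labels = ["not admitted", "not admitted", "admitted", "admitted",
          "not admitted", "admitted"]

print(knn_predict(train, labels, (74, 82)))  # → admitted
```

With k = 3, a candidate near the admitted cluster inherits that label by majority vote — the same mechanism the KNN model in the study applies to the features retained after the conditional independence tests.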
Procedia PDF Downloads 64
691 Barnard Feature Point Detector for Low-Contrast Periapical Radiography Image
Authors: Chih-Yi Ho, Tzu-Fang Chang, Chih-Chia Huang, Chia-Yen Lee
Abstract:
In dental clinics, dentists use periapical radiography images to assess the effectiveness of endodontic treatment of teeth with chronic apical periodontitis. Periapical radiography images are taken at different times to assess alveolar bone variation before and after root canal treatment, and furthermore to judge whether the treatment was successful. Current clinical assessment of apical tissue recovery relies only on the dentist's personal experience. It is difficult to obtain standard and objective interpretations, since these depend on the personal background and knowledge of the dentist or radiologist. If periapical radiography images taken at different times could be registered well, the endodontic treatment could be evaluated objectively. In image registration, it is necessary to assign representative control points to the transformation model to obtain good registration results. However, detecting representative control points (feature points) on periapical radiography images is generally very difficult: regardless of which traditional detection methods are applied, sufficient feature points may not be detected due to the low-contrast characteristics of the x-ray image. The Barnard detector is a feature point detection algorithm based on grayscale value gradients, which can obtain sufficient feature points even when the gray-scale contrast is not obvious. However, the Barnard detector detects too many feature points, and they tend to be too clustered. This study uses the local extrema of clustered feature points and a suppression radius to overcome this problem, and compares different feature point detection methods. In the preliminary results, the feature points detected by the proposed method could serve as representative control points.Keywords: feature detection, Barnard detector, registration, periapical radiography image, endodontic treatment
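The gradient-plus-suppression idea can be sketched as follows. This is not the actual Barnard detector but a simplified illustration of the two ingredients the abstract relies on — candidate points ranked by grayscale gradient magnitude, and a suppression radius that thins out clustered detections. The toy image and thresholds are invented.

```python
def gradient_magnitude(img):
    """Central-difference grayscale gradient magnitude (interior pixels)."""
    h, w = len(img), len(img[0])
    g = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            g[y][x] = (gx * gx + gy * gy) ** 0.5
    return g

def detect(img, radius=2, threshold=1.0):
    """Keep the strongest candidates, suppressing any point closer than
    `radius` to an already accepted one (thins out clustered detections)."""
    g = gradient_magnitude(img)
    cands = sorted(((g[y][x], x, y)
                    for y in range(len(g)) for x in range(len(g[0]))
                    if g[y][x] >= threshold), reverse=True)
    kept = []
    for _, x, y in cands:
        if all((x - kx) ** 2 + (y - ky) ** 2 > radius ** 2 for kx, ky in kept):
            kept.append((x, y))
    return kept

# Toy low-contrast "radiograph": a faint bright patch on a dark background
img = [[0.0] * 8 for _ in range(8)]
for y in range(3, 6):
    for x in range(3, 6):
        img[y][x] = 4.0
points = detect(img)
print(points)
```

Even with a weak intensity step, the gradient ranking finds the patch boundary, and the suppression radius guarantees the surviving points are spread out rather than clustered.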
Procedia PDF Downloads 442
690 Unsupervised Echocardiogram View Detection via Autoencoder-Based Representation Learning
Authors: Andrea Treviño Gavito, Diego Klabjan, Sanjiv J. Shah
Abstract:
Echocardiograms serve as pivotal resources for clinicians in diagnosing cardiac conditions, offering non-invasive insights into a heart’s structure and function. When echocardiographic studies are conducted, no standardized labeling of the acquired views is performed. Employing machine learning algorithms for automated echocardiogram view detection has emerged as a promising solution to enhance efficiency in echocardiogram use for diagnosis. However, existing approaches predominantly rely on supervised learning, necessitating labor-intensive expert labeling. In this paper, we introduce a fully unsupervised echocardiographic view detection framework that leverages convolutional autoencoders to obtain lower dimensional representations and the K-means algorithm for clustering them into view-related groups. Our approach focuses on discriminative patches from echocardiographic frames. Additionally, we propose a trainable inverse average layer to optimize decoding of average operations. By integrating both public and proprietary datasets, we obtain a marked improvement in model performance when compared to utilizing a proprietary dataset alone. Our experiments show boosts of 15.5% in accuracy and 9.0% in the F-1 score for frame-based clustering, and 25.9% in accuracy and 19.8% in the F-1 score for view-based clustering. Our research highlights the potential of unsupervised learning methodologies and the utilization of open-sourced data in addressing the complexities of echocardiogram interpretation, paving the way for more accurate and efficient cardiac diagnoses.Keywords: artificial intelligence, echocardiographic view detection, echocardiography, machine learning, self-supervised representation learning, unsupervised learning
Procedia PDF Downloads 32
689 An Amended Method for Assessment of Hypertrophic Scars Viscoelastic Parameters
Authors: Iveta Bryjova
Abstract:
Recording of viscoelastic strain-vs-time curves with the aid of the suction method, and a follow-up analysis resulting in the evaluation of standard viscoelastic parameters, is a significant technique for non-invasive contact diagnostics of the mechanical properties of skin and assessment of its condition, particularly in acute burns, hypertrophic scarring (the most common complication of burn trauma) and reconstructive surgery. To eliminate the contribution of skin thickness, the usable viscoelastic parameters deduced from the strain-vs-time curves are restricted to relative ones (i.e. those expressed as a ratio of two dimensional parameters), such as gross elasticity, net elasticity, biological elasticity or Qu's area parameters, conventionally referred to in the literature and in practice as R2, R5, R6, R7, Q1, Q2 and Q3. With the exception of parameters R2 and Q1, the remaining ones depend substantially on the position of the inflection point separating the linear elastic and viscoelastic segments of the strain-vs-time curve. The standard algorithm implemented in commercially available devices relies heavily on the experimental observation that the inflection occurs about 0.1 s after the suction switch-on/off, which undermines the credibility of parameters thus obtained. Although Qu's US 7,556,605 patent suggests a method of improving the precision of the inflection determination, there is still room for non-negligible improvement. In this contribution, a novel method of inflection point determination utilizing the advantageous properties of Savitzky–Golay filtering is presented. The method allows computation of derivatives of the smoothed strain-vs-time curve, more exact location of the inflection and consequently more reliable values of the aforementioned viscoelastic parameters. 
An improved applicability of the five inflection-dependent relative viscoelastic parameters is demonstrated by recasting a former study under the new method, and by comparing its results with those provided by the methods that have been used so far.Keywords: Savitzky–Golay filter, scarring, skin, viscoelasticity
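The inflection-finding idea can be sketched as follows: fit a quadratic in each sliding window (the Savitzky–Golay principle) to obtain smoothed derivatives, then locate the inflection where the smoothed curvature is most negative. The synthetic strain curve and all parameters below are illustrative assumptions, not data or settings from the study.

```python
import numpy as np

def sg_derivatives(y, half=2):
    """Quadratic fit in each sliding window (Savitzky-Golay style);
    returns smoothed 1st and 2nd derivatives at the window centre,
    expressed per sample."""
    u = np.arange(-half, half + 1)
    d1, d2 = [], []
    for i in range(half, len(y) - half):
        c2, c1, _ = np.polyfit(u, y[i - half:i + half + 1], 2)
        d1.append(c1)        # dy/du at the centre
        d2.append(2 * c2)    # d2y/du2 at the centre
    return np.array(d1), np.array(d2)

# Synthetic strain-vs-time curve: a linear (elastic) ramp that bends into
# a saturating (viscoelastic) segment at t = 1.0 s, with matched slope.
t = np.arange(0, 3, 0.05)
strain = np.where(t < 1.0, t, 1.0 + 0.4 * (1 - np.exp(-(t - 1.0) / 0.4)))

d1, d2 = sg_derivatives(strain)
inflection = t[2 + int(np.argmin(d2))]  # most negative curvature ≈ the bend
print(float(inflection))
```

Because the derivatives come from the fitted polynomial rather than raw differences, the estimated inflection is insensitive to measurement noise — the property the amended method exploits instead of assuming a fixed 0.1 s offset.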
Procedia PDF Downloads 304
688 Analysing Trends in Rice Cropping Intensity and Seasonality across the Philippines Using 14 Years of Moderate Resolution Remote Sensing Imagery
Authors: Bhogendra Mishra, Andy Nelson, Mirco Boschetti, Lorenzo Busetto, Alice Laborte
Abstract:
Rice is grown on over 100 million hectares in almost every country of Asia. It is the most important staple crop for food security and has high economic and cultural importance in Asian societies. The combination of genetic diversity and management options, coupled with the large geographic extent means that there is a large variation in seasonality (when it is grown) and cropping intensity (how often it is grown per year on the same plot of land), even over relatively small distances. Seasonality and intensity can and do change over time depending on climatic, environmental and economic factors. Detecting where and when these changes happen can provide information to better understand trends in regional and even global rice production. Remote sensing offers a unique opportunity to estimate these trends. We apply the recently published PhenoRice algorithm to 14 years of moderate resolution remote sensing (MODIS) data (utilizing 250m resolution 16 day composites from Terra and Aqua) to estimate seasonality and cropping intensity per year and changes over time. We compare the results to the surveyed data collected by International Rice Research Institute (IRRI). The study results in a unique and validated dataset on the extent and change of extent, the seasonality and change in seasonality and the cropping intensity and change in cropping intensity between 2003 and 2016 for the Philippines. Observed trends and their implications for food security and trade policies are also discussed.Keywords: rice, cropping intensity, moderate resolution remote sensing (MODIS), phenology, seasonality
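The notion of cropping intensity — how many rice seasons a pixel supports per year — can be illustrated with a toy peak counter on a vegetation-index time series. This is a crude stand-in for what PhenoRice does far more robustly; the 16-day composite values, threshold and gap are invented.

```python
def count_seasons(vi, threshold=0.5, min_gap=6):
    """Count peaks (≈ rice seasons) in a vegetation-index time series:
    a peak is a local maximum above `threshold`, at least `min_gap`
    composites after the previously detected peak."""
    peaks = []
    for i in range(1, len(vi) - 1):
        if vi[i] > threshold and vi[i] >= vi[i - 1] and vi[i] > vi[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return peaks

# Hypothetical year of 16-day composites (23 values): two cropping seasons
vi = [0.2, 0.25, 0.4, 0.6, 0.75, 0.7, 0.5, 0.3, 0.2, 0.2, 0.25, 0.3,
      0.5, 0.7, 0.8, 0.75, 0.6, 0.4, 0.3, 0.25, 0.2, 0.2, 0.2]
print(len(count_seasons(vi)))  # → 2
```

The timing of the detected peaks gives the seasonality; the count per year gives the intensity, and tracking both per pixel over 2003-2016 is what yields the trend maps described above.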
Procedia PDF Downloads 306
687 Machine Learning Prediction of Compressive Damage and Energy Absorption in Carbon Fiber-Reinforced Polymer Tubular Structures
Authors: Milad Abbasi
Abstract:
Carbon fiber-reinforced polymer (CFRP) composite structures are increasingly utilized in the automotive industry due to their light weight and specific energy absorption capabilities. Since composite mechanical properties cannot be predicted directly using theoretical methods, various studies have been conducted in the literature on the accurate simulation of the energy-absorbing behavior of CFRP structures. In this research, axial compression experiments were carried out on hand lay-up unidirectional CFRP composite tubes. The fabrication method allowed the authors to extract the material properties of the CFRPs using the ASTM D3039, D3410, and D3518 standards. A neural network machine learning algorithm was then utilized to build a robust prediction model that forecasts the axial compressive properties of CFRP tubes while reducing high-cost experimental efforts. The predicted results were compared with the experimental outcomes in terms of load-carrying capacity and energy absorption capability. The results showed high accuracy and precision in the prediction of the energy-absorption capacity of the CFRP tubes. This research also demonstrates the effectiveness and challenges of machine learning techniques in the robust simulation of composites' energy-absorption behavior. Notably, the proposed method considerably condensed the numerical and experimental effort needed in the simulation and calibration of CFRP composite tubes subjected to compressive loading.Keywords: CFRP composite tubes, energy absorption, crushing behavior, machine learning, neural network
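The energy-absorption figure used to compare prediction and experiment is the area under the crush load-displacement curve, often normalized by crushed mass to give the specific energy absorption (SEA). A minimal sketch with made-up test data — the force record and the 50 g crushed mass are hypothetical, not values from the study:

```python
def energy_absorbed(force_kN, disp_mm):
    """EA = trapezoidal integral of the load-displacement curve (kN*mm = J)."""
    ea = 0.0
    for i in range(1, len(disp_mm)):
        ea += 0.5 * (force_kN[i] + force_kN[i - 1]) * (disp_mm[i] - disp_mm[i - 1])
    return ea

# Hypothetical quasi-static crush record of a CFRP tube
disp = [0, 10, 20, 30, 40]     # crush displacement, mm
force = [0, 60, 50, 55, 52]    # axial load, kN
ea = energy_absorbed(force, disp)
sea = ea / 50.0                # specific EA, J/g, assuming 50 g crushed mass
print(ea, round(sea, 2))  # → 1910.0 38.2
```

A trained model that predicts the force history (or EA directly) from layup and geometry can therefore be scored against experiment with exactly this integral.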
Procedia PDF Downloads 153
686 Suppressing Vibration in a Three-axis Flexible Satellite: An Approach with Composite Control
Authors: Jalal Eddine Benmansour, Khouane Boulanoir, Nacera Bekhadda, Elhassen Benfriha
Abstract:
This paper introduces a novel composite control approach that addresses the challenge of stabilizing the three-axis attitude of a flexible satellite in the presence of vibrations caused by flexible appendages. The key contribution of this research lies in the development of a disturbance observer, which effectively observes and estimates the unwanted torques induced by the vibrations. By utilizing the estimated disturbance, the proposed approach enables efficient compensation for the detrimental effects of vibrations on the satellite system. To govern the attitude angles of the spacecraft, a proportional derivative controller (PD) is specifically designed and proposed. The PD controller ensures precise control over all attitude angles, facilitating stable and accurate spacecraft maneuvering. In order to demonstrate the global stability of the system, the Lyapunov method, a well-established technique in control theory, is employed. Through rigorous analysis, the Lyapunov method verifies the convergence of system dynamics, providing strong evidence of system stability. To evaluate the performance and efficacy of the proposed control algorithm, extensive simulations are conducted. The simulation results validate the effectiveness of the combined approach, showcasing significant improvements in the stabilization and control of the satellite's attitude, even in the presence of disruptive vibrations from flexible appendages. This novel composite control approach presented in this paper contributes to the advancement of satellite attitude control techniques, offering a promising solution for achieving enhanced stability and precision in challenging operational environments.Keywords: attitude control, flexible satellite, vibration control, disturbance observer
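The composite idea — a PD attitude law plus a disturbance observer that estimates and cancels the appendage-induced torque — can be sketched for a single axis with rigid dynamics. All gains, the inertia and the constant disturbance below are illustrative assumptions, not the paper's design.

```python
def simulate(Kp=4.0, Kd=3.0, L=10.0, J=1.0, dt=0.001, T=20.0):
    """Single-axis sketch: PD attitude control plus a disturbance observer
    that estimates and cancels a constant appendage-induced torque."""
    theta, omega, d_hat = 0.5, 0.0, 0.0   # initial attitude error (rad)
    d = 0.2                               # true (unknown) disturbance, N*m
    for _ in range(int(T / dt)):
        u = -Kp * theta - Kd * omega - d_hat  # PD law + compensation
        alpha = (u + d) / J                   # true angular acceleration
        # Observer: attribute the gap between the measured acceleration
        # and the modelled one to the disturbance torque.
        d_hat += L * (alpha - (u + d_hat) / J) * dt
        omega += alpha * dt
        theta += omega * dt
    return theta, d_hat

theta_final, d_est = simulate()
print(theta_final, d_est)
```

The observer error obeys a first-order decay toward the true torque, after which the PD loop sees an effectively disturbance-free plant — the same separation the Lyapunov analysis in the paper formalizes for the full three-axis flexible case.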
Procedia PDF Downloads 86
685 Torn Between the Lines of Border: The Pakhtuns of Pakistan and Afghanistan in Search of Identity
Authors: Priyanka Dutta Chowdhury
Abstract:
A globalized, connected world, calling loudly for a composite culture, has still not been able to erase the pain of a desired nationalism based on cultural identity. In the South Asian region, the arbitrary drawing of boundaries without taking the ethnic aspect into consideration has always challenged the very basis of the existence of certain groups. The urge to reunify with fellow brothers on both sides of the border has repeatedly produced chaos and schism in the countries of this region. Sometimes this became a tool to bargain with the state and find a favorable position in the power structure on the basis of cultural identity. In Pakistan and Afghanistan, the Pakhtuns, who are divided across the border of the two countries, have posed various challenges from the inception of the creation of Pakistan and hampered the growth of a consolidated nation. The Pakhtuns or Pashtuns of both Pakistan and Afghanistan have a strong cultural affinity which blurs their physical distance and calls for a nationalism based on this ethnic affiliation. Both sides wanted to create Pakhtunistan, unifying all the Pakhtuns of the region. For long, this group refused to accept the Durand Line separating the two. This was an area of concern especially for the Pakhtuns of Pakistan, torn between the decision either to join Afghanistan, create a nation of their own, or be a part of Pakistan. This ethnic issue became a bone of contention between the two countries. Later, though well absorbed and recognized in their respective countries, they fought for their identity and claimed a dominant position in the politics of the nations. Because of the porous borders there was often an influx of refugees, especially during the Afghan wars, and many extremist groups were born from them, especially the Taliban. In the recent string of events, when the Taliban, who are mostly ethnically Pakhtun, came to power in Afghanistan, a wave of sympathy arose in Pakistan. 
This gave a strengthened position to the religious Pakhtuns across the border. It is to be noted here that a fragmentation of Pakhtun identity between the religious and the secular is clearly visible, each voicing its claim to a place in the political hierarchy of the country with a vision distinct from the other, especially in Pakistan. In this context, the paper tries to evaluate the reasons for this cultural turmoil between the countries and this ethnic group. It also aims to analyze how identity politics still holds its relevance in the contemporary world. Additionally, the recent trend of fragmented identity points towards the instrumentalization of this ethnic group, which is engaged in a bargaining process with the state for a robust position in the power structure. In the end, the paper aims to deduce from the theoretical conditions of identity politics whether this is a primordial or a situational tool for gaining visibility in the power structure of the contemporary world.Keywords: cultural identity, identity politics, instrumentalization of identity, pakhtuns, power structure
Procedia PDF Downloads 82
684 Compressed Natural Gas (CNG) Injector Research for Dual Fuel Engine
Authors: Adam Majczak, Grzegorz Barański, Marcin Szlachetka
Abstract:
Environmental considerations necessitate the search for new energy sources. One of the available solutions is a partial replacement of diesel fuel by compressed natural gas (CNG) in compression ignition engines. This type of engine is used mainly in vans and trucks, and such units are also gaining more and more popularity in the passenger car market; in Europe, their market share reaches 50%. Diesel engines are also used in industry, in vehicles such as ships or locomotives. Diesel engines have higher emissions of nitrogen oxides in comparison to spark ignition engines. This can currently be limited by optimizing the combustion process and by using additional systems such as exhaust gas recirculation or AdBlue technology. The combustion of diesel fuel also emits particulate matter (PM) that is harmful to human health; its emission is limited by the use of a particulate filter. One method of reducing toxic component emissions is the use of liquid gas fuel such as propane and butane (LPG) or compressed natural gas (CNG). In addition to the environmental aspects, there are also economic reasons for the use of gaseous fuels to power diesel engines. A total or partial replacement of diesel fuel is possible. Depending on the technology used and the percentage of diesel fuel replaced, it is possible to reduce the content of nitrogen oxides in the exhaust gas by up to 30%, particulate matter (PM) by 95%, and carbon monoxide by 20%, in relation to the original diesel fuel. The research object is a prototype gas injector designed for direct injection of compressed natural gas (CNG) in compression ignition engines. The construction of the injector allows for its positioning in the glow plug socket, so that the gas is injected directly into the combustion chamber. 
The cycle analysis of the four-cylinder Andoria ADCR engine, with a capacity of 2.6 dm3, for different crankshaft rotational speeds made it possible to determine the time available for fuel injection. From this, the required mass flow rate of the injector could be determined, so as to replace as much of the original fuel as possible with gaseous fuel. To ensure a high flow rate inside the injector, a supply pressure of 1 MPa was applied. High gas supply pressure requires high valve-opening forces; for this purpose, an injector with a hydraulic control system, using a liquid under pressure for the opening process, was designed. On the basis of air pressure measurements in the flow line after the injector, an analysis of the opening and closing of the valve was performed. Measurements of the outflow mass of the injector were also carried out. The results showed that the designed injector meets the requirements necessary to supply the ADCR engine with CNG fuel.Keywords: CNG, diesel engine, gas flow, gas injector
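The flow-rate requirement derived from the cycle analysis follows from simple timing: a four-stroke cycle spans 720 degrees of crankshaft rotation, and only part of it is available for injection. A sketch with hypothetical numbers — the fuel mass per cycle and the injection window are illustrative, not the paper's measured values:

```python
def required_flow(rpm, fuel_mass_mg, window_deg=180.0):
    """Per-cylinder gas mass flow the injector must deliver: one
    four-stroke cycle spans 720 crank degrees, and injection may only
    use `window_deg` of them."""
    cycle_s = 2 * 60.0 / rpm                  # duration of one 720-deg cycle
    inj_s = cycle_s * window_deg / 720.0      # available injection time
    return fuel_mass_mg / inj_s               # required flow, mg/s

print(round(required_flow(4000, 30.0), 1))  # → 4000.0 mg/s at 4000 rpm
```

The available window shrinks linearly with engine speed, which is why the design targets a high flow rate (and hence the 1 MPa supply pressure and hydraulically assisted valve opening).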
Procedia PDF Downloads 493
683 Prenatal Genetic Screening and Counselling Competency Challenges of Nurse-Midwife
Authors: Girija Madhavanprabhakaran, Frincy Franacis, Sheeba Elizabeth John
Abstract:
Introduction: A wide range of prenatal genetic screening has been introduced, with increasing incidences of congenital anomalies even in low-risk pregnancies, and is becoming an emerging standard of care. As frontline caretakers, the roles and responsibilities of nurses and midwives are critical, as they work along with couples to provide evidence-based supportive educative care. The increase in genetic disorders and the advances in prenatal genetic screening, combined with limited genetic counselling facilities, require nurses and midwives to have the essential competencies to help couples make informed decisions. Objective: This integrative literature review aimed to explore nurse-midwives' knowledge and role in prenatal screening and genetic counselling competency, and the challenges they face in catering to all pregnant women, empowering their autonomy in decision making and ensuring psychological comfort. Method: An electronic search using the keywords prenatal screening, genetic counselling, prenatal counselling, nurse midwife, nursing education, genetics, and genomics was done in PUBMED, SCOPUS, Medline and Google Scholar. Finally, based on the inclusion criteria, 8 relevant articles were included. Results: The main review results suggest that nurses and midwives lack the essential support, knowledge, or confidence to be able to provide genetic counselling and to help couples ethically, ensuring client autonomy and decision making. The majority of nurses and midwives reported inadequate levels of knowledge of genetic screening and of their roles in obtaining family history, pedigrees, and providing genetic information for an affected client or high-risk families. A deficiency of well-recognized and influential clinical academic midwives in midwifery practice was also reported. The evidence recommends updated and sound educational training to improve nurse-midwife competence and confidence. 
Conclusion: Overcoming the challenges to achieving informed choices about fetal anomaly screening globally is a major concern. Lack of adequate knowledge and counselling competency, insufficient communication, and the need for education and policy are the major areas to address. Prenatal nurses' and midwives' knowledge of prenatal genetic screening and essential counselling competencies can help the majority of pregnant women around the globe become better-informed decision-makers, enhance their autonomy, and reduce ethical dilemmas.Keywords: challenges, genetic counselling, prenatal screening, prenatal counselling
Procedia PDF Downloads 200
682 Harmonic Assessment and Mitigation in Medical Diagnosis Equipment
Authors: S. S. Adamu, H. S. Muhammad, D. S. Shuaibu
Abstract:
Poor power quality in electrical power systems can cause medical equipment at healthcare centres to malfunction and present wrong medical diagnoses. Equipment such as X-ray machines, computerized axial tomography scanners, etc. can pollute the system due to their high level of harmonics production, which may cause a number of undesirable effects like heating, equipment damage and electromagnetic interference. The conventional mitigation approach uses passive inductor/capacitor (LC) filters, which have drawbacks such as large size, resonance problems and fixed compensation behaviour. Current solutions generally employ active power filters using suitable control algorithms. This work focuses on assessing the level of Total Harmonic Distortion (THD) in medical facilities and on various ways of mitigating it, using the radiology unit of an existing hospital as a case study. The harmonics are measured with a power quality analyzer at the point of common coupling (PCC). The levels of measured THD are found to be higher than the IEEE 519-1992 standard limits. The system is then modelled as a harmonic current source using MATLAB/SIMULINK. To mitigate the unwanted harmonic currents, a shunt active filter is developed using a synchronous detection algorithm to extract the fundamental component of the source currents. A fuzzy logic controller is then developed to control the filter. The THD values without the active power filter are validated against the measured values. The THD values with the developed filter show that the harmonics are now within the recommended limits.Keywords: power quality, total harmonics distortion, shunt active filters, fuzzy logic
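The synchronous detection idea — correlating the signal with sine and cosine references at each frequency of interest to extract component amplitudes — also gives a direct way to compute THD. A sketch on a synthetic waveform (a 230 V fundamental plus a 10% fifth harmonic, both chosen arbitrarily):

```python
import math

def component_amplitude(signal, fs, f):
    """Synchronous detection: correlate with sin/cos at frequency f to
    recover that component's amplitude (exact over whole cycles)."""
    n = len(signal)
    s = sum(v * math.sin(2 * math.pi * f * i / fs) for i, v in enumerate(signal))
    c = sum(v * math.cos(2 * math.pi * f * i / fs) for i, v in enumerate(signal))
    return 2 * math.hypot(s, c) / n

def thd(signal, fs, f0, nharm=10):
    """THD = sqrt(sum of squared harmonic amplitudes) / fundamental amplitude."""
    a1 = component_amplitude(signal, fs, f0)
    harm = [component_amplitude(signal, fs, k * f0) for k in range(2, nharm + 1)]
    return math.sqrt(sum(a * a for a in harm)) / a1

fs, f0 = 5000, 50
n = (fs // f0) * 10                    # exactly 10 fundamental cycles
sig = [230 * math.sin(2 * math.pi * f0 * i / fs)
       + 23 * math.sin(2 * math.pi * 5 * f0 * i / fs) for i in range(n)]
print(round(thd(sig, fs, f0), 3))  # → 0.1
```

The same correlation, applied at the fundamental only, is what lets a shunt active filter isolate the fundamental component of the source currents and inject the difference as a compensating current.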
Procedia PDF Downloads 479
681 Research on Level Adjusting Mechanism System of Large Space Environment Simulator
Authors: Han Xiao, Zhang Lei, Huang Hai, Lv Shizeng
Abstract:
A space environment simulator is a device for spacecraft testing. The KM8 large space environment simulator built in Tianjin Space City is the largest as well as the most advanced space environment simulator in China. A large deviation of the spacecraft level will lead to abnormal operation of the thermal control devices in the spacecraft during the thermal vacuum test. In order to avoid thermal vacuum test failure, a level adjusting mechanism system was developed in the KM8 large space environment simulator as one of its most important subsystems. According to the level adjusting requirements of spacecraft thermal vacuum tests, a four-fulcrum adjusting model is established. By collecting data from level instruments and displacement sensors, stepping motors controlled by a PLC drive the four supporting legs in simultaneous movement. In addition, a PID algorithm is used to control the temperature of the supporting legs and level instruments, which work for long periods in the cold, dark vacuum environment of the KM8 large space environment simulator during thermal vacuum tests. Based on the above methods, the data acquisition and processing, analysis and calculation, real-time adjustment and fault alarming of the level adjusting mechanism system are implemented. The level adjusting accuracy reaches 1 mm/m, and the carrying capacity is 20 tons. Debugging showed that the level adjusting mechanism system of the KM8 large space environment simulator can meet the thermal vacuum test requirements of the new generation of spacecraft. The performance and technical indicators of the level adjusting mechanism system, which provides important support for the development of spacecraft in China, are ahead of similar equipment in the world.Keywords: space environment simulator, thermal vacuum test, level adjusting, spacecraft, parallel mechanism
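The temperature-control loop can be sketched with a discrete PID acting on a first-order thermal model of a supporting leg radiating to the cold shroud. The thermal resistance, capacitance, gains and temperatures below are invented for illustration; they are not the KM8 design values.

```python
def pid_step(sp, pv, state, Kp, Ki, Kd, dt):
    """One discrete PID update; `state` carries the integral and last error."""
    err = sp - pv
    state["i"] += err * dt
    deriv = (err - state["e"]) / dt
    state["e"] = err
    return Kp * err + Ki * state["i"] + Kd * deriv

# First-order thermal model of a supporting leg radiating to the chamber
# shroud (all parameters invented): C*dT/dt = q - (T - T_env)/R
T, T_env, sp = -50.0, -180.0, 20.0   # temperatures, deg C
R, C, dt = 2.0, 50.0, 0.5            # K/W, J/K, s
state = {"i": 0.0, "e": sp - T}
for _ in range(4000):                # 2000 s of simulated time
    q = max(0.0, pid_step(sp, T, state, Kp=40.0, Ki=1.0, Kd=5.0, dt=dt))
    T += (q - (T - T_env) / R) / C * dt
print(round(T, 1))
```

The integral term is what removes the steady-state offset here: without it, the heater power needed to fight the cold shroud would leave the leg permanently below the setpoint.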
Procedia PDF Downloads 247
680 One Step Further: Pull-Process-Push Data Processing
Authors: Romeo Botes, Imelda Smit
Abstract:
In today's age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices, which use different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but it is mostly in an unreadable format which needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical data, and it is not subject to the consistency and redundancy measures that most other data usually is. Most importantly for the users, the data is to be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers make use of various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it and push it back to the database server in one single step. Since the processing of the data usually takes some time, this keeps the database busy and locked for the period that the processing takes place, which decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. 
The purpose of this paper is to expand the number of clients when comparing the two techniques, in order to establish the impact this may have on CPU performance, storage and processing time.Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list
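The three-step technique can be sketched with Python's built-in sqlite3 module standing in for the production database; the telemetry schema and hex payloads are invented. The point is that the decode work in step 2 happens entirely in memory, so the database is only touched briefly in steps 1 and 3 instead of staying locked for the whole run:

```python
import sqlite3

# In-memory stand-in for the telemetry database (schema and payloads invented)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, raw TEXT, decoded INTEGER)")
db.executemany("INSERT INTO readings (raw) VALUES (?)",
               [("3b2a",), ("00ff",), ("1c9d",)])
db.commit()

# Step 1 - pull: read everything into an in-memory list, then let the DB go idle.
rows = db.execute("SELECT id, raw FROM readings").fetchall()

# Step 2 - process: decode outside the database, so no locks are held meanwhile.
decoded = [(int(raw, 16), rid) for rid, raw in rows]

# Step 3 - push: write all results back in one short batched transaction.
db.executemany("UPDATE readings SET decoded = ? WHERE id = ?", decoded)
db.commit()

print(db.execute("SELECT decoded FROM readings ORDER BY id").fetchall())
```

The in-memory list holding the pulled rows plays the role of the array list in the technique: it decouples the slow decode phase from the two short database phases.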
Procedia PDF Downloads 244
679 Decision Support System Based On GIS and MCDM to Identify Land Suitability for Agriculture
Authors: Abdelkader Mendas
Abstract:
The integration of MultiCriteria Decision Making (MCDM) approaches into a Geographical Information System (GIS) provides a powerful spatial decision support system which offers the opportunity to efficiently produce land suitability maps for agriculture. Indeed, GIS is a powerful tool for analyzing spatial data and establishing a process for decision support. Because of their spatial aggregation functions, MCDM methods can facilitate decision making in situations where several solutions are available, various criteria have to be taken into account and decision-makers are in conflict. The parameters and the classification system used in this work are inspired by the FAO (Food and Agriculture Organization) approach dedicated to sustainable agriculture. A spatial decision support system has been developed for establishing the land suitability map for agriculture. It incorporates the multicriteria analysis method ELECTRE Tri (ELimination Et Choix Traduisant la REalité) within a GIS program package environment. The main purpose of this research is to propose a conceptual and methodological framework for the combination of GIS and multicriteria methods in a single coherent system that takes into account the whole process, from the acquisition of spatially referenced data to decision-making. To this end, the algorithm of ELECTRE Tri is incorporated into the GIS environment and added to its other analysis functions. This approach has been tested on an area in Algeria, and a land suitability map for durum wheat has been produced. Through the obtained results, it appears that the ELECTRE Tri method, integrated into a GIS, is well suited to the problem of land suitability for agriculture. 
The coherence of the obtained maps confirms the system's effectiveness.Keywords: multicriteria decision analysis, decision support system, geographical information system, land suitability for agriculture
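The core of an ELECTRE Tri assignment can be sketched in a stripped-down form: each land unit is compared against category limit profiles and placed by a pessimistic rule. This sketch keeps only the weighted concordance test with a lambda cut — real ELECTRE Tri adds indifference, preference and veto thresholds — and the criteria, weights and profiles are invented for illustration:

```python
def outranks(a, profile, weights, lam=0.7):
    """Simplified concordance test: `a` outranks the profile if the weighted
    share of criteria on which it is at least as good reaches lambda."""
    conc = sum(w for av, pv, w in zip(a, profile, weights) if av >= pv)
    return conc / sum(weights) >= lam

def assign(a, profiles, categories, weights):
    """Pessimistic assignment: walk profiles from best to worst and place
    the alternative in the first category whose lower profile it outranks."""
    for profile, cat in zip(profiles, categories[:-1]):
        if outranks(a, profile, weights):
            return cat
    return categories[-1]

# Hypothetical land-unit criteria: (soil depth, drainage, slope suitability)
weights = [3, 2, 1]
profiles = [(80, 70, 60), (50, 40, 30)]   # category lower bounds, best first
categories = ["highly suitable", "moderately suitable", "not suitable"]

print(assign((85, 75, 40), profiles, categories, weights))  # → highly suitable
```

Run per map cell on the criterion layers stored in the GIS, an assignment rule of this shape is what turns the raw spatial data into an ordered suitability map.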
Procedia PDF Downloads 638
678 Weakly Solving Kalah Game Using Artificial Intelligence and Game Theory
Authors: Hiba El Assibi
Abstract:
This study aims to weakly solve Kalah, a two-player board game, by developing a start-to-finish winning strategy using an optimized Minimax algorithm with Alpha-Beta Pruning. In weakly solving Kalah, our focus is on creating an optimal strategy from the game's beginning rather than analyzing every possible position. The project will explore additional enhancements like symmetry checking and code optimizations to speed up the decision-making process. This approach is expected to give insights into efficient strategy formulation in board games and potentially help create games with a fair distribution of outcomes. Furthermore, this research provides a unique perspective on human versus Artificial Intelligence decision-making in strategic games. By comparing the AI-generated optimal moves with human choices, we can explore how seemingly advantageous moves can, in the long run, be harmful, thereby offering a deeper understanding of strategic thinking and foresight in games. Moreover, this paper discusses the evaluation of our strategy against existing methods, providing insights on performance and computational efficiency. We also discuss the scalability of our approach to the game, considering different board sizes (number of pits and stones) and rules (different variations) and studying how that affects performance and complexity. The findings have potential implications for the development of AI applications in strategic game planning, enhancing our understanding of human cognitive processes in game settings, and offer insights into creating balanced and engaging game experiences.Keywords: minimax, alpha beta pruning, transposition tables, weakly solving, game theory
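The Minimax-with-Alpha-Beta core is standard and can be shown on a toy game tree rather than the full Kalah state space (sowing rules, extra turns and captures are omitted here):

```python
def alphabeta(node, depth, alpha, beta, maximizing):
    """Minimax with alpha-beta pruning on a toy game tree.
    Leaves are numeric scores; internal nodes are lists of children."""
    if depth == 0 or not isinstance(node, list):
        return node
    if maximizing:
        best = float("-inf")
        for child in node:
            best = max(best, alphabeta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, best)
            if alpha >= beta:
                break      # beta cutoff: the opponent will avoid this branch
        return best
    best = float("inf")
    for child in node:
        best = min(best, alphabeta(child, depth - 1, alpha, beta, True))
        beta = min(beta, best)
        if alpha >= beta:
            break          # alpha cutoff
    return best

# Classic textbook tree: the maximizer can guarantee a score of 3
tree = [[3, 5], [2, 9], [0, 1]]
print(alphabeta(tree, 2, float("-inf"), float("inf"), True))  # → 3
```

In the second and third subtrees the cutoffs fire after a single leaf, which is the pruning that — together with symmetry checking and transposition tables — makes searching Kalah from the opening position tractable.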
Procedia PDF Downloads 55