Search results for: Artificial intelligence (AI) techniques
91 Relocation of Livestock in Rural Areas of Canakkale Province Using Remote Sensing and GIS
Authors: Melis Inalpulat, Levent Genc, Unal Kizil, Tugce Civelek
Abstract:
Livestock production is one of the most important components of the rural economy. Due to urban expansion, rural areas close to expanding cities are transformed into urban districts over time. However, legislation places restrictions on livestock farming in such administrative units, since these operations tend to create environmental concerns such as odor problems resulting from excessive manure production. Therefore, existing animal operations should be moved away from settlement areas. This paper focuses on the determination of suitable land for livestock production in Canakkale province of Turkey using remote sensing (RS) data and GIS techniques. To achieve this goal, Formosat-2 and Landsat 8 imagery, the ASTER DEM, 1:25,000 scale soil maps, village boundaries, and village livestock inventory records were used. The study was conducted using a suitability analysis, which evaluates the land in terms of limitations and potentials, and the suitability range was categorized as Suitable (S) and Non-Suitable (NS). Limitations included the distances from main roads and crossroads, water resources and settlements, while potentials were appropriate values for slope, land use capability and land use/land cover (LULC) status. Village-based S land distribution results were presented and compared with livestock inventories. Results showed that approximately 44,230 ha are inappropriate because of the distance limitations for roads, etc. (NS). Moreover, according to the LULC map, 71,052 ha consist of forests, olive and other orchards, and thus may not be suitable for building such structures (NS). In comparison, it was found that there is a total of 1,228 ha of S land within the study area. The village-based findings indicated that in some villages livestock production continues on NS areas. Finally, it was suggested that organized livestock zones may be constructed to serve more than one village, after detailed analyses that also consider political decisions, the opinions of local people, etc.
Keywords: GIS, livestock, LULC, remote sensing, suitable lands.
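The suitability analysis summarised in this abstract is, at its core, a cell-by-cell overlay of limitation and potential criteria on co-registered rasters. The sketch below illustrates that logic in Python with numpy; the distance thresholds, slope limit and land-cover codes are hypothetical placeholders, not the values used in the study.

```python
# Illustrative raster-overlay suitability test; all thresholds and class codes are assumptions.
import numpy as np

FOREST, OLIVE, ORCHARD = 1, 2, 3   # hypothetical LULC class codes

def suitability_mask(dist_road, dist_water, dist_settlement, slope, luc_class, lulc):
    # Limitations: minimum buffer distances (metres) from roads, water resources and settlements
    limits_ok = (dist_road > 500) & (dist_water > 300) & (dist_settlement > 1000)
    # Potentials: gentle slope (%), adequate land-use capability class, non-forest/non-orchard cover
    potentials_ok = (slope < 12) & (luc_class <= 4) & ~np.isin(lulc, [FOREST, OLIVE, ORCHARD])
    return np.where(limits_ok & potentials_ok, 1, 0)   # 1 = Suitable (S), 0 = Non-Suitable (NS)
```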
90 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome
Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco
Abstract:
Today, a consistent segment of the world's population lives in urban areas, and this proportion will increase vastly in the next decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the daily amount of digital data produced will expand at an exponential rate during the following years. The analysis of various types of data sets and their derived applications has incredible potential across crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analogue techniques of data collection. This paper investigates the prospects of the data science field, which appears to be a formidable resource to assist city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of different new layers of information would definitely enhance planners' capabilities to comprehend more in-depth urban phenomena such as gentrification, land use definition, mobility, or critical infrastructural issues. Specifically, the research results correlate economic, commercial, demographic, and housing data with the purpose of defining the youth economic discomfort index. The statistical composite index provides insights regarding the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly show that central urban zones are more disadvantaged than peripheral ones. The experimental setup selected the city of Rome as the testing ground of the whole investigation. The methodology applies statistical and spatial analysis to construct a composite index supporting informed data-driven decisions for urban planning.
Keywords: Data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index.
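A statistical composite index of the kind described above is typically built by standardising each indicator and aggregating the standardised values. The sketch below shows that step in Python; the per-zone indicator names, values and equal weights are invented placeholders, not the variables actually used for Rome.

```python
# Minimal composite-index sketch: z-score standardisation plus weighted aggregation.
import pandas as pd

def composite_index(df, indicators, weights=None):
    z = (df[indicators] - df[indicators].mean()) / df[indicators].std()   # z-scores per indicator
    if weights is None:
        weights = {k: 1.0 / len(indicators) for k in indicators}          # equal weights by default
    return sum(z[k] * w for k, w in weights.items())

zones = pd.DataFrame({
    "youth_unemployment": [0.21, 0.34, 0.18],   # hypothetical per-zone indicators
    "rent_to_income":     [0.45, 0.62, 0.38],
    "neet_rate":          [0.15, 0.25, 0.12],
})
zones["discomfort_index"] = composite_index(zones, ["youth_unemployment", "rent_to_income", "neet_rate"])
print(zones)
```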
89 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator
Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty
Abstract:
Verification and Validation of the simulated process model is the most important phase of the simulator life cycle. Evaluation of simulated process models based on Verification and Validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The process of Verification and Validation helps in qualifying the process simulator for its intended purpose, whether that is providing comprehensive training or design verification. In general, model verification is carried out by comparison of simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A Full Scope Replica Operator Training Simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, wherein the main participants are engineers/experts belonging to the Modeling Team, Process Design and Instrumentation & Control design teams. This paper discusses the Verification and Validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, the reference documents and standards used, etc. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts' comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.
Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), Steady State, Transient State.
88 Comparison between Open and Closed System for Dewatering with Geotextile: Field and Comparative Study
Authors: Matheus Müller, Delma Vidal
Abstract:
The present paper aims to present two dewatering techniques for sludge, analyzing their operation and dewatering processes, with a view to improving the disposal conditions of residues with high liquid content. It describes the field tests performed on two geotextile systems, a closed geotextile tube and an open geotextile drying bed, both of which were submitted to two filling cycles. The sludge used in the filling cycles for the field trials is from the water treatment plant of the Technological Center of Aeronautics – CTA, in São José dos Campos, Brazil. Data on volume and height abatement due to dewatering and consolidation were collected over time, until constancy was observed. With the laboratory analysis of the sludge combined with the data collected in the field, it was possible to perform a critical comparative study between the observations and the scientific literature; accordingly, this paper presents the data obtained and compares them with the bibliography. The tests were carried out on three fronts: field tests, including the filling cycles of the systems with the sludge from CTA, taking measurements of filling time per cycle, maximum filling height per cycle, and heights against the abatement by dewatering of the systems over time; laboratory tests, including the characterization of the sludge and the removal of material samples from the systems to ascertain the solids content within the systems over time; and comparison of the data obtained in the field and laboratory tests with the scientific literature. Through the study, it was possible to perceive that the densification of the material inside a closed system, such as the geotextile tube, occurs faster than that observed in the drying bed system. This accelerated densification can be brought about by the pumping pressure of the sludge during filling and by the confinement of the residue by the permeable geotextile membrane (which allows water to pass through), accelerating densification and dewatering under its own weight after filling with sludge.
Keywords: Consolidation, dewatering, geotextile drying bed, geotextile tube.
87 Sliding Mode Power System Stabilizer for Synchronous Generator Stability Improvement
Authors: J. Ritonja, R. Brezovnik, M. Petrun, B. Polajžer
Abstract:
Many modern synchronous generators in power systems are extremely weakly damped. The reasons are cost optimization of the machine construction and the introduction of additional control equipment into power systems. Oscillations of synchronous generators and the related stability problems of power systems are harmful and can lead to failures in operation and to damage. The only useful solution to increase the damping of the unwanted oscillations is the implementation of power system stabilizers. Power system stabilizers generate an additional control signal which changes the synchronous generator field excitation voltage. Modern power system stabilizers are integrated into the static excitation systems of synchronous generators. Available commercial power system stabilizers are based on linear control theory. Due to the nonlinear dynamics of the synchronous generator, current stabilizers do not assure optimal damping of the synchronous generator's oscillations in the entire operating range. For that reason, the use of robust power system stabilizers which are suitable for the entire operating range is reasonable. There are numerous robust techniques applicable to power system stabilizers. In this paper, the use of sliding mode control for synchronous generator stability improvement is studied. On the basis of sliding mode theory, a robust power system stabilizer was developed. The main advantages of the sliding mode controller are the simple realization of the control algorithm, robustness to parameter variations and the elimination of disturbances. The advantage of the proposed sliding mode controller over a conventional linear controller was tested for damping of the synchronous generator oscillations in the entire operating range. The obtained results show improved damping in the entire operating range of the synchronous generator and an increase of power system stability. The proposed study contributes to progress in the development of advanced stabilizers, which will replace conventional linear stabilizers and improve the damping of synchronous generators.
Keywords: Control theory, power system stabilizer, robust control, sliding mode control, stability, synchronous generator.
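For readers unfamiliar with sliding mode control, the sketch below gives a deliberately simplified Python illustration of the general idea, not the authors' stabilizer: a sliding surface combining rotor angle and speed deviations, a saturated switching law to limit chattering, and a toy single-machine swing model. All parameters are arbitrary example values.

```python
# Generic sliding-mode stabilising signal on a simplified swing model (illustrative only).
import numpy as np

def sliding_mode_pss(d_delta, d_omega, c=2.0, k=0.2, phi=0.02):
    s = c * d_delta + d_omega                    # sliding surface from angle/speed deviations
    return -k * np.clip(s / phi, -1.0, 1.0)      # boundary-layer (saturated) switching law

# toy simulation: weakly damped oscillation with the stabilising signal added
H, D, Ks = 4.0, 0.5, 1.2                          # inertia, damping, synchronising coefficients (p.u.)
dt, omega, delta = 0.01, 0.0, 0.1                 # initial rotor speed/angle deviations
for _ in range(2000):
    u = sliding_mode_pss(delta, omega)
    domega = (-D * omega - Ks * delta + u) / (2 * H)
    omega += domega * dt
    delta += omega * dt
print(f"final angle deviation: {delta:.4f}")
```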
86 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models
Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu
Abstract:
A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping and disaster management. To express the Earth's surface as a mathematical model, an infinite number of point measurements would be needed. Because this is impossible, points at regular intervals are measured to characterize the Earth's surface, and a DTM of the Earth is generated. Hitherto, classical measurement techniques and photogrammetry have been in widespread use for the construction of DTMs. At present, RADAR, LiDAR and stereo satellite images are also used for the construction of DTMs. In recent years, especially because of its advantages, Airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications. A 3D point cloud is created with LiDAR technology by obtaining numerous point data. Recently, however, owing to developments in image mapping methods, the use of unmanned aerial vehicles (UAV) for photogrammetric data acquisition has increased DTM generation from image-based point clouds. The accuracy of the DTM depends on various factors such as the data collection method, the distribution of elevation points, the point density, the properties of the surface and the interpolation methods. In this study, a random data reduction method is assessed for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets by using a random algorithm, representing 75, 50, 25 and 5% of the original image-based point cloud data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original image-based point cloud data set is compared with DTMs interpolated from the reduced data sets by the Kriging interpolation method. The results show that the random data reduction method can be used to reduce image-based point cloud datasets to a 50% density level while still maintaining the quality of the DTM.
Keywords: DTM, unmanned aerial vehicle, UAV, random, Kriging.
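The random reduction and comparison workflow described above can be sketched in a few lines of Python. The point cloud below is a random placeholder, and linear interpolation is substituted for the ordinary Kriging used in the study only to keep the example dependency-free; the subsampling percentages mirror those reported.

```python
# Sketch: thin an (x, y, z) point cloud to fixed percentages and compare re-interpolated DTMs.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
points = rng.random((100_000, 3))                       # placeholder image-based point cloud (x, y, z)
gx, gy = np.mgrid[0:1:200j, 0:1:200j]                   # target DTM grid
reference = griddata(points[:, :2], points[:, 2], (gx, gy), method="linear")  # full-density DTM

for pct in (75, 50, 25, 5):
    idx = rng.choice(len(points), size=int(len(points) * pct / 100), replace=False)
    sub = points[idx]
    dtm = griddata(sub[:, :2], sub[:, 2], (gx, gy), method="linear")
    rmse = np.sqrt(np.nanmean((dtm - reference) ** 2))
    print(f"{pct:>3d}% of points -> RMSE vs. full-density DTM: {rmse:.4f}")
```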
85 Optimization of Shale Gas Production by Advanced Hydraulic Fracturing
Authors: Fazl Ullah, Rahmat Ullah
Abstract:
This paper presents a comprehensive study focused on the optimization of gas production in shale gas reservoirs through hydraulic fracturing. Shale gas has emerged as an important unconventional energy resource, necessitating innovative techniques to enhance its extraction. The key objective of this study is to examine the influence of fracture parameters on reservoir productivity and to formulate strategies for production optimization. A sophisticated model integrating gas flow dynamics and real stress considerations is developed for hydraulic fracturing in multi-stage shale gas reservoirs. This model encompasses distinct zones: a single-porosity medium region, a dual-porosity average region, and a hydraulic fracture region. The apparent permeability of the matrix and fracture system is modeled using principles such as effective stress mechanics, porous elastic medium theory, fractal dimension evolution, and fluid transport mechanisms. The developed model is then validated using field data from the Barnett and Marcellus formations, enhancing its reliability and accuracy. By solving the partial differential equations by means of COMSOL software, the research yields valuable insights into optimal fracture parameters. The findings reveal the influence of fracture length, diversion capacity, and width on gas production. For reservoirs with higher permeability, extending hydraulic fracture lengths proves beneficial, while complex fracture geometries offer potential for low-permeability reservoirs. Overall, this study contributes to a deeper understanding of hydraulic fracturing dynamics in shale gas reservoirs and provides essential guidance for optimizing gas production. The research findings are instrumental for energy industry professionals, researchers, and policymakers alike, shaping the future of sustainable energy extraction from unconventional resources.
Keywords: Fluid-solid coupling, apparent permeability, shale gas reservoir, fracture property, numerical simulation.
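To make the notion of "apparent permeability" more concrete, the sketch below gives a generic, purely illustrative Python relation of the kind such models use: permeability declining exponentially with effective stress plus a slippage correction for gas flow in tight pores. The functional form and all coefficients are assumptions for illustration, not the relations fitted to the Barnett or Marcellus data.

```python
# Generic stress-sensitive apparent-permeability sketch; coefficients are illustrative only.
import numpy as np

def apparent_permeability(k0, sigma_eff_mpa, p_pa, alpha=0.05, b=0.5e6):
    k_stress = k0 * np.exp(-alpha * sigma_eff_mpa)   # effective-stress sensitivity
    return k_stress * (1.0 + b / p_pa)               # Klinkenberg-type slippage correction

print(apparent_permeability(k0=1e-19, sigma_eff_mpa=20.0, p_pa=15e6))
```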
84 A Study on the Application of Machine Learning and Deep Learning Techniques for Skin Cancer Detection
Authors: Hritwik Ghosh, Irfan Sadiq Rahat, Sachi Nandan Mohanty, J. V. R. Ravindra, Abdus Sobur
Abstract:
In the rapidly evolving landscape of medical diagnostics, the early detection and accurate classification of skin cancer remain paramount for effective treatment outcomes. This research delves into the transformative potential of artificial intelligence (AI), specifically deep learning (DL), as a tool for discerning and categorizing various skin conditions. Utilizing a diverse dataset of 3,000 images, representing nine distinct skin conditions, we confront the inherent challenge of class imbalance. This imbalance, where conditions like melanomas are over-represented, is addressed by incorporating class weights during the model training phase, ensuring an equitable representation of all conditions in the learning process. Our approach presents a hybrid model, amalgamating the strengths of two renowned convolutional neural networks (CNNs), VGG16 and ResNet50. These networks, pre-trained on the ImageNet dataset, are adept at extracting intricate features from images. By synergizing these models, our research aims to capture a holistic set of features, thereby bolstering classification performance. Preliminary findings underscore the hybrid model's superiority over individual models, showcasing its prowess in feature extraction and classification. Moreover, the research emphasizes the significance of rigorous data pre-processing, including image resizing, color normalization, and segmentation, in ensuring data quality and model reliability. In essence, this study illuminates the promising role of AI and DL in revolutionizing skin cancer diagnostics, offering insights into its potential applications in broader medical domains.
Keywords: Artificial intelligence, machine learning, deep learning, skin cancer, dermatology, convolutional neural networks, image classification, computer vision, healthcare technology, cancer detection, medical imaging.
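The hybrid feature-extraction idea described above can be sketched with TensorFlow/Keras: ImageNet-pretrained VGG16 and ResNet50 backbones share one input, their pooled features are concatenated, and class weights counteract the class imbalance. This is a minimal sketch under assumptions; the dataset objects, the nine-class label set, the head architecture and the weight values are placeholders, and the per-backbone preprocessing is omitted.

```python
# Minimal hybrid VGG16 + ResNet50 classifier sketch with class weighting (placeholders throughout).
import tensorflow as tf

inp = tf.keras.Input(shape=(224, 224, 3))
vgg = tf.keras.applications.VGG16(include_top=False, weights="imagenet", input_tensor=inp)
res = tf.keras.applications.ResNet50(include_top=False, weights="imagenet", input_tensor=inp)
for backbone in (vgg, res):
    backbone.trainable = False                         # freeze the pretrained feature extractors

features = tf.keras.layers.Concatenate()([
    tf.keras.layers.GlobalAveragePooling2D()(vgg.output),
    tf.keras.layers.GlobalAveragePooling2D()(res.output),
])
x = tf.keras.layers.Dense(256, activation="relu")(features)
out = tf.keras.layers.Dense(9, activation="softmax")(x)    # nine skin conditions

model = tf.keras.Model(inp, out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# Class weights give under-represented conditions more influence during training, e.g.:
# model.fit(train_ds, validation_data=val_ds, epochs=20,
#           class_weight={i: w for i, w in enumerate(computed_weights)})   # computed_weights: placeholder
```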
83 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks
Authors: Wang Yichen, Haruka Yamashita
Abstract:
In recent years, in the field of sports, decision making, such as selecting the members who play in a game and choosing the game strategy, based on the analysis of accumulated sports data has been widely attempted. In fact, in the NBA basketball league, where the world's highest-level players gather, teams analyze the data using various statistical techniques in order to win games. However, it is difficult to analyze the game data for each play, such as the ball tracking or the motion of the players, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is proposed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, which is considered to be difficult for all coaches. Because replacing the entire lineup is too complicated, the actual questions for player replacement are whether or not the lineup should be changed and whether or not a Small Ball lineup should be adopted. Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, which indicates a player's contribution to the game, and these scoring data can be considered as time series data. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected all the accumulated NBA data from the 2019-2020 season. We then apply the method to actual basketball play data to verify the reliability of the proposed model.
Keywords: Recurrent Neural Network, players lineup, basketball data, decision making model.
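The two-branch architecture described above can be sketched with TensorFlow/Keras: an LSTM branch reads the play-by-play scoring sequence and a dense branch reads the current game-situation features, and their outputs are merged to predict the expected score under a candidate lineup. The input shapes, feature names and layer sizes are assumptions for illustration, not the authors' exact configuration.

```python
# Sketch of an RNN (sequence) branch combined with an NN (situation) branch for score prediction.
import tensorflow as tf

seq_in = tf.keras.Input(shape=(50, 8), name="play_sequence")      # e.g. last 50 plays, 8 features each
ctx_in = tf.keras.Input(shape=(12,), name="game_situation")       # e.g. lineup/situation descriptors

h_seq = tf.keras.layers.LSTM(64)(seq_in)                           # recurrent branch for time series data
h_ctx = tf.keras.layers.Dense(32, activation="relu")(ctx_in)       # feed-forward branch for the situation

merged = tf.keras.layers.Concatenate()([h_seq, h_ctx])
hidden = tf.keras.layers.Dense(32, activation="relu")(merged)
score = tf.keras.layers.Dense(1, name="predicted_score")(hidden)

model = tf.keras.Model([seq_in, ctx_in], score)
model.compile(optimizer="adam", loss="mse")
# model.fit([X_seq, X_ctx], y_score, ...)   # X_seq, X_ctx, y_score: placeholder training arrays
```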
82 Prevention of Corruption in Public Purchases
Authors: Anatoly Krivinsh
Abstract:
The results of the dissertation research "Preventing and combating corruption in public procurement" are presented in this publication. The study was conducted from 2011 to 2013 in a Member State of the European Union, the Republic of Latvia. The goal of the thesis is to explore corruption prevention and combating issues in the public procurement sphere, and to identify the prevalence rates, determinants, contributing factors and prevention opportunities in Latvia. In the first chapter, the author analyzes theoretical aspects of understanding corruption in public procurement, with particular emphasis on the problem of defining corruption, its nature, causes and consequences. A separate section is dedicated to the public procurement concept, mechanism and legal framework. In the first part of this work, the author presents a cognitive methodology for corruption in the public procurement field, based on which the author has carried out an analysis of the corruption situation in public procurement in the Republic of Latvia. In the second chapter of the thesis, the author analyzes the problem of corruption in public procurement, including its historical aspects, the typology and classification of the corruption subjects involved, and corruption risk elements in public procurement and their identification. During the development of the second chapter, the author's practical experience in public procurement was widely used. The third and fourth chapters deal with issues related to preventing and combating corruption in public procurement, namely the operational concept, principles, methods, techniques and subjects in the Republic of Latvia, as well as an analysis of foreign experience in preventing and combating corruption. The fifth chapter is devoted to corruption prevention and combating perspectives and their assessment. In this chapter, the author evaluates the efficiency of corruption prevention and combating measures in the Republic of Latvia and assesses the development stage of anti-corruption legislation in the public procurement field in Latvia.
Keywords: Prevention of corruption, public purchases.
81 Screening and Evaluation of in vivo and in vitro Generated Insulin Plant (Vernonia divergens) for Antimicrobial and Anticancer Activities
Authors: Santosh Kumar, Anand Prakash, Kanak Sinha, Anita K Verma
Abstract:
Vernonia divergens Benth., commonly known as the "Insulin Plant" (Fam: Asteraceae), is reputed to be a potent blood-sugar-lowering plant. Locally, the leaves of the plant, boiled in water, are successfully administered to a large number of diabetic patients. The present study evaluates the putative anti-diabetic ingredients, isolated from the in vivo and in vitro grown plantlets of V. divergens, for their antimicrobial and anticancer activities. Sterilized explants of nodal segments were cultured on MS (Murashige and Skoog, 1962) medium in the presence of different combinations of hormones. Multiple shoots along with a bunch of roots were regenerated at 1 mg l-1 BAP and 0.5 mg l-1 NAA. Micro-plantlets were separated and sub-cultured on double strength (2X) of the above combination of hormones, leading to increased length of roots and shoots. These plantlets were successfully transferred to soil and survived well in nature. The ethanol extracts of plantlets from both in vivo and in vitro sources were prepared in a Soxhlet extractor and then concentrated to dryness under reduced pressure in a rotary evaporator. The concentrated extracts thus obtained showed significant inhibitory activity against gram-negative bacteria such as Escherichia coli and Pseudomonas aeruginosa, but no inhibition was found against gram-positive bacteria. Further, these ethanol extracts were screened for in vitro percentage cytotoxicity at different time periods (24 h, 48 h and 72 h) and at different dilutions. The in vivo plant extract inhibited the growth of EAC mouse cell lines in the range of 65, 66, 78 and 88% at 100, 50, 25 and 12.5 μg mL-1, but only at 72 h of treatment. In the case of the extract of in vitro origin, inhibition was found against EAC cell lines even at 48 h. During spectrophotometric scanning, the extracts exhibited different maxima (λ): four peaks in the in vitro extracts as against a single peak in the in vivo preparation, suggesting a possible change in the nature of the ingredients during micropropagation through tissue culture techniques.
Keywords: Anti-cancer, Anti-microbial, EAC mouse cell, Tissue culture, Vernonia divergens.
80 Automated Transformation of 3D Point Cloud to Building Information Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Petar Penchev
Abstract:
The digital era has revolutionized architectural practices, with Building Information Modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research presents a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data — a collection of data points in space, typically produced by 3D scanners — into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historical preservation.
Keywords: Algorithmic modeling, Building Information Modeling, point cloud, reconstruction.
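One recurring building block in point-cloud-to-BIM pipelines of this kind is plane extraction, used to pull walls, floors and ceilings out of the scan before they are classified and converted into parametric BIM elements. The sketch below shows a self-contained RANSAC plane fit in Python; the thresholds are illustrative, and the downstream steps (element classification, IFC/BIM export) are intentionally omitted.

```python
# Minimal RANSAC plane-fitting sketch for extracting planar elements from a point cloud.
import numpy as np

def ransac_plane(points, n_iter=500, dist_thresh=0.02, seed=0):
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = np.array([], dtype=int), None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue                                  # degenerate (collinear) sample
        normal /= norm
        dist = np.abs((points - p0) @ normal)         # point-to-plane distances
        inliers = np.where(dist < dist_thresh)[0]
        if len(inliers) > len(best_inliers):
            best_inliers, best_plane = inliers, (normal, p0)
    return best_plane, best_inliers

cloud = np.random.default_rng(1).random((5000, 3)) * [5, 5, 0.02]   # placeholder near-planar cloud
plane, inliers = ransac_plane(cloud)
print(f"inliers on dominant plane: {len(inliers)}")
```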
79 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications
Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso
Abstract:
The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO RADAR sensor, an optical camera and a dedicated set of software algorithms encompassing interferometry, tomography and photogrammetry. The MIMO radar sensor proposed in this work provides extremely high sensitivity to displacements, making the system able to react to tiny deformations (of the order of tens of microns) on time scales which span from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped onto the azimuth-range directions with notable resolution in both dimensions and with an outstanding repetition rate. The back-scattered energy, which is distributed in 3D space, is projected onto a 2D plane, where each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performing processing unit allows the observed scene to be sensed with remarkable refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped onto a 2D digital optical image through photogrammetric techniques, allowing easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user of the system with high-frequency three-dimensional motion/vibration estimation of each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
Keywords: Interferometry, MIMO RADAR, SAR, tomography.
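The repeat-pass interferometric step mentioned above boils down to scaling the unwrapped phase difference between acquisitions into line-of-sight displacement. The Python sketch below illustrates that conversion; the carrier frequency is an assumed example value and the atmospheric correction the text refers to is omitted.

```python
# Sketch: convert a per-pixel interferometric phase history into line-of-sight displacement.
import numpy as np

C = 3e8
f0 = 17.2e9                                   # assumed radar carrier frequency (Hz)
wavelength = C / f0

def los_displacement(phase_history):
    # phase_history: array of shape (n_acquisitions, ...) with interferometric phase in radians
    unwrapped = np.unwrap(phase_history, axis=0)
    dphi = unwrapped - unwrapped[0]            # phase change relative to the first acquisition
    return -wavelength / (4 * np.pi) * dphi    # metres along the line of sight

phases = np.cumsum(np.full(10, 0.3))           # placeholder phase history for one pixel
print(los_displacement(phases)[-1])
```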
78 Experimental Investigation of the Impact of Biosurfactants on Residual-Oil Recovery
Authors: S. V. Ukwungwu, A. J. Abbas, G. G. Nasr
Abstract:
The increasing price of natural gas and oil, with the attendant increase in energy demand on world markets in recent years, has stimulated interest in recovering residual oil saturation across the globe. In order to meet energy security needs, efforts have been made to develop new technologies for enhancing the recovery of oil and gas, utilizing techniques such as CO2 flooding, water injection, hydraulic fracturing and surfactant flooding. Surfactant flooding optimizes production but poses a risk to the environment due to the toxic nature of the surfactants. Building on previous work that has utilized other types of bacteria to produce biosurfactants for enhancing oil recovery, this research uses a technique combining biosurfactants to achieve enhanced oil recovery (EOR) by lowering the interfacial tension and contact angle. In this study, three biosurfactants were produced from three Bacillus species from freeze-dried cultures using sucrose 3% (w/v) as the carbon source. Two of the produced biosurfactants were screened with the TEMCO Pendant Drop Image Analysis system for reduction in IFT and contact angle. Interfacial tension was greatly reduced, from 56.95 mN.m-1 to 1.41 mN.m-1, when biosurfactants in a cell-free culture of Bacillus licheniformis were used, compared to 4.83 mN.m-1 for the cell-free culture of Bacillus subtilis. As a result, the cell-free culture of Bacillus licheniformis changed the wettability in the contact angle measurement to more water-wet, as the angle decreased from 130.75° to 65.17°. The influence of microbial treatment on crushed rock samples was also observed by qualitative wettability experiments. Samples treated with biosurfactants remained in the aqueous phase, indicating a water-wet system. These results demonstrate that biosurfactants can effectively change the wetting conditions of diverse surfaces, providing a desirable condition for efficient oil transport and in this way serving as a mechanism for EOR. The environmentally friendly nature of biosurfactant applications for industrial purposes offers important advantages over chemically synthesized surfactants, including various possible structures, low toxicity, eco-friendliness and biodegradability.
Keywords: Bacillus, biosurfactant, enhanced oil recovery, residual oil, wettability.
77 Analysis of Structural and Photocatalytical Properties of Anatase, Rutile and Mixed Phase TiO2 Films Deposited by Pulsed-Direct Current and Radio Frequency Magnetron Co-Sputtering
Authors: S. Varnagiris, M. Urbonavicius, S. Tuckute, M. Lelis, K. Bockute
Abstract:
Amongst the many water purification techniques, TiO2 photocatalysis is recognized as one of the most promising sustainable methods. It is known that for photocatalytic applications anatase is the most suitable TiO2 phase; however, a heterojunction of anatase/rutile phases could improve the photocatalytic activity of TiO2 even further. Despite the relative simplicity of TiO2, different synthesis methods lead to highly dispersed crystal phases and photocatalytic activities of the corresponding samples. Accordingly, suggestions and investigations of various innovative methods of TiO2 synthesis are still needed. In this work, the structural and photocatalytic properties of TiO2 films deposited by the unconventional method of simultaneous co-sputtering from two magnetrons powered by pulsed-Direct Current (pDC) and Radio Frequency (RF) power sources with negative bias voltage have been studied. More specifically, the TiO2 film thickness, microstructure, surface roughness, crystal structure, optical transmittance and photocatalytic properties were investigated by profilometer, scanning electron microscope, atomic force microscope, X-ray diffractometer and UV-Vis spectrophotometer, respectively. The proposed unconventional two-magnetron co-sputtering based TiO2 film formation method showed very promising results for crystalline TiO2 film formation while keeping process temperatures below 100 °C. XRD analysis revealed that, by using the proper combination of power source type and bias voltage, various TiO2 phases (amorphous, anatase, rutile or their mixtures) can be synthesized selectively. Moreover, a strong dependency between power source type and surface roughness, as well as between the bias voltage and the band gap value of the TiO2 films, was observed. Interestingly, TiO2 films deposited by two-magnetron co-sputtering without bias voltage had one of the highest band gap values among the investigated films, but their photocatalytic activity was superior to that of all the other samples. It is suggested that this is due to the dominating nanocrystalline anatase phase with various exposed surfaces, including the photocatalytically most active {001}.
Keywords: Films, magnetron co-sputtering, photocatalysis, TiO2.
76 Reinforced Concrete Bridge Deck Condition Assessment Methods Using Ground Penetrating Radar and Infrared Thermography
Authors: Nicole M. Martino
Abstract:
Reinforced concrete bridge deck condition assessments primarily use visual inspection methods, where an inspector looks for and records the locations of cracks, potholes, efflorescence and other signs of probable deterioration. Sounding is another technique used to diagnose the condition of a bridge deck; however, this method listens for damage within the subsurface as the surface is struck with a hammer or chain. Even though extensive procedures are in place for using these inspection techniques, neither one provides the inspector with a comprehensive understanding of the internal condition of a bridge deck – the location where damage originates. In order to make accurate estimates of repair locations and quantities, in addition to allocating the necessary funding, a complete understanding of the deck's deteriorated state is key. The research presented in this paper collected infrared thermography and ground penetrating radar data from reinforced concrete bridge decks without an asphalt overlay. These decks were of various ages, and their condition varied from brand new to in need of replacement. The goals of this work were first to verify that these nondestructive evaluation methods could identify similar areas of healthy and damaged concrete, and then to see whether combining the results of both methods would provide higher confidence than a condition assessment completed using only one method. The results from each method were presented as plan-view color contour plots. The results from one of the decks assessed as part of this research, including these plan-view plots, are presented in this paper. Furthermore, in order to answer the interest of transportation agencies throughout the United States, this research developed a step-by-step guide which demonstrates how to collect and assess a bridge deck using these nondestructive evaluation methods. This guide addresses setup procedures on the deck during the day of data collection, system setups and settings for different bridge decks, data post-processing for each method, and data visualization and quantification.
Keywords: Bridge deck deterioration, ground penetrating radar, infrared thermography, NDT of bridge decks.
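A simple way to combine two plan-view condition maps of the kind described above is to rescale each to a common range and flag the grid cells where both methods indicate deterioration. The Python sketch below illustrates this fusion idea; the 0.5 thresholds and the interpretation of each map are placeholders, not the study's calibrated criteria.

```python
# Illustrative fusion of co-registered GPR and infrared-thermography condition grids.
import numpy as np

def normalise(grid):
    g = np.asarray(grid, dtype=float)
    return (g - np.nanmin(g)) / (np.nanmax(g) - np.nanmin(g))

def fused_damage_map(gpr_attenuation, ir_temperature_contrast, thresh=0.5):
    gpr_score = normalise(gpr_attenuation)            # higher attenuation -> likelier rebar corrosion
    ir_score = normalise(ir_temperature_contrast)     # higher contrast -> likelier delamination
    return (gpr_score > thresh) & (ir_score > thresh) # True where both methods agree on damage

gpr = np.random.default_rng(0).random((20, 40))       # placeholder plan-view grids
ir = np.random.default_rng(1).random((20, 40))
print(fused_damage_map(gpr, ir).sum(), "cells flagged by both methods")
```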
75 An AI-Generated Semantic Communication Platform in Human-Computer Interaction Course
Authors: Yi Yang, Jiasong Sun
Abstract:
Almost every aspect of our daily lives is now intertwined with some degree of Human-Computer Interaction (HCI). HCI courses draw on knowledge from disciplines as diverse as computer science, psychology, design principles, anthropology and more. The HCI course in the Department of Electronics at Tsinghua University, known as the Media and Cognition course, is constantly updated to reflect the most advanced technological developments, such as virtual reality, augmented reality and artificial intelligence-based interaction. For more than a decade, this course has used an interest-based approach to teaching, in which students proactively propose research-based questions and collaborate with teachers, using course knowledge to explore potential solutions. Semantic communication plays a key role in facilitating understanding and interaction between users and computer systems, ultimately enhancing system usability and user experience. Advances in AI-generated content technology, which have gained significant attention from both academia and industry in recent years, are exemplified by language models like GPT-3 that generate human-like dialogues from given prompts. The latest version of the HCI course puts into practice a semantic communication platform based on AI generative techniques. We explored a student-centered model and proposed an interest-based teaching method. Students are no longer just recipients of knowledge, but become active participants in a learning process driven by personal interests, thereby encouraging students to take responsibility for their own education. One of the latest results of this teaching approach in the Media and Cognition course is a student proposal to develop a semantic communication platform rooted in artificial intelligence generative technologies. The platform addresses a key challenge in communications technology: the ability to preserve visual signals. The interest-based approach emphasizes personal curiosity and active participation, and the proposal of an artificial intelligence-generated semantic communication platform is an example and successful result of how students can exert greater creativity when they have the power to control their own learning.
Keywords: Human-computer interaction, media and cognition course, semantic communication, retain ability, prompts.
74 On-Line Geometrical Identification of Reconfigurable Machine Tool using Virtual Machining
Authors: Alexandru Epureanu, Virgil Teodor
Abstract:
One of the main research directions in the CAD/CAM machining area is the reduction of machining time. Feedrate scheduling is one of the advanced techniques that keeps the uncut chip area constant and, as a consequence, keeps the main cutting force constant. There are two main approaches to feedrate optimization. The first consists of cutting force monitoring, which presumes the use of complex equipment for force measurement and, after this, setting the feedrate according to the cutting force variation. The second is to optimize the feedrate by keeping the material removal rate constant with respect to the cutting conditions. In this paper, a new approach is proposed using an extended database that replaces the system model. The feedrate schedule is determined based on the identification of the reconfigurable machine tool, and the feed value is determined from the uncut chip cross-section area, the contact length between tool and blank, and the geometrical roughness. The first stage consists of monitoring the blank and the tool to determine their actual profiles. The next stage is the determination of the programmed tool path that allows the target profile of the piece to be obtained. The graphic representation environment models the tool and blank regions and, after this, the tool model is positioned relative to the blank model according to the programmed tool path. For each of these positions, the geometrical roughness value, the uncut chip area and the contact length between tool and blank are calculated. Each of these parameters is compared with the admissible values and, according to the result, the feed value is established. We can consider that this approach has the following advantages: in the case of complex cutting processes, the prediction of the cutting force is possible; the real cutting profile, which has deviations from the theoretical profile, is considered; the blank-tool contact length can be limited; and the programmed tool path can be corrected so that the target profile is obtained. Applying this method, data sets are obtained which allow the feedrate to be scheduled so that the uncut chip area is constant and, as a result, the cutting force is constant, which allows the machine tool to be used more efficiently and the machining time to be reduced.
Keywords: Reconfigurable machine tool, system identification, uncut chip area, cutting conditions scheduling.
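The per-position scheduling rule described above can be sketched as a simple decision function: the feed is scaled so the uncut chip area stays near a constant target, and it is backed off whenever the contact length, roughness or chip area exceeds its admissible value. The Python sketch below is illustrative only; all limit values and units are placeholders.

```python
# Sketch of a feed-per-position scheduling rule based on uncut chip area, contact length and roughness.
A_TARGET, A_MAX, L_MAX, RA_MAX = 0.8, 1.0, 4.0, 1.6     # mm^2, mm^2, mm, micrometres (placeholders)
F_MIN, F_MAX = 0.05, 0.4                                 # admissible feed range (mm/rev)

def scheduled_feed(f_programmed, chip_area, contact_len, roughness):
    f = f_programmed * A_TARGET / max(chip_area, 1e-9)   # keep the uncut chip area (and force) near constant
    if contact_len > L_MAX or roughness > RA_MAX or chip_area > A_MAX:
        f *= 0.5                                          # back off when any admissible value is exceeded
    return min(max(f, F_MIN), F_MAX)

print(scheduled_feed(f_programmed=0.2, chip_area=1.1, contact_len=3.2, roughness=1.2))
```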
73 Microwave-Assisted Alginate Extraction from Portuguese Saccorhiza polyschides – Influence of Acid Pretreatment
Authors: Mário Silva, Filipa Gomes, Filipa Oliveira, Simone Morais, Cristina Delerue-Matos
Abstract:
Brown seaweeds are abundant along the Portuguese coastline and represent an almost unexploited marine economic resource. One of the most common species, easily available for harvesting on the northwest coast, is Saccorhiza polyschides, which grows on the lower shore and coastal rocky reefs. It is almost exclusively used by local farmers as a natural fertilizer, but it contains a substantial amount of valuable compounds, particularly alginates, natural biopolymers of high interest for many industrial applications. Alginates are natural polysaccharides present in the cell walls of brown seaweed; they are highly biocompatible, with particular properties that make them of high interest for the food, biotechnology, cosmetics and pharmaceutical industries. Conventional extraction processes are based on thermal treatment; they are lengthy and consume high amounts of energy and solvents. In recent years, microwave-assisted extraction (MAE) has shown enormous potential to overcome the major drawbacks of conventional (thermal and/or solvent based) plant material extraction techniques, and it has been successfully applied to the extraction of agar, fucoidans and alginates. In the present study, the acid pretreatment of the brown seaweed Saccorhiza polyschides for subsequent microwave-assisted extraction (MAE) of alginate was optimized. The seaweeds were collected in northwest Portuguese coastal waters of the Atlantic Ocean between May and August 2014. Experimental design was used to assess the effect of temperature and acid pretreatment time on alginate extraction. Response surface methodology allowed the determination of the optimum MAE conditions: 40 mL of 0.1 M HCl per g of dried seaweed with constant stirring at 20 °C for 14 h. Optimal acid pretreatment conditions significantly enhanced the MAE of alginates from Saccorhiza polyschides, thus contributing to the development of a viable, more environmentally friendly alternative to conventional processes.
Keywords: Acid pretreatment, Alginate, Brown seaweed, Microwave-assisted extraction, Response surface methodology.
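The response surface methodology step mentioned above amounts to fitting a second-order polynomial to the measured response and searching the fitted surface for the optimum. The Python sketch below illustrates that procedure with scikit-learn; the temperature/time design points and yield values are invented placeholders, not the measured data of the study.

```python
# Sketch of a second-order response-surface fit and optimum search (placeholder data).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

X = np.array([[10, 2], [10, 14], [20, 2], [20, 14], [30, 8], [40, 14]])   # [temperature C, time h]
y = np.array([18.0, 24.5, 22.0, 27.0, 23.5, 21.0])                         # alginate yield (placeholder)

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)
grid = np.array([[t, h] for t in range(10, 41, 2) for h in range(2, 17)])
best = grid[np.argmax(model.predict(grid))]
print("predicted optimum (temperature C, time h):", best)
```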
72 Factors Militating the Organization of Intramural Sport Programs in Secondary Schools: A Case Study of the Ekiti West Local Government Area of Ekiti State, Nigeria
Authors: Adewole Taiwo Adelabu
Abstract:
The study investigated the factors militating against the organization of intramural sports programs in secondary schools in Ekiti State, Nigeria. The purpose of the study was to identify the factors affecting the organization of sports in secondary schools and also to proffer possible solutions to these factors. The study employed the inferential statistics of chi-square (χ2). Five research hypotheses were formulated. The population for the study was all the students in the government-owned secondary schools in Ekiti West Local Government of Ekiti State, Nigeria. The sample for the study was 60 students in three schools within the local government, selected through simple random sampling techniques. The instrument used for the study was a questionnaire developed by the researcher for data collection. The instrument was presented to experts and academics in the field of Human Kinetics and Health Education for construct and content validation. A reliability test was conducted which involved 10 students who were not part of the study. A test-retest coefficient of 0.74 was obtained, which attested to the fact that the instrument was reliable enough for the study. The validated questionnaire was administered to the students in their various schools by the researcher with the help of two research assistants; the questionnaires were completed and returned to the researcher immediately. The data collected were analyzed using the descriptive statistics of frequency count, percentage and mean for the demographic data in section A of the questionnaire, while the inferential statistic of chi-square was used to test the hypotheses at the 0.05 alpha level. The results of the study revealed that personnel, funding and schedule (time) were significant factors affecting the organization of intramural sports programs among students in secondary schools in the Ekiti West Local Government Area of the state. The study also revealed that organizing intramural sports programs for secondary school students will improve and motivate students' participation in sports beyond the local level. However, facilities and equipment were not a significant factor affecting the organization of intramural sports among secondary school students in the Ekiti West Local Government Area.
Keywords: Challenge, militating, intramural sport, programs.
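For reference, the chi-square test of independence used in studies of this kind can be run in a few lines of Python. The contingency table below is a hypothetical example, not the study's actual response counts, and the decision is taken at the 0.05 alpha level mentioned above.

```python
# Chi-square test of independence on a hypothetical 2x3 contingency table.
from scipy.stats import chi2_contingency

observed = [[22, 15, 8],      # e.g. 'agree' responses across three schools (placeholder counts)
            [ 5, 12, 18]]     # 'disagree' responses
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
print("significant at 0.05" if p_value < 0.05 else "not significant at 0.05")
```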
71 The Physiological Impacts of Genital Weightlifting Conditioning: Exploring Iron Crotch Practice for Enhanced Sexual Function, Premature Ejaculation, Penile Dysfunction, Impotence, Hormonal Balance, and Prostate Health
Authors: C. Ardil
Abstract:
This study explores "Iron Crotch Kung Fu," a unique practice involving genital weightlifting. While the practice has historical significance, its potential health benefits, particularly in sexual function and overall well-being, remain largely anecdotal. To bridge the gap between tradition and modern science, this study proposes a modified Iron Crotch training program integrating principles from Pelvic Floor Muscle Training (PFMT). This integrated approach offers a safer and more effective pathway to harness the potential benefits of Iron Crotch, including enhanced sexual function, improved pelvic floor health, and increased core strength. The study delves into the historical context, technical methodologies, and potential physiological impacts of Iron Crotch, while highlighting the importance of careful practice under expert guidance. By integrating historical context, practical techniques, and scientific insights, this study aims to provide a balanced perspective on Iron Crotch and its potential role in modern health and wellness practices.
Keywords: Iron Crotch, iron crotch kung fu, Diao Gung, genital weightlifting, back pain, erectile dysfunction, exercise, exercise therapy, female athletes, hormonal balance, hypertonicity, martial arts, meta-analysis, overactivity, pelvic floor, pelvic floor disorders, pelvic floor muscle dysfunction, pelvic floor muscle training, pelvic floor physical therapy, penile dysfunction, physical health, physical medicine, physiotherapy, premature ejaculation, prostate health, provoked vestibulodynia, resistance training, sexual dysfunction, sexual health, sexual medicine, sexual orientation, systematic review, traditional health practices, urinary incontinence, urodynamics, vaginismus, vestibulodynia, women’s health, PFMT
70 Structural Parsing of Natural Language Text in Tamil Using Phrase Structure Hybrid Language Model
Authors: Selvam M, Natarajan. A M, Thangarajan R
Abstract:
Parsing is important in linguistics and Natural Language Processing for understanding the syntax and semantics of a natural language grammar. Parsing natural language text is challenging because of problems like ambiguity and inefficiency. Also, the interpretation of natural language text depends on context-based techniques. A probabilistic component is essential to resolve ambiguity in both syntax and semantics, thereby increasing the accuracy and efficiency of the parser. The Tamil language has some inherent features which are even more challenging. In order to obtain solutions, a lexicalized and statistical approach is to be applied in parsing with the aid of a language model. Statistical models mainly focus on the semantics of the language and are suitable for large-vocabulary tasks, whereas structural methods focus on syntax and model small-vocabulary tasks. A trigram-based statistical language model for Tamil, with a medium vocabulary of 5,000 words, has been built. Though statistical parsing gives better performance through trigram probabilities and a large vocabulary size, it has some disadvantages, such as a focus on semantics rather than syntax, a lack of support for free word order, and a lack of long-term relationships. To overcome these disadvantages, a structural component is to be incorporated into statistical language models, which leads to the implementation of hybrid language models. This paper attempts to build a phrase-structured hybrid language model which resolves the above-mentioned disadvantages. In the development of the hybrid language model, a new part-of-speech tag set for the Tamil language has been developed, with more than 500 tags providing wider coverage. A phrase-structured treebank has been developed with 326 Tamil sentences, covering more than 5,000 words. A hybrid language model has been trained with the phrase-structured treebank using the immediate head parsing technique. A lexicalized and statistical parser which employs this hybrid language model and the immediate head parsing technique gives better results than a pure grammar-based or trigram-based model.
Keywords: Hybrid Language Model, Immediate Head Parsing, Lexicalized and Statistical Parsing, Natural Language Processing, Parts of Speech, Probabilistic Context Free Grammar, Tamil Language, Tree Bank.
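The trigram component of such a language model estimates the probability of a word given the two preceding words from corpus counts. The Python sketch below shows a minimal maximum-likelihood trigram estimate with add-one smoothing; the two toy sentences stand in for the 5,000-word Tamil vocabulary and treebank and are placeholders only.

```python
# Minimal add-one smoothed trigram language model sketch.
from collections import Counter

corpus = [["<s>", "<s>", "அவன்", "பள்ளிக்கு", "சென்றான்", "</s>"],
          ["<s>", "<s>", "அவள்", "பள்ளிக்கு", "சென்றாள்", "</s>"]]   # placeholder tokenised sentences

trigrams = Counter(tuple(s[i:i + 3]) for s in corpus for i in range(len(s) - 2))
bigrams = Counter(tuple(s[i:i + 2]) for s in corpus for i in range(len(s) - 1))
vocab = {w for s in corpus for w in s}

def p_trigram(w1, w2, w3):
    # add-one smoothed estimate of P(w3 | w1, w2)
    return (trigrams[(w1, w2, w3)] + 1) / (bigrams[(w1, w2)] + len(vocab))

print(p_trigram("<s>", "<s>", "அவன்"))
```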
69 A Risk Assessment Tool for the Contamination of Aflatoxins on Dried Figs based on Machine Learning Algorithms
Authors: Kottaridi Klimentia, Demopoulos Vasilis, Sidiropoulos Anastasios, Ihara Diego, Nikolaidis Vasileios, Antonopoulos Dimitrios
Abstract:
Aflatoxins are highly poisonous and carcinogenic compounds produced by species of the genus Aspergillus that can infect a variety of agricultural foods, including dried figs. Biological and environmental factors, such as the population, pathogenicity and aflatoxigenic capacity of the strains, and the topography, soil and climate parameters of the fig orchards, are believed to have a strong effect on aflatoxin levels. Existing methods for aflatoxin detection and measurement, such as high-performance liquid chromatography (HPLC) and enzyme-linked immunosorbent assay (ELISA), can provide accurate results, but the procedures are usually time-consuming, sample-destructive and expensive. Predicting aflatoxin levels prior to crop harvest is useful for minimizing the health and financial impact of a contaminated crop. Consequently, there is interest in developing a tool that predicts aflatoxin levels based on topography and soil analysis data of fig orchards. This paper describes the development of a risk assessment tool for the contamination of dried figs with aflatoxins, based on the location and altitude of the fig orchards, the population of the fungus Aspergillus spp. in the soil, and soil parameters such as pH, saturation percentage (SP), electrical conductivity (EC), organic matter, particle size analysis (sand, silt, clay), the concentration of the exchangeable cations (Ca, Mg, K, Na), extractable P and trace elements (B, Fe, Mn, Zn and Cu), by employing machine learning methods. In particular, our proposed method integrates three machine learning techniques, i.e., dimensionality reduction on the original dataset (Principal Component Analysis), metric learning (Mahalanobis Metric for Clustering) and the k-nearest neighbors learning algorithm (KNN), into an enhanced model, with mean performance equal to 85% in terms of the Pearson Correlation Coefficient (PCC) between observed and predicted values.
Keywords: Aflatoxins, Aspergillus spp., dried figs, k-nearest neighbors, machine learning, prediction.
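The three-stage pipeline described above can be sketched with scikit-learn: PCA for dimensionality reduction, a Mahalanobis distance in the reduced space (used here as a simple stand-in for the learned clustering metric), and k-nearest-neighbours regression evaluated with the Pearson correlation coefficient. The feature matrix and aflatoxin levels below are random placeholders for the orchard soil/topography dataset.

```python
# Sketch of PCA + Mahalanobis-distance KNN regression with PCC evaluation (placeholder data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsRegressor
from sklearn.preprocessing import StandardScaler
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
X = rng.random((120, 20))                      # soil, topography and Aspergillus population features
y = rng.random(120)                            # measured aflatoxin level (placeholder)

Z = PCA(n_components=5).fit_transform(StandardScaler().fit_transform(X))
VI = np.linalg.inv(np.cov(Z, rowvar=False))    # inverse covariance defining the Mahalanobis metric
knn = KNeighborsRegressor(n_neighbors=5, algorithm="brute",
                          metric="mahalanobis", metric_params={"VI": VI})
knn.fit(Z[:100], y[:100])

pcc, _ = pearsonr(y[100:], knn.predict(Z[100:]))
print(f"Pearson correlation on held-out samples: {pcc:.2f}")
```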
68 Brazilian Constitution and the Fundamental Right to Sanitation
Authors: Michely Vargas Delpupo, José Geraldo Romanello Bueno
Abstract:
The right to basic sanitation was elevated to the category of a fundamental right by the Constitution of 1988 in order to protect the ecologically balanced environment, ensure the social rights to health and adequate housing, and place the dignity of the human person at the foundation of the Brazilian Democratic State. Given its essentiality to humans, this article seeks to understand why universal access to basic sanitation is a goal so difficult to achieve in Brazil. To that end, this research uses the deductive and analytical method. Given the bibliographic nature of the research, the research techniques were centered on specialized books on the subject, journals, theses and dissertations, laws, relevant case law and social indicators relating to the theme. The relevance of the topic stems, among other things, from the fact that sanitation services are essential for a dignified life, i.e., everyone is entitled to have the conditions necessary for existence satisfied. However, the effectiveness of this right is undermined in society, since Brazil has a huge deficit in sanitation services, thus denying a dignified life to most of the population. It can be seen that the provision of water and sewage services in Brazil is still characterized by a large imbalance, since municipalities with a lower population index have a greater deficiency in sanitation services. The truth is that the precariousness of water and sewage services in Brazil is still very concentrated in the North and Northeast regions, limiting the effective implementation of Law 11.445/2007 in the country. Therefore, there is an urgent need for positive action by the State in the provision of sanitation services, in order to prevent and control disease, improve the quality of life and productivity of individuals, and prevent the contamination of water resources. More than just a social and economic necessity, there is a government duty to implement such services. In this sense, given the current scenario, achieving universal access to basic sanitation faces many hurdles. These lie mainly in the field of properly formulated and implemented public policies, i.e., it requires excellent institutional organization, service management, strategic planning and social control in order to provide answers to complex challenges.
Keywords: Fundamental rights, sanitation, universal access.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 163167 Modelling Forest Fire Risk in the Goaso Forest Area of Ghana: Remote Sensing and Geographic Information Systems Approach
Authors: Bernard Kumi-Boateng, Issaka Yakubu
Abstract:
Forest fire, an uncontrolled fire occurring in nature, has become a major concern for the Forestry Commission of Ghana (FCG). Forest fires in Ghana usually result in massive destruction and take a long time for firefighting crews to bring under control. In order to assess the effect of forest fire at the local scale, it is important to consider the role fire plays in vegetation composition, biodiversity, soil erosion, and the hydrological cycle. The occurrence, frequency and behaviour of forest fires vary over time and space, primarily as a result of the complicated influences of changes in land use, vegetation composition, fire suppression efforts, and other local factors. One of the forest zones in Ghana with a high level of vegetation stress is the Goaso forest area. The area has experienced changes in its traditional land use, such as hunting, charcoal production, inefficient logging practices and rural abandonment patterns. These factors, which were identified as major causes of forest fire, have recently modified the incidence of fire in the Goaso area. In spite of the incidence of forest fires in the Goaso forest area, most of the forest services do not provide a cartographic representation of the burned areas. As a result, the firefighting unit of the FCG requires a significant amount of information to understand fire risk factors and their spatial effects. This study uses Remote Sensing and Geographic Information System techniques to develop a fire risk hazard model using the Goaso Forest Area (GFA) as a case study. From the results of the study, natural forest, agricultural lands and plantation cover types were identified as the major fuel-contributing loads, while water bodies, roads and settlements were identified as minor fuel-contributing loads. Based on these major and minor fuel-contributing loads, a forest fire risk hazard model with reasonable accuracy has been developed for the GFA to assist decision making.
Keywords: Forest risk, GIS, remote sensing, Goaso.
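As an illustration of how major and minor fuel-contributing loads can be combined into a risk surface, the following sketch performs a simple weighted raster overlay. The land-cover codes, weight values and array size are assumptions for demonstration and are not taken from the paper.

```python
# Illustrative weighted-overlay fire-risk sketch (not the authors' model).
import numpy as np

# Hypothetical land-cover raster (codes: 1 natural forest, 2 agriculture,
# 3 plantation, 4 water, 5 road, 6 settlement), e.g. from a classified image.
rng = np.random.default_rng(42)
land_cover = rng.integers(1, 7, size=(100, 100))

# Major fuel-contributing loads get high weights, minor ones low weights
# (weights are assumed, chosen only to show the mechanics).
fuel_weight = {1: 0.9, 2: 0.8, 3: 0.7,   # forest, agriculture, plantation
               4: 0.0, 5: 0.1, 6: 0.1}   # water bodies, roads, settlements

risk = np.vectorize(fuel_weight.get)(land_cover).astype(float)

# Classify the continuous risk surface into categories for mapping.
risk_class = np.digitize(risk, bins=[0.3, 0.6, 0.85])  # 0 = low .. 3 = very high
print("pixels per risk class:", np.bincount(risk_class.ravel()))
```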
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 199866 Tactile Sensory Digit Feedback for Cochlear Implant Electrode Insertion
Authors: Yusuf Bulale, Mark Prince, Geoff Tansley, Peter Brett
Abstract:
Cochlear implantation (CI), which has become a routine procedure over the last decades, uses an electronic device to provide a sense of sound for patients who are severely or profoundly deaf. The success of the implantation depends on the electrode technology and on deep insertion techniques. However, the manual insertion procedure may cause mechanical trauma that can severely damage the delicate intracochlear structures. Accordingly, future improvement of cochlear electrode insertion requires a reduction of the excessive force applied during implantation, which causes tissue damage and trauma. This study examined the tool-tissue interaction of a large-scale prototype digit, embedded with a distributive tactile sensor and based on a cochlear electrode, inserted into a large-scale cochlea phantom simulating the human cochlea; the findings can inform the requirements for a small-scale digit. The digit, with distributive tactile sensors embedded in a silicon substrate, was inserted into the cochlea phantom to measure the digit/phantom interaction and the position of the digit, in order to minimize tissue damage and trauma during electrode insertion. The digit provided tactile information from the digit-phantom insertion interaction, such as contact status, tip penetration, obstacles, relative shape and location, contact orientation and multiple contacts. The tests demonstrated that even devices of such a relatively simple, low-cost design have the potential to improve cochlear implant surgery and other lumen mapping applications by providing tactile sensory feedback and thus allowing the insertion to be controlled through sensing and control of the implant tip. With this approach, the surgeon could minimize the tissue damage and the potential damage to the delicate structures within the cochlea caused by current manual electrode insertion. The approach can also be applied to other minimally invasive surgery applications, as well as to diagnosis and path navigation procedures.
Keywords: Cochlear electrode insertion, distributive tactile sensory feedback information, flexible digit, minimally invasive surgery, tool/tissue interaction.
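The kind of feedback described, contact status and contact location along the digit, can be illustrated with a minimal processing sketch of an assumed sensor array; the element count, spacing and threshold below are hypothetical and not those of the prototype.

```python
# Minimal sketch, under assumed sensor geometry, of turning readings from a
# distributive tactile sensor array into contact feedback during insertion.
import numpy as np

N_ELEMENTS = 8            # sensing elements along the digit (assumed)
SPACING_MM = 5.0          # spacing between elements in mm (assumed)
CONTACT_THRESHOLD = 0.05  # normalised force threshold (assumed)

def analyse_frame(readings):
    """Return (contact?, contact position along digit in mm, peak reading)."""
    readings = np.asarray(readings, dtype=float)
    active = readings > CONTACT_THRESHOLD
    if not active.any():
        return False, None, 0.0
    # Weighted centroid of the active elements approximates contact location.
    idx = np.arange(N_ELEMENTS)
    pos_mm = np.average(idx[active], weights=readings[active]) * SPACING_MM
    return True, pos_mm, readings.max()

# One example frame: contact concentrated around the fourth element.
print(analyse_frame([0.0, 0.01, 0.02, 0.30, 0.12, 0.0, 0.0, 0.0]))
```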
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 217965 Simulation and Parameterization by the Finite Element Method of a C-Shaped Electromagnet for Application in the Characterization of Magnetic Properties of Materials
Authors: A. A. Velásquez, J. Baena
Abstract:
This article presents the simulation, parameterization and optimization of an electromagnet with a C-shaped configuration, intended for the study of the magnetic properties of materials. The electromagnet consists of a C-shaped yoke, which provides self-shielding that minimizes losses of magnetic flux density, two poles of high magnetic permeability, and power coils wound on the poles. The main physical variable studied was the static magnetic flux density in a column within the gap between the poles, with a square cross section of 4 cm² and a length of 5 cm, seeking a suitable set of parameters that achieve a uniform magnetic flux density of 1×10⁴ Gauss or above in the column when the system operates at room temperature with a current consumption not exceeding 5 A. By means of a magnetostatic analysis with the finite element method, the magnetic flux density and the distribution of the magnetic field lines were visualized and quantified. From the results obtained by simulating an initial electromagnet configuration, a structural optimization of the geometry of the adjustable caps at the pole ends was performed. The effect of the magnetic permeability of the soft magnetic materials used in the pole system, such as low-carbon steel (0.08% C), Permalloy (45% Ni, 54.7% Fe) and Mumetal (21.2% Fe, 78.5% Ni), was also evaluated. The intensity and uniformity of the magnetic field in the gap showed a strong dependence on the factors described above. The magnetic field achieved in the column was uniform, and its magnitude ranged between 1.5×10⁴ Gauss and 1.9×10⁴ Gauss depending on the pole material used, with the possibility of increasing the field by choosing a suitable cap geometry, introducing a cooling system for the coils, and adjusting the spacing between the poles. This makes the device a versatile and scalable tool for generating the magnetic field necessary to perform magnetic characterization of materials by techniques such as vibrating sample magnetometry (VSM), Hall-effect and Kerr-effect magnetometry, among others. Additionally, a CAD design of the electromagnet modules is presented in order to facilitate the construction and scaling of the physical device.
Keywords: Electromagnet, Finite Element Method, Magnetostatic, Magnetometry, Modeling.
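For orientation, the order of magnitude of the gap field can be checked with a simple magnetic-circuit estimate before running the finite element model. The turn count, iron path length, gap and permeability below are illustrative assumptions; the actual field depends on geometry details that only the FEM analysis captures.

```python
# Back-of-envelope magnetic-circuit estimate (not the paper's FEM model):
#   B ≈ mu0 * N * I / (g + l_core / mu_r)
import math

mu0 = 4 * math.pi * 1e-7   # vacuum permeability [T*m/A]
N = 2000                   # total turns of the two coils (assumed)
I = 5.0                    # current limit from the abstract [A]
g = 0.02                   # pole gap [m] (assumed)
l_core = 0.60              # mean iron path length [m] (assumed)
mu_r = 5000                # relative permeability of the pole material (assumed)

B_tesla = mu0 * N * I / (g + l_core / mu_r)
print(f"B ≈ {B_tesla:.3f} T ≈ {B_tesla * 1e4:.0f} Gauss")
```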
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 192964 Selection of Strategic Suppliers for Partnership: A Model with Two Stages Approach
Authors: Safak Isik, Ozalp Vayvay
Abstract:
Strategic partnerships with suppliers play a vital role in long-term, value-based supply chains. Such strategic collaboration remains one of the top priorities of many business organizations seeking to create additional value, benefiting mainly from the supplier's specialization, capacity and innovative power, securing supply, and better managing costs and quality. However, many organizations encounter difficulties in initiating, developing and managing these partnerships, and many attempts end in failure. One of the reasons for such failures is the incompatibility of the partners, in other words a wrong supplier selection, which underlines the significance of the selection process as the beginning stage. An effective process for selecting strategic suppliers is therefore critical to the success of the partnership. Although there are several research studies on supplier selection in the literature, only a few of them address strategic supplier selection for long-term partnership. The purpose of this study is to propose a conceptual model for the selection of strategic partnership suppliers. A two-stage approach is used in the proposed model, incorporating segmentation in the first stage and selection in the second. In the first stage, considering that not all suppliers are strategically equal, Kraljic's purchasing portfolio matrix can be used for segmentation instead of working from a long list of potential suppliers. This supplier segmentation is the process of categorizing suppliers based on a defined set of criteria in order to identify types of suppliers and determine potential candidates for strategic partnership. In the second stage, a comprehensive evaluation and selection is performed on the pool of potential suppliers defined in the first phase, considering various tangible and intangible criteria, to finally define the strategic suppliers. Since a long-term relationship with strategic suppliers is anticipated, the criteria should consider both the current and the future status of the supplier. Based on an extensive literature review, strategic, operational and organizational criteria have been determined and elaborated; an illustrative scoring over such criteria is sketched below. The result of the selection can also be used to identify suppliers that are not yet ready for a partnership but could be developed towards one. Since the model is based on multiple criteria in both stages, it provides a framework for the further utilization of Multi-Criteria Decision Making (MCDM) techniques. The model can also be applied to a wide range of industries and incorporate managerial features of business organizations.
Keywords: Kraljic’s matrix, purchasing portfolio, strategic supplier selection, supplier collaboration, supplier partnership, supplier segmentation.
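A minimal sketch of the second-stage evaluation is a simple weighted-sum (SAW) ranking, one of the MCDM techniques the framework can accommodate. The criteria, weights, candidate suppliers and scores below are hypothetical, used only to show the mechanics.

```python
# Illustrative weighted-sum scoring of candidate strategic suppliers
# (suppliers assumed to come from the Kraljic "strategic" quadrant).
import numpy as np

criteria = ["quality", "innovation_capacity", "financial_stability",
            "cultural_fit", "long_term_capacity"]
weights = np.array([0.25, 0.20, 0.20, 0.15, 0.20])   # assumed; sums to 1

# Rows: candidate suppliers; columns: normalised scores (0-1) per criterion.
scores = np.array([
    [0.9, 0.7, 0.8, 0.6, 0.9],   # Supplier A
    [0.7, 0.9, 0.6, 0.8, 0.7],   # Supplier B
    [0.8, 0.6, 0.9, 0.7, 0.6],   # Supplier C
])

overall = scores @ weights
for name, s in zip(["Supplier A", "Supplier B", "Supplier C"], overall):
    print(f"{name}: {s:.2f}")
```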
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 115863 TheAnalyzer: Clustering-Based System for Improving Business Productivity by Analyzing User Profiles to Enhance Human-Computer Interaction
Authors: D. S. A. Nanayakkara, K. J. P. G. Perera
Abstract:
E-commerce platforms have revolutionized the shopping experience, offering convenient ways for consumers to make purchases. To improve interactions with customers and optimize marketing strategies, it is essential for businesses to understand user behavior, preferences, and needs on these platforms. This paper focuses on recommending that businesses customize interactions with users based on their behavioral patterns, leveraging data-driven analysis and machine learning techniques. Businesses can improve engagement and boost the adoption of e-commerce platforms by aligning behavioral patterns with user goals of usability and satisfaction. We propose TheAnalyzer, a clustering-based system designed to enhance business productivity by analyzing user profiles and improving human-computer interaction. TheAnalyzer seamlessly integrates with business applications, collecting relevant data points based on users' natural interactions without additional burdens such as questionnaires or surveys. It defines five key user analytics as features for its dataset, which are easily captured through users' interactions with e-commerce platforms. This research presents a study demonstrating the successful distinction of users into specific groups based on the five key analytics considered by TheAnalyzer. With the assistance of domain experts, customized business rules can be attached to each group, enabling TheAnalyzer to influence business applications and provide an enhanced, personalized user experience. The outcomes are evaluated quantitatively and qualitatively, demonstrating that utilizing TheAnalyzer's capabilities can optimize business outcomes, enhance customer satisfaction, and drive sustainable growth. The findings of this research contribute to the advancement of personalized interactions in e-commerce platforms. By leveraging user behavioral patterns and analyzing both new and existing users, businesses can effectively tailor their interactions to improve customer satisfaction and loyalty, and ultimately drive sales.
Keywords: Data clustering, data standardization, dimensionality reduction, human-computer interaction, user profiling.
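A minimal sketch of the analytical core described above, standardization, dimensionality reduction and clustering of per-user analytics, is shown below. The feature values are synthetic, and the specific algorithms (PCA, k-means) are illustrative choices rather than the system's confirmed components.

```python
# Illustrative user-profiling pipeline: standardize five analytics features,
# reduce dimensionality, and cluster users into behavioural groups.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical per-user analytics captured from natural interactions
# (e.g. session length, pages per visit, cart adds, searches, purchases).
rng = np.random.default_rng(0)
X = rng.random((500, 5))

X_std = StandardScaler().fit_transform(X)          # data standardization
X_red = PCA(n_components=2).fit_transform(X_std)   # dimensionality reduction

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_red)
labels = kmeans.labels_                            # behavioural group per user

# Each group can then be attached to a customized business rule.
print("users per group:", np.bincount(labels))
```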
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 22862 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation
Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke
Abstract:
Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems. Difficulty in finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g. MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for the automatic calibration of a hydrologic model was developed in the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, namely initial loss, reduction factor, time of concentration and time lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments in the Gold Coast. For comparison, simulation outcomes for the same three catchments obtained with the commercial modelling software MIKE URBAN were used. The graphical comparison shows that the MIKE URBAN results agree well with, and fall within, the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE) and maximum error (ME) was found to be reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff prediction, so the associated uncertainty in the predictions can be quantified, whereas MIKE URBAN provides only a point estimate. Based on the results of the analysis, the developed ABC framework appears to perform well for automatic calibration.
Keywords: Automatic calibration framework, approximate Bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform.
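The rejection-sampling form of ABC underlying such a framework can be sketched as follows. The stand-in runoff model, priors and acceptance rule are assumptions for illustration and do not reproduce the authors' R implementation or the MIKE URBAN model.

```python
# Conceptual ABC rejection-sampling sketch for calibrating a simple
# time-area-style runoff model against observed flows.
import numpy as np

rng = np.random.default_rng(1)

def run_model(initial_loss, reduction_factor, t_conc, t_lag, rainfall):
    """Stand-in runoff model; a real study would run the full hydrologic model."""
    effective = np.clip(rainfall - initial_loss, 0, None) * reduction_factor
    kernel = np.exp(-np.arange(20) / max(t_conc, 1e-3))   # crude routing kernel
    runoff = np.convolve(effective, kernel)[: len(rainfall)]
    return np.roll(runoff, int(t_lag))

rainfall = rng.gamma(2.0, 2.0, size=60)             # synthetic storm series
true_q = run_model(1.0, 0.6, 5.0, 2, rainfall)      # "true" parameters
obs = true_q + rng.normal(0, 0.1, size=true_q.size) # noisy observations

thetas, distances = [], []
for _ in range(5000):
    # Draw candidate parameters from assumed uniform priors.
    theta = (rng.uniform(0, 5), rng.uniform(0, 1),
             rng.uniform(1, 15), rng.integers(0, 5))
    sim = run_model(*theta, rainfall)
    thetas.append(theta)
    distances.append(np.sqrt(np.mean((sim - obs) ** 2)))  # RMSE as distance

# Keep the closest 1% of draws as an approximate posterior sample.
distances = np.array(distances)
eps = np.quantile(distances, 0.01)
post = np.array(thetas, dtype=float)[distances <= eps]
print(post.shape[0], "accepted draws; posterior means:", post.mean(axis=0))
```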
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1740