Search results for: optimization methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17889

15849 Analysis of Landscape Pattern Evolution in Banan District, Chongqing, Based on GIS and FRAGSTATS

Authors: Wenyang Wan

Abstract:

The study of urban land use and landscape patterns is a current hotspot in the fields of planning and design, ecology, etc., and is of great significance for the construction of the city's overall humanistic ecosystem and the optimization of urban spatial structure. Banan District, as the main part of the eastern eco-city planning of Chongqing Municipality, is a new high ground for highlighting the ecological characteristics of Chongqing, realizing the effective transformation of ecological value, and promoting the integrated development of urban and rural areas. The analytical methods of the land use transfer matrix (GIS) and landscape pattern indices (FRAGSTATS) were used to study the characteristics and laws of the evolution of the land use landscape pattern in Banan District from 2000 to 2020, providing a reference for Banan District in alleviating the ecological contradictions of its landscape. The results show that: ① Banan District is rich in land use types, and cultivated land still accounted for 57.15% of the total landscape area in 2020, remaining dominant in the district's land use structure; ② from 2000 to 2020, land use conversion in Banan District followed the order cropland > woodland > grassland > shrubland > built-up land > water bodies > wetlands, with the conversion of cropland to built-up land being the largest; ③ from 2000 to 2020, the landscape elements of Banan District were evenly distributed and the landscape types were rich and diversified, but, owing to human disturbance, the shapes of the landscape elements tended to be irregular, the dominant patches were scattered, and patch connectivity was poor. It is recommended that, in future regional ecological construction, the layout be rationally optimized, the relationships among landscape components be coordinated, the connectivity between landscape patches be strengthened, and the degree of landscape fragmentation be reduced.
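
As an illustration of the land use transfer matrix mentioned above, the following is a minimal sketch assuming two co-registered land-cover rasters (2000 and 2020) are available as equally shaped NumPy integer arrays; the class codes and names are illustrative, not the study's actual legend.

```python
# Minimal sketch of a land use transfer matrix from two classified rasters.
import numpy as np

classes = {1: "cropland", 2: "woodland", 3: "grassland",
           4: "built-up land", 5: "water bodies"}   # illustrative legend

def transfer_matrix(lc_start, lc_end, codes):
    """Count pixels moving from each start class (rows) to each end class (columns)."""
    k = len(codes)
    matrix = np.zeros((k, k), dtype=np.int64)
    for i, c_from in enumerate(codes):
        for j, c_to in enumerate(codes):
            matrix[i, j] = np.count_nonzero((lc_start == c_from) & (lc_end == c_to))
    return matrix

# Toy example with two random 100x100 rasters standing in for the 2000/2020 maps.
rng = np.random.default_rng(0)
lc2000 = rng.integers(1, 6, size=(100, 100))
lc2020 = rng.integers(1, 6, size=(100, 100))
tm = transfer_matrix(lc2000, lc2020, sorted(classes))
print(tm)                                  # rows: 2000 classes, columns: 2020 classes
print(tm.diagonal().sum() / tm.sum())      # share of pixels that did not change class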

Keywords: land use transfer, landscape pattern evolution, GIS and FRAGSTATS, Banan District

Procedia PDF Downloads 80
15848 Statistical and Land Planning Study of Tourist Arrivals in Greece during 2005-2016

Authors: Dimitra Alexiou

Abstract:

During the last 10 years, in spite of the economic crisis, the number of tourists arriving in Greece has increased, particularly during the tourist season from April to October. In this paper, the number of annual tourist arrivals is studied to explore tourists' preferences with regard to the month of travel and the selected destinations, as well as the amount of money spent. The collected data are processed with statistical methods, yielding numerical and graphical results. From the computation of statistical parameters and forecasting with exponential smoothing, useful conclusions are drawn that can be used by the Greek tourism authorities, as well as by tourist organizations, for planning purposes for the coming years. The results of this paper and the computed forecasts can also be used for decision making by private tourist enterprises investing in Greece. With regard to the statistical methods, simple exponential smoothing of the time series data is employed. The search for the best forecast for 2017 and 2018 determines the value of the smoothing coefficient. Microsoft Excel is used for all statistical computations and graphics.
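
To make the smoothing-coefficient search concrete, the following is a minimal sketch of simple exponential smoothing with a grid search for alpha; the arrival figures are made up for illustration, not the Greek tourism data used in the paper.

```python
# Minimal sketch of simple exponential smoothing with a grid search for alpha.
def ses_forecast(series, alpha):
    """Return one-step-ahead forecasts; forecasts[t] predicts series[t]."""
    forecasts = [series[0]]                 # initialize with the first observation
    for y in series[:-1]:
        forecasts.append(alpha * y + (1 - alpha) * forecasts[-1])
    return forecasts

def sse(series, alpha):
    return sum((y - f) ** 2 for y, f in zip(series, ses_forecast(series, alpha)))

arrivals = [1.20, 1.35, 1.28, 1.50, 1.62, 1.58, 1.71, 1.80, 1.77, 1.95]  # toy annual arrivals (millions)
best_alpha = min((a / 100 for a in range(1, 100)), key=lambda a: sse(arrivals, a))
next_period = best_alpha * arrivals[-1] + (1 - best_alpha) * ses_forecast(arrivals, best_alpha)[-1]
print(f"alpha = {best_alpha:.2f}, forecast for the next period = {next_period:.2f}")
```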

Keywords: tourism, statistical methods, exponential smoothing, land spatial planning, economy

Procedia PDF Downloads 265
15847 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods

Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard

Abstract:

Non-invasive samples are an alternative to collecting genetic samples directly from the animal; they are collected without handling it (e.g., scats, feathers, and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, leading to poorer extraction efficiency and genotyping. These errors delayed the widespread use of non-invasive genetic information for some years. Genotyping errors can be limited by using analysis methods that accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms can be highlighted as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, there is still a lack of empirical comparison of their performance. A comparison of methods on datasets differing in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for obtaining information, especially for endangered and rare populations. To compare the analysis methods, four datasets obtained from the Dryad digital repository were used. Three matching algorithms (Cervus, Colony, and Error Tolerant Likelihood Matching - ETLM) were used for matching genotypes, and two algorithms were used for population estimation (Capwire and BayesN). The three matching algorithms showed different patterns of results. ETLM produced a smaller number of unique individuals and recaptures. A similarity in the matched genotypes between Colony and Cervus was observed, which is not surprising given the similarity of their pairwise likelihood and clustering algorithms. The matching of ETLM showed almost no similarity with the genotypes matched by the other methods. The different clustering system and error model of ETLM seem to lead to a more stringent selection, although ETLM had the longest processing time and the least user-friendly interface of the compared methods. The population estimators performed differently across the datasets, with consensus between the estimators for only one dataset. BayesN produced both higher and lower estimates compared with Capwire. Unlike Capwire, BayesN does not consider the total number of recaptures, only the recapture events, which makes the estimator sensitive to data heterogeneity, i.e., different capture rates between individuals. In these examples, capture-rate homogeneity seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. An expanded analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of the matching methods indicates that Colony seems to be more appropriate for general use, considering the balance of time, interface, and robustness. The heterogeneity of the recaptures strongly affected the BayesN estimates, leading to over- and underestimation of population numbers. Capwire is therefore advisable for general use, since it performs better in a wide range of situations.

Keywords: algorithms, genetics, matching, population

Procedia PDF Downloads 143
15846 SynKit: An Event-Driven and Scalable Microservices-Based Kitting System

Authors: Bruno Nascimento, Cristina Wanzeller, Jorge Silva, João A. Dias, André Barbosa, José Ribeiro

Abstract:

The increasing complexity of logistics operations stems from evolving business needs, such as the shift from mass production to mass customization, which demands greater efficiency and flexibility. In response, Industry 4.0 and 5.0 technologies provide improved solutions to enhance operational agility and better meet market demands. The management of kitting zones, combined with the use of Autonomous Mobile Robots, faces challenges related to coordination, resource optimization, and rapid response to fluctuations in customer demand. Additionally, implementing lean manufacturing practices in this context must be carefully orchestrated by intelligent systems and human operators to maximize efficiency without sacrificing the agility required in an advanced production environment. This paper proposes and implements a microservices-based architecture integrating principles from Industry 4.0 and 5.0 with lean manufacturing practices. The architecture enhances communication and coordination between autonomous vehicles and kitting management systems, allowing more efficient resource utilization and increased scalability. The proposed architecture focuses on the modularity and flexibility of operations, enabling seamless adaptation to changing demands and the efficient allocation of resources in real time. This approach is expected to significantly improve the efficiency and scalability of logistics operations by reducing waste and optimizing resource use while improving responsiveness to demand changes. The implementation of this architecture provides a robust foundation for the continuous evolution of kitting management and process optimization. It is designed to adapt to dynamic environments marked by rapid shifts in production demands and real-time decision-making. It also ensures seamless integration with automated systems, aligning with Industry 4.0 and 5.0 needs while reinforcing lean manufacturing principles.

Keywords: microservices, event-driven, kitting, AMR, lean manufacturing, industry 4.0, industry 5.0

Procedia PDF Downloads 23
15845 Bacteriological Culture Methods and Their Uses in Clinical Pathology

Authors: Prachi Choudhary, Jai Gopal Sharma

Abstract:

Microbial cultures determine the type of organism, its abundance in the tested sample, or both. Culturing is one of the primary diagnostic methods of microbiology and is used to determine the cause of infectious disease by letting the agent multiply in a predetermined medium. Colonies produced by different bacterial species may be very distinct from one another. To culture any pathogen or microorganism, one should first know the types of media used in microbiology for culturing. Subculturing is also sometimes performed when mixed growth is seen in a culture. Three types of culture media, classified by consistency - solid, semi-solid, and liquid (broth) media - are further explained in the report. The Five I's approach - inoculation, incubation, isolation, inspection, and identification - is a method for locating, growing, observing, and characterizing microorganisms. To identify bacteria, a sample such as urine, sputum, or blood is cultured on suitable media; there are different methods of culturing the bacteria or microbe, such as the pour plate method, the streak plate method, swabbing by needle, pipetting, inoculation by loop, and spreading by spreader. Bacterial growth is examined after 24 hours of incubation, and an antibiotic susceptibility test is then conducted according to the growth; this determines which antibiotics the bacteria are sensitive or resistant to and also helps identify the bacteria. Antibiotic susceptibility tests are performed by various methods, such as the dilution method, the disk diffusion method, and the E-test. After that, medicines are prescribed to patients according to the antibiotic sensitivity and resistance results.

Keywords: inoculation, incubation, isolation, antibiotic susceptibility test, characterizing

Procedia PDF Downloads 82
15844 The Unscented Kalman Filter Implementation for the Sensorless Speed Control of a Permanent Magnet Synchronous Motor

Authors: Justas Dilys

Abstract:

This paper addresses the implementation and optimization of an Unscented Kalman Filter (UKF) for sensorless control of a Permanent Magnet Synchronous Motor (PMSM) using an ARM Cortex-M3 microcontroller. Various optimization levels based on the reduction of arithmetic calculations were implemented on the ARM Cortex-M3 microcontroller. The execution time of the UKF estimator was at most 90 µs without loss of accuracy. Moreover, simulation studies of the Unscented Kalman Filter were carried out in Matlab to explore the usability of the UKF in a sensorless PMSM drive.
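
To make the UKF machinery concrete, the following is a minimal sketch of the unscented transform at the core of a UKF step; the nonlinear function f is a toy stand-in, not the PMSM state model, and none of the ARM-specific arithmetic reductions from the paper are shown.

```python
# Minimal sketch of sigma-point generation and the unscented transform.
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)
    pts = np.vstack([x, x + S.T, x - S.T])            # 2n+1 sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))    # mean weights
    wc = wm.copy()                                    # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    return pts, wm, wc

def unscented_transform(f, x, P):
    pts, wm, wc = sigma_points(x, P)
    Y = np.array([f(p) for p in pts])                 # propagate each sigma point
    y_mean = wm @ Y
    diff = Y - y_mean
    y_cov = (wc[:, None] * diff).T @ diff
    return y_mean, y_cov

f = lambda v: np.array([v[0] * np.cos(v[1]), v[0] * np.sin(v[1])])  # toy nonlinearity
mean, cov = unscented_transform(f, np.array([1.0, 0.5]), np.eye(2) * 0.01)
print(mean, cov)
```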

Keywords: unscented Kalman filter, ARM, PMSM, implementation

Procedia PDF Downloads 167
15843 Analysis of Non-Coding Genome in Streptococcus pneumoniae for Molecular Epidemiology Typing

Authors: Martynova Alina, Lyubov Buzoleva

Abstract:

Streptococcus pneumoniae is the causative agent of pneumonia and meningitis throughout the world. Having high genetic diversity, this microorganism can cause different clinical forms of pneumococcal infection and is difficult to diagnose microbiologically by routine methods. Epidemiological surveillance also requires more developed methods of molecular typing, because the current method of serotyping does not properly distinguish invasive from non-invasive isolates. The non-coding genome of bacteria is an interesting source of highly discriminating markers for distinguishing the subspecies of such a variable bacterium as Streptococcus pneumoniae. Technically, we proposed a scheme for discriminating S. pneumoniae strains based on amplification of a non-coding region (SP_1932) followed by restriction with two enzymes, Alu1 and Mn1. Aim: This research aimed to compare different typing methods and their application for molecular epidemiology purposes. Methods: We analyzed a population of 100 strains of S. pneumoniae isolated from different patients using several molecular epidemiology typing methods, namely pulsed-field gel electrophoresis (PFGE), restriction fragment length polymorphism (RFLP) analysis, and multilocus sequence typing (MLST), and all were compared with the classic typing method, serotyping. The discriminatory power was estimated with the Simpson index (SI). Results: The most discriminative typing method was RFLP (SI = 0.97, with 42 genotypes distinguished). PFGE was slightly less discriminative (SI = 0.95, with 35 genotypes identified). MLST remains the best reference method (SI = 1.0). The classic method of serotyping showed rather weak discriminatory power (SI = 0.93, 24 genotypes). In addition, the sensitivity of RFLP was 100% and its specificity was 97.09%. Conclusion: The most appropriate method for routine epidemiological surveillance is RFLP of the non-coding region of Streptococcus pneumoniae, followed by PFGE, although in some cases these results should be confirmed by MLST.
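
To illustrate the discriminatory-power metric used above, the following is a minimal sketch of the Simpson index of diversity (Hunter-Gaston form); the genotype group sizes are invented for illustration and are not the study's typing results.

```python
# Minimal sketch of the Simpson index of diversity (discriminatory power, SI).
def simpson_index(type_counts):
    """Hunter-Gaston form: SI = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1))."""
    N = sum(type_counts)
    return 1.0 - sum(n * (n - 1) for n in type_counts) / (N * (N - 1))

# e.g. 100 strains resolved into groups of the given sizes by two hypothetical methods
rflp_counts = [5] * 8 + [2] * 20 + [1] * 20          # 100 strains in 48 groups
serotyping_counts = [10] * 6 + [5] * 4 + [2] * 10    # 100 strains in 20 groups
print(f"RFLP-like SI       = {simpson_index(rflp_counts):.3f}")
print(f"Serotyping-like SI = {simpson_index(serotyping_counts):.3f}")
```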

Keywords: molecular epidemiology typing, non-coding genome, Streptococcus pneumoniae, MLST

Procedia PDF Downloads 399
15842 Multilabel Classification with Neural Network Ensemble Method

Authors: Sezin Ekşioğlu

Abstract:

Multilabel classification is of great importance for several applications and is also a challenging research topic. It is a kind of supervised learning with binary targets; the difference between multilabel and binary classification is that multilabel problems contain more than one class, and an instance can belong to one class or to many classes. There is a wide range of applications for multilabel prediction, such as image labeling, text categorization, and gene function prediction. Even though instances are classified into many classes, they may not always be properly classified. There are many ensemble methods for classification; however, most researchers have been concerned with better multilabel methods in general, and few focus on both the efficiency of the classifiers and the pairwise label relationships at the same time in order to implement better multilabel classification. In this paper, we worked on modified ensemble methods that benefit from k-nearest neighbors and a neural network structure to address these issues in a beneficial way and to obtain better results from multilabel classification. Experiments on publicly available datasets (yeast, emotion, scene, and birds) demonstrate the efficiency of the developed algorithm, and the technique is evaluated using accuracy, F1 score, and Hamming loss. Our algorithm improves on the benchmarks for each dataset across the different metrics.
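
As a small illustration of the ingredients named above, the following sketch shows multilabel k-nearest-neighbor voting and the Hamming loss metric; the tiny toy dataset is illustrative, not the yeast/emotion/scene/birds benchmarks, and the paper's neural-network ensemble is not reproduced here.

```python
# Minimal sketch of multilabel kNN voting and the Hamming loss metric.
import numpy as np

def knn_multilabel_predict(X_train, Y_train, x, k=3):
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    votes = Y_train[nearest].mean(axis=0)
    return (votes >= 0.5).astype(int)         # majority vote per label

def hamming_loss(Y_true, Y_pred):
    return np.mean(Y_true != Y_pred)          # fraction of wrongly predicted label bits

X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9], [0.5, 0.5]])
Y = np.array([[1, 0, 1], [1, 0, 1], [0, 1, 0], [0, 1, 1], [1, 1, 0]])
pred = np.array([knn_multilabel_predict(X, Y, x) for x in X])
print(pred)
print("Hamming loss:", hamming_loss(Y, pred))
```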

Keywords: multilabel, classification, neural network, KNN

Procedia PDF Downloads 155
15841 Biological Optimization following BM-MSC Seeding of Partially Demineralized and Partially Demineralized Laser-Perforated Structural Bone Allografts Implanted in Critical Femoral Defects

Authors: S. AliReza Mirghasemi, Zameer Hussain, Mohammad Saleh Sadeghi, Narges Rahimi Gabaran, Mohamadreza Baghaban Eslaminejad

Abstract:

Background: Despite the promising results shown by osteogenic cell-based demineralized bone matrix composites, they need to be optimized for grafts that act as structural frameworks in load-bearing defects. The purpose of this experiment is to determine the effect of seeding bone marrow mesenchymal stem cells on partially demineralized laser-perforated structural allografts implanted in critical femoral defects. Materials and Methods: P3 stem cells were used for graft seeding. Laser perforation in four rows of three holes was performed. Cell-seeded grafts were incubated for one hour until they were implanted into the defect. We used four types of grafts: partially demineralized only (Donly), partially demineralized stem cell seeded (DST), partially demineralized laser-perforated (DLP), and partially demineralized laser-perforated stem cell seeded (DLPST). Histologic and histomorphometric analyses were performed at 12 weeks. Results: Partially demineralized laser-perforated grafts had the highest woven bone formation within graft limits, stem cell seeded demineralized laser-perforated grafts remained intact, and the difference between partially demineralized only and partially demineralized stem cell seeded grafts was insignificant. At the interface, partially demineralized laser-perforated and partially demineralized only grafts had comparable osteogenesis, but the partially demineralized stem cell seeded grafts were inferior. The interface in stem cell seeded demineralized laser-perforated grafts was almost replaced by distinct endochondral osteogenesis, with higher angiogenesis in the vicinity. Partially demineralized stem cell seeded and stem cell seeded demineralized laser-perforated graft surfaces had extra vessel-ingrowth-like porosities, a sign of delayed resorption. Conclusion: This demonstrates that simple cell-based composites are not optimal and that supplementation with synergistic conditions and surface modifications is necessary.

Keywords: structural bone allograft, partial demineralization, laser perforation, mesenchymal stem cell

Procedia PDF Downloads 414
15840 The Optimization of Sexual Health Resource Information and Services for Persons with Spinal Cord Injury

Authors: Nasrin Nejatbakhsh, Anita Kaiser, Sander Hitzig, Colleen McGillivray

Abstract:

Following spinal cord injury (SCI), many individuals experience anxiety in adjusting to their lives and to its impacts on their sexuality. Research has demonstrated that regaining sexual function is a very high priority for individuals with SCI. Despite this, sexual health is one of the least likely areas of focus in rehabilitating individuals with SCI. There is currently a considerable gap in appropriate education and resources that address the sexual health concerns and needs of people with spinal cord injury. Furthermore, the determinants of sexual health in individuals with SCI are poorly understood and thus poorly addressed. The purpose of this study was to improve current practices by informing a service delivery model that rehabilitation centers can adopt for appropriate delivery of their services. Methodology: We utilized qualitative methods in the form of a semi-structured interview containing open-ended questions to assess 1) sexual health concerns, 2) helpful strategies in current resources, 3) unhelpful strategies in current resources, and 4) barriers to obtaining sexual health information. In addition to the interviews, participants completed surveys to identify socio-demographic factors. The data gathered were coded and evaluated for emerging themes and subthemes through a ‘code-recode’ technique. Results: We identified several robust themes that are important for SCI sexual health resource development. Through analysis of these themes and their subthemes, several important concepts emerged that could provide agencies with helpful strategies for providing sexual health resources. Some of the important considerations are that services be anonymous, accessible, frequent, affordable, mandatory, casual, and supported by peers. Implications: By incorporating the perspectives of individuals with SCI, the findings from this study can be used to develop appropriate sexual health services and improve access to information through tailored, needs-based program development.

Keywords: spinal cord injury, sexual health, determinants of health, resource development

Procedia PDF Downloads 251
15839 Probabilistic Building Life-Cycle Planning as a Strategy for Sustainability

Authors: Rui Calejo Rodrigues

Abstract:

Building refurbishment and maintenance is a major area of knowledge that is ultimately left to user/occupant criteria. Optimizing the service life of a building requires a special background to assess, as it is one of those concepts that needs proficiency to implement. ISO 15686-2, Buildings and constructed assets - Service life planning - Part 2: Service life prediction procedures, specifies a factorial method based on deterministic data for the life span of building components. A deterministic approach has major consequences because users/occupants cannot perceive the end of a component's life span and simply act on deterministic periods, so costly and resource-consuming solutions fail to meet global targets for planet sustainability. If the estimated two billion conventional buildings in the world were submitted to a probabilistic method for service life planning rather than a deterministic one, immense resource savings would result. Since 1989, the research team, today operating as CEES - Center for Building in Service Studies - has developed a methodology based on the Monte Carlo method for a probabilistic approach to the life span of building components and to cost and service life care time spans. The research question deals with the importance of a probabilistic approach to building life planning compared with deterministic methods. The mathematical model developed for the probabilistic building lifespan approach is presented, and experimental data are obtained for comparison with deterministic data. Assuming that the building life cycle depends largely on component replacement, this methodology allows conclusions to be drawn on the global impact of fixed-replacement methodologies, such as those resulting from the use of deterministic models. Major conclusions based on the conventional buildings estimate are presented and evaluated from a sustainability perspective.
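
To illustrate the contrast between deterministic and probabilistic planning, the following is a minimal Monte Carlo sketch of component replacement over a building's life cycle; the service-life distribution and horizon are assumed values for illustration, not the CEES calibration data.

```python
# Minimal Monte Carlo sketch of component replacements over a planning horizon.
import random

def simulate_replacements(horizon_years, mean_life, sd_life, runs=10000):
    """Return the distribution of replacement counts over the planning horizon."""
    counts = []
    for _ in range(runs):
        t, n = 0.0, 0
        while True:
            t += max(1.0, random.gauss(mean_life, sd_life))  # sampled component service life
            if t > horizon_years:
                break
            n += 1
        counts.append(n)
    return counts

counts = simulate_replacements(horizon_years=60, mean_life=20, sd_life=5)
mean_repl = sum(counts) / len(counts)
print(f"expected number of replacements in 60 years: {mean_repl:.2f}")
print("fixed 20-year plan would schedule:", 60 // 20, "replacements")
```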

Keywords: building components life cycle, building maintenance, building sustainability, Monte Carlo simulation

Procedia PDF Downloads 205
15838 Flicker Detection with Motion Tolerance for Embedded Camera

Authors: Jianrong Wu, Xuan Fu, Akihiro Higashi, Zhiming Tan

Abstract:

CMOS image sensors with a rolling shutter are used broadly in the digital cameras embedded in mobile devices. The rolling shutter suffers from flicker artifacts caused by fluorescent lamps, and these artifacts can be observed easily. In this paper, the characteristics of illumination flicker in the motion case were analyzed, and two efficient detection methods based on matching fragment selection were proposed. According to the experimental results, our methods achieve as high as 100% accuracy in static scenes and at least 97% in motion scenes.

Keywords: illumination flicker, embedded camera, rolling shutter, detection

Procedia PDF Downloads 420
15837 Impact Position Method Based on Distributed Structure Multi-Agent Coordination with JADE

Authors: YU Kaijun, Liang Dong, Zhang Yarong, Jin Zhenzhou, Yang Zhaobao

Abstract:

For the impact monitoring of distributed structures, the traditional positioning methods are based on the time difference of arrival; they include the four-point arc positioning method and the triangulation positioning method. In actual operation, however, both methods have errors. In this paper, the multi-agent blackboard coordination principle is used to combine the two methods. The fusion steps are: (1) the four-point arc locating agent calculates the initial point and records it on the blackboard module; (2) the triangulation agent obtains its initial parameters by accessing the initial point; (3) the triangulation agent constantly accesses the blackboard module to update its initial parameters and also logs its calculated point onto the blackboard; (4) when the subsequent calculated point and the initial calculated point are within the allowable error, the whole coordination fusion process is finished. This paper presents a multi-agent collaboration method whose agent framework is JADE. The JADE platform consists of several agent containers, with the agents running in each container. Because of JADE's management and debugging tools, it is very convenient for dealing with complex data in a large structure. Finally, based on the data in JADE, the results show that the impact location method based on multi-agent coordination fusion can reduce the error of the two methods.
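
The following is a minimal, framework-agnostic sketch of the blackboard fusion loop described above, written in plain Python rather than as JADE agents; the locating functions are stubs returning made-up coordinates, not real arc or triangulation solvers.

```python
# Minimal sketch of the blackboard coordination fusion loop (stubbed solvers).
blackboard = {}

def four_point_arc_agent(sensor_data):
    blackboard["initial_point"] = (0.52, 0.31)      # stub for the arc positioning result

def triangulation_agent(sensor_data, tolerance=1e-3, max_iter=50):
    estimate = blackboard["initial_point"]           # seed parameters from the blackboard
    for _ in range(max_iter):
        # Stub refinement step standing in for the triangulation update.
        refined = tuple(0.5 * (e + s) for e, s in zip(estimate, (0.50, 0.30)))
        blackboard["triangulation_point"] = refined  # log the calculated point
        if max(abs(a - b) for a, b in zip(refined, estimate)) < tolerance:
            return refined                           # within allowable error: fusion finished
        estimate = refined
    return estimate

four_point_arc_agent(sensor_data=None)
print("fused impact position:", triangulation_agent(sensor_data=None))
```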

Keywords: impact monitoring, structural health monitoring(SHM), multi-agent system(MAS), black-board coordination, JADE

Procedia PDF Downloads 178
15836 Optimization of Waste Plastic to Fuel Oil Plants' Deployment Using Mixed Integer Programming

Authors: David Muyise

Abstract:

Mixed Integer Programming (MIP) is an approach that involves the optimization of a range of decision variables in order to minimize or maximize a particular objective function. The main objective of this study was to apply the MIP approach to optimize the deployment of waste plastic to fuel oil processing plants in Uganda. The processing plants are meant to reduce plastic pollution by pyrolyzing the waste plastic into a cleaner fuel that can be used to power diesel/paraffin engines, so as to (1) reduce the negative environmental impacts associated with plastic pollution and (2) narrow the energy gap by utilizing the fuel oil. A programming model was established and tested in two case study applications: small-scale applications in rural towns and large-scale deployment across major cities in the country. In order to design the supply chain, optimal decisions on the types of waste plastic to be processed, the size, location, and number of plants, and the downstream fuel applications were made concurrently based on the payback period, investor requirements for capital cost, and the production cost of fuel and electricity. The model incorporates qualitative data gathered from waste plastic pickers at landfills and from potential investors, and quantitative data obtained from primary research. The study found that a distributed system is suitable for small rural towns, whereas a decentralized system is only suitable for big cities. The small towns of Kalagi, Mukono, Ishaka, and Jinja were found to be the ideal locations for the deployment of distributed processing systems, whereas the cities of Kampala, Mbarara, and Gulu were found to be the ideal locations to initially utilize the decentralized pyrolysis technology system. We conclude that the model findings will be most valuable to investors, engineers, plant developers, and municipalities interested in waste plastic to fuel processing in Uganda and elsewhere in developing economies.
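
To make the MIP formulation concrete, the following is a minimal plant-deployment sketch using the PuLP modeller; the candidate towns, costs, capacities, and plastic supply figures are invented for illustration and are not the study's Ugandan data.

```python
# Minimal sketch of a plant-deployment MIP with PuLP (pip install pulp).
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

sites = ["Kalagi", "Mukono", "Ishaka", "Jinja"]
capital = {"Kalagi": 120, "Mukono": 150, "Ishaka": 110, "Jinja": 160}   # capital cost units
capacity = {"Kalagi": 40, "Mukono": 60, "Ishaka": 35, "Jinja": 70}      # tonnes/day
demand = 120                                                            # tonnes/day of waste plastic

prob = LpProblem("waste_plastic_plants", LpMinimize)
build = LpVariable.dicts("build", sites, cat=LpBinary)   # 1 if a plant is built at the site
tons = LpVariable.dicts("tons", sites, lowBound=0)       # throughput allocated to the site

prob += lpSum(capital[s] * build[s] + 0.5 * tons[s] for s in sites)   # capital + operating cost
prob += lpSum(tons[s] for s in sites) >= demand                       # process all collected plastic
for s in sites:
    prob += tons[s] <= capacity[s] * build[s]                         # only built plants can process

prob.solve()
for s in sites:
    print(s, int(build[s].value()), tons[s].value())
```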

Keywords: mixed integer programming, fuel oil plants, optimisation of waste plastics, plastic pollution, pyrolyzing

Procedia PDF Downloads 129
15835 Optimal Utilization of Space in a Warehouse: A Case Study

Authors: Arun Kumar R. K. Gothra, Hasan Alhakamy

Abstract:

With increasing expectations and demands for warehousing and distribution, Warehouse Solution Incorporated in Victoria has been looking at ways to improve its business processes to maintain its competitive edge. To maintain the provision of high-quality service standards at competitive and affordable prices, improvements in logistics management are necessary. One such avenue is to make efficient use of the space available in the warehouse. This paper is based on a study of the collaboration between Warehouse Solution Inc. and the Dandenong Distribution Centre (DDC) to solve a congestion problem and enhance the efficiency of overall warehouse activities.

Keywords: space optimization, optimal utilization, warehouse, DDC

Procedia PDF Downloads 610
15834 Early Hypothyroidism after Radiotherapy for Nasopharyngeal Carcinoma

Authors: Nejla Fourati, Zied Fessi, Fatma Dhouib, Wicem Siala, Leila Farhat, Afef Khanfir, Wafa Mnejja, Jamel Daoud

Abstract:

Purpose: Reported rates of radiation-induced hypothyroidism in nasopharyngeal cancer (NPC) range from 15% to 55%. In reported data, it is considered a common late complication of definitive radiation and is mainly observed two years after the end of treatment. The aim of this study was to evaluate the incidence of early hypothyroidism within 6 months after radiotherapy. Patients and methods: From June 2017 to February 2020, 35 patients treated with concurrent chemo-radiotherapy (CCR) for NPC were included in this prospective study. Median age was 49 years [23-68], with a sex ratio of 2.88. All patients received intensity-modulated radiotherapy (IMRT) at a dose of 69.96 Gy in 33 daily fractions with weekly cisplatin (40 mg/m²) chemotherapy. Thyroid-stimulating hormone (TSH) and free thyroxine (FT4) assays were performed before the start of radiotherapy and 6 months after. Different dosimetric parameters for the thyroid gland were reported: the volume (cc), the mean dose (Dmean), and the percentage of the volume receiving more than 45 Gy (V45Gy). The Wilcoxon test was used to compare these parameters between patients with and without hypothyroidism. Results: At baseline, 5 patients (14.3%) had hypothyroidism and were excluded from the analysis. Of the remaining 30 patients, 9 (30%) developed hypothyroidism 6 months after the end of radiotherapy. The median thyroid volume was 10.3 cc [4.6-23]. The median Dmean and V45Gy were 48.3 Gy [43.15-55.4] and 74.8% [38.2-97.9], respectively. No significant difference was noted for any of the studied parameters. Conclusion: Early hypothyroidism occurring within 6 months after CCR for NPC seems to be a common complication (30%) that should be screened for. Good patient monitoring with regular measurement of TSH and FT4 makes it possible to treat hypothyroidism in the asymptomatic phase, which would be expected to improve the quality of life of these patients. The results of our study do not show a correlation between thyroid doses and the occurrence of hypothyroidism, probably because of the high doses received by the thyroid in our series. These findings encourage further optimization to limit thyroid doses and thus the risk of radiation-induced hypothyroidism.

Keywords: nasopharyngeal carcinoma, hypothyroidism, early complication, thyroid dose

Procedia PDF Downloads 131
15833 Modification of Underwood's Equation to Calculate the Minimum Reflux Ratio for a Column with One Side Stream above the Feed

Authors: S. Mousavian, A. Abedianpour, A. Khanmohammadi, S. Hematian, Gh. Eidi Veisi

Abstract:

Distillation is one of the most important and most widely used separation methods in industrial practice. There are different ways to design a distillation column; one of them is the shortcut method. In the shortcut method, material balances and equilibrium relations are employed to calculate the number of trays in a distillation column. Several methods are classified as shortcut methods, one of which is the Fenske-Underwood-Gilliland method. In this method, the minimum reflux ratio is calculated by the Underwood equation. Underwood proposed an equation that is useful for a simple distillation column with one feed, one top product, and one bottom product. In this study, the Underwood method is developed to predict the minimum reflux ratio for a column with one side stream above the feed. The results of this model were compared with the McCabe-Thiele method. The comparison shows that the proposed method is able to calculate the minimum reflux ratio with very small error.
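
For reference, the following is a minimal sketch of the classical (unmodified) Underwood calculation for a simple column, shown only to make the equations concrete; the ternary feed data are illustrative, SciPy is assumed to be available, and the paper's side-stream modification is not reproduced here.

```python
# Minimal sketch of the classical Underwood minimum-reflux calculation.
from scipy.optimize import brentq

alpha = [2.5, 1.0, 0.4]          # relative volatilities (light, reference, heavy)
z = [0.4, 0.3, 0.3]              # feed mole fractions
xD = [0.95, 0.05, 0.0]           # distillate mole fractions
q = 1.0                          # saturated-liquid feed

# First Underwood equation: sum(alpha_i * z_i / (alpha_i - theta)) = 1 - q,
# with the root theta lying between the volatilities of the two key components.
f = lambda theta: sum(a * zi / (a - theta) for a, zi in zip(alpha, z)) - (1.0 - q)
theta = brentq(f, alpha[1] + 1e-6, alpha[0] - 1e-6)

# Second Underwood equation: Rmin + 1 = sum(alpha_i * xD_i / (alpha_i - theta)).
Rmin = sum(a * xi / (a - theta) for a, xi in zip(alpha, xD)) - 1.0
print(f"theta = {theta:.4f}, minimum reflux ratio = {Rmin:.3f}")
```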

Keywords: minimum reflux ratio, side stream, distillation, Underwood’s method

Procedia PDF Downloads 406
15832 Analysis of the Evolution of Landscape Spatial Patterns in Banan District, Chongqing, China

Authors: Wenyang Wan

Abstract:

The study of urban land use and landscape patterns is a current hotspot in the fields of planning and design, ecology, etc., and is of great significance for the construction of the city's overall humanistic ecosystem and the optimization of urban spatial structure. Banan District, as the main part of the eastern eco-city planning of Chongqing Municipality, is a high ground for highlighting the ecological characteristics of Chongqing, realizing the effective transformation of ecological value, and promoting the integrated development of urban and rural areas. The analytical methods of the land use transfer matrix (GIS) and landscape pattern indices (Fragstats) were used to study the characteristics and laws of the evolution of the land use landscape pattern in Banan District from 2000 to 2020, providing a reference for Banan District in alleviating the ecological contradictions of its landscape. The results show that ① Banan District is rich in land use types, and cultivated land still accounted for 57.15% of the total landscape area in 2020, remaining dominant in the district's land use structure; ② from 2000 to 2020, land use conversion in Banan District followed the order cropland > woodland > grassland > shrubland > built-up land > water bodies > wetlands, with the conversion of cropland to built-up land being the largest; ③ from 2000 to 2020, the landscape elements of Banan District were evenly distributed and the landscape types were rich and diversified, but, owing to human disturbance, the shapes of the landscape elements tended to be irregular, the dominant patches were scattered, and patch connectivity was poor. It is recommended that, in future regional ecological construction, the layout be rationally optimized, the relationships among landscape components be coordinated, the connectivity between landscape patches be strengthened, and the degree of landscape fragmentation be reduced.

Keywords: land use transfer, landscape pattern evolution, GIS and Fragstats, Banan district

Procedia PDF Downloads 72
15831 Increasing System Adequacy Using Integration of Pumped Storage: Renewable Energy to Reduce Thermal Power Generation Towards RE100 Target, Thailand

Authors: Mathuravech Thanaphon, Thephasit Nat

Abstract:

The Electricity Generating Authority of Thailand (EGAT) is focusing on expanding its pumped storage hydropower (PSH) capacity to increase the reliability of the system during peak demand and to allow for greater integration of renewables. To achieve this, Thailand will have to double its current renewable electricity production. To address the challenges of balancing supply and demand in the grid with increasing levels of RE penetration as well as rising peak demand, EGAT has been studying the potential for additional PSH capacity for several years, both to enable an increased share of RE and to replace existing fossil-fuel-fired generation, and to assess the role that pumped-storage hydropower would play in fulfilling multiple grid functions and integrating renewables. The proposed sites for new PSH would help increase the reliability of power generation in Thailand. However, most of the electricity generation will come from RE, chiefly wind and photovoltaics, and significant additional energy storage capacity will be needed. In this paper, the impact of integrating the PSH system on the adequacy of renewable-rich power generating systems, with the aim of reducing thermal power generating units, is investigated. The variations of the system adequacy indices are analyzed for different PSH-renewables capacities and storage levels. The key challenge is the Power Development Plan 2018 rev.1 (PDP2018 rev.1), modified by integrating six new PSH systems and the RE planning and development expected after 2030. The system adequacy indices for power generation are obtained using Multi-Objective Genetic Algorithm (MOGA) optimization. MOGA is a probabilistic, heuristic, and stochastic algorithm able to find global minima, with the advantage that the fitness function does not require a gradient. In this sense, the method is more flexible for solving reliability optimization problems for a composite power system. The optimization with an hourly time step covers a planning horizon of years, much larger than the weekly horizon usually used in scheduling studies. The objective function is optimized in MATLAB to maximize RE generation, minimize energy imbalances, and minimize thermal power generation. PDP2018 rev.1 was simulated based on its planned capacity stepping into 2030 and 2050. Four main scenario analyses are therefore conducted according to the target share of renewables: 1) Business-As-Usual (BAU), 2) National Targets (30% RE in 2030), 3) Carbon Neutrality Targets (50% RE in 2050), and 4) 100% RE or full decarbonization. According to the results, the generating system adequacy is significantly affected by both PSH-RE and thermal units. When PSH is integrated, it can provide hourly capacity to the power system as well as better allocate renewable energy generation to reduce thermal generation and improve system reliability. These results show that a significant level of reliability improvement can be obtained by PSH, especially in renewable-rich power systems.

Keywords: pumped storage hydropower, renewable energy integration, system adequacy, power development planning, RE100, multi-objective genetic algorithm

Procedia PDF Downloads 57
15830 Determination of Biological Efficiency Values of Some Pesticide Application Methods under Second Crop Maize Conditions

Authors: Ali Bolat, Ali Bayat, Mustafa Gullu

Abstract:

Maize can be cultivated under both main crop and second crop conditions in Turkey. The main pests of maize under second crop conditions are Sesamia nonagrioides Lefebvre (Lepidoptera: Noctuidae) and Ostrinia nubilalis Hübner (Lepidoptera: Crambidae). Aerial spraying to control these two main maize pests could be carried out in Turkey until 2006, when it was banned due to environmental concerns such as drift of the sprayed pesticides and low biological efficiency. In this context, ground sprayers that can treat tall maize plants (> 175 cm) have begun to be used. Some methods for increasing the success of ground spraying were tested in field experiments conducted in second crop maize in 2008 and 2009. For this purpose, six spraying methods (air-assisted spraying with TX cone jet nozzles, domestic cone nozzles, twinjet nozzles, air induction nozzles, standard domestic cone nozzles, and tail booms) were used at two application rates (150 and 300 l/ha) with a sprayer. In the study, the biological efficacy of each method was evaluated in each plot. Biological efficacy evaluations included counts of the number of insect-damaged plants, the number of holes in stems, and the live larvae and pupae in the stems of selected plants. As a result, the highest biological efficacy value (close to 70%) was obtained with the air-assisted spraying method at an application volume of 300 l/ha.

Keywords: air assisted sprayer, drift nozzles, biological efficiency, maize plant

Procedia PDF Downloads 213
15829 Improvement of the Geometry of the Dental Bridge Framework through an Automatic Program

Authors: Rong-Yang Lai, Jia-Yu Wu, Chih-Han Chang, Yung-Chung Chen

Abstract:

The dental bridge is one of the clinical methods of treatment for missing teeth. The dental bridge is generally designed with two layers: an inner framework layer (zirconia) and an outer layer of porcelain fused to the framework. The design of a conventional bridge is generally based on the antagonist tooth profile, so that the framework is evenly indented by an equal thickness from the outer contour. All-ceramic dental bridges made of zirconia have demonstrated remarkable potential to withstand higher physiological occlusal loads in the posterior region, but it was found that there is still a risk of all-ceramic bridge failure within five years. Thus, how to reduce the incidence of failure is still a problem to be solved. Therefore, the objective of this study is to develop mechanical designs for all-ceramic dental bridge frameworks that reduce stress and enhance fracture resistance under given loading conditions, using the finite element method. In this study, dental design software is used to design the dental bridge based on tooth CT images. After building the model, a bi-directional evolutionary structural optimization (BESO) algorithm implemented in finite element software was employed to analyze the results and determine the distribution of the materials in the dental bridge; BESO searches for the optimum distribution of two different materials, namely porcelain and zirconia. Based on the previously calculated stress value of each element, when an element's stress value is higher than the threshold value, the element is replaced by the framework material; when the difference in the maximum peak stress between iterations is less than 0.1%, the calculation is complete. After completing the design of the dental bridge, the stress distribution of the whole structure is changed. BESO reduces the peak principal stress in the outer-layer porcelain by 10% and avoids tensile stress failure.
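
The following is a minimal sketch of the BESO-style material-assignment loop described above; the stress "analysis" here is a dummy random function standing in for the finite element solve, so the threshold and numbers are purely illustrative.

```python
# Minimal sketch of a BESO-style element material-swap loop with a stubbed FE solve.
import numpy as np

def fake_fe_stress(is_zirconia, rng):
    """Stand-in for the finite element solve: stiffer (zirconia) elements carry less stress here."""
    base = rng.random(is_zirconia.size) * 100.0
    return np.where(is_zirconia, 0.6 * base, base)

rng = np.random.default_rng(1)
n_elem, threshold = 500, 70.0                      # stress threshold (assumed, arbitrary units)
is_zirconia = np.zeros(n_elem, dtype=bool)         # start with an all-porcelain outer layer
prev_peak = None

for iteration in range(100):
    stress = fake_fe_stress(is_zirconia, rng)
    is_zirconia |= stress > threshold              # over-stressed elements switch to framework material
    peak = stress.max()
    if prev_peak is not None and abs(prev_peak - peak) / prev_peak < 0.001:
        break                                      # peak stress changed by less than 0.1%: converged
    prev_peak = peak

print(f"iterations: {iteration + 1}, zirconia fraction: {is_zirconia.mean():.2f}, peak stress: {peak:.1f}")
```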

Keywords: dental bridge, finite element analysis, framework, automatic program

Procedia PDF Downloads 282
15828 Comparison between XGBoost, LightGBM and CatBoost Using a Home Credit Dataset

Authors: Essam Al Daoud

Abstract:

Gradient boosting methods have proven to be a very important strategy, and many successful machine learning solutions have been developed using XGBoost and its derivatives. The aim of this study is to investigate and compare the efficiency of three gradient boosting methods. The Home Credit dataset, which contains 219 features and 356,251 records, is used in this work. New features are generated, and several techniques are used to rank and select the best features. The implementation indicates that LightGBM is faster and more accurate than CatBoost and XGBoost across varying numbers of features and records.
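
The following is a minimal sketch of such a three-way comparison on a synthetic stand-in dataset; it assumes the xgboost, lightgbm, catboost, and scikit-learn packages are installed and does not reproduce the Home Credit features or the paper's feature-selection steps.

```python
# Minimal sketch comparing XGBoost, LightGBM, and CatBoost on synthetic data.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

X, y = make_classification(n_samples=20000, n_features=50, n_informative=15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "XGBoost": XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1),
    "LightGBM": LGBMClassifier(n_estimators=300, max_depth=6, learning_rate=0.1),
    "CatBoost": CatBoostClassifier(iterations=300, depth=6, learning_rate=0.1, verbose=0),
}

for name, model in models.items():
    start = time.time()
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.4f}, training time = {time.time() - start:.1f}s")
```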

Keywords: gradient boosting, XGBoost, LightGBM, CatBoost, home credit

Procedia PDF Downloads 171
15827 Software Development for AASHTO and Ethiopian Roads Authority Flexible Pavement Design Methods

Authors: Amare Setegn Enyew, Bikila Teklu Wodajo

Abstract:

The primary aim of flexible pavement design is to ensure the development of economical and safe road infrastructure. However, failures can still occur due to improper or erroneous structural design. In Ethiopia, the design of flexible pavements relies on manual calculations and on selecting a pavement structure from a catalogue. The catalogue offers, in eight different charts, alternative structures for combinations of traffic and subgrade classes, as outlined in the Ethiopian Roads Authority (ERA) Pavement Design Manual 2001. Furthermore, design modification is allowed in accordance with the structural number principles outlined in the AASHTO 1993 Guide for Design of Pavement Structures. Nevertheless, the manual calculation and design process involves the use of nomographs, charts, tables, and formulas, which increases the likelihood of human errors and inaccuracies, and this may lead to unsafe or uneconomical road construction. To address this challenge, a software tool called AASHERA has been developed for the AASHTO 1993 and ERA design methods, using the MATLAB language. The software accurately determines the required thicknesses of the flexible pavement surface, base, and subbase layers for the two methods. It also digitizes design inputs and references such as nomographs, charts, default values, and tables. Moreover, the software allows easier comparison of the two design methods in terms of results and construction cost. AASHERA's accuracy has been confirmed through comparisons with designs from handbooks and manuals. The software can help reduce human errors, inaccuracies, and time consumption compared to the conventional manual design methods employed in Ethiopia. AASHERA, with its validated accuracy, proves to be an indispensable tool for flexible pavement structure designers.
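
To make the structural number principle concrete, the following is a minimal sketch of the AASHTO 1993 layered-design check that such a tool automates: the structural number supplied by a trial section, SN = a1*D1 + a2*m2*D2 + a3*m3*D3, is compared with the required SN. The layer coefficients, drainage factors, thicknesses, and required SN below are typical textbook values chosen for illustration only.

```python
# Minimal sketch of the AASHTO 1993 structural number (SN) adequacy check.
def structural_number(layers):
    """layers: list of (layer coefficient a_i, drainage factor m_i, thickness D_i in inches)."""
    return sum(a * m * D for a, m, D in layers)

trial_section = [
    (0.44, 1.0, 4.0),    # asphalt concrete surface
    (0.14, 1.0, 8.0),    # granular base
    (0.11, 1.0, 12.0),   # granular subbase
]
sn_provided = structural_number(trial_section)
sn_required = 4.2        # would come from the AASHTO design equation or nomograph
print(f"SN provided = {sn_provided:.2f}, SN required = {sn_required:.2f}")
print("adequate" if sn_provided >= sn_required else "increase layer thicknesses")
```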

Keywords: flexible pavement design, AASHTO 1993, ERA, MATLAB, AASHERA

Procedia PDF Downloads 63
15826 Virtual Customer Integration in Innovation Development: A Systematic Literature Review

Authors: Chau Nguyen Pham Minh

Abstract:

The aim of this study is to answer the following research question: what do we know about virtual customer integration in innovation development based on existing empirical research? The paper is based on a systematic review of 136 articles published in the past 16 years. The analysis focuses on three areas: which forms of virtual customer integration (e.g., netnography, online co-creation, virtual experience) have been applied in innovation development; how virtual customer integration methods have been effectively utilized by firms; and what the influences of virtual customer integration on innovation development activities are. Through this detailed analysis, the study provides researchers with a broad understanding of virtual customer integration in innovation development. The study shows that practitioners and researchers increasingly pay attention to using virtual customer integration methods in developing innovations, since those methods have clear advantages for interacting with customers in order to generate the best ideas for innovation development. Additionally, the findings indicate that netnography has been the most common method for integrating customers in idea generation, while virtual product experience has mainly been used in product testing. Moreover, the analysis also reveals the positive and negative influences of virtual customer integration on innovation development from both process and strategic perspectives. Most of the reviewed studies examined the phenomenon from the company's perspective to understand the process of applying virtual customer integration methods and their impacts; however, the customers' perspective on participating in the virtual interaction has been inadequately studied, which creates many potentially interesting research paths for future studies.

Keywords: innovation, virtual customer integration, co-creation, netnography, new product development

Procedia PDF Downloads 336
15825 Interior Design: Changing Values

Authors: Kika Ioannou Kazamia

Abstract:

This paper examines the action research cycle of the second phase of longitudinal research on sustainable interior design practices between two groups of stakeholders, designers and clients. During this phase of the action research, the second step - the change stage - of Lewin's change management model was utilized to change values, approaches, and attitudes toward sustainable design practices among the participants. Affective domain learning theory is utilized to attach new values. Learning with the use of information technology, collaborative learning, and problem-based learning are the learning methods implemented to achieve the objectives. The learning methods and aims require the design of interventions that involve participants in activities leading to acknowledgment of the benefits of sustainable practices. Interventions are steered to measure participants' judgments of the worth and relevance of ideas and experiences, and their acceptance of or commitment to a particular stance or action. The data collection methods used in this action research are observers' reports, participant questionnaires, and interviews. The data analyses use both quantitative and qualitative methods. The main benefit of the quantitative method was that it provided the means to separate many factors that obscured the main qualitative findings. The qualitative method allowed the data to be categorized following a deductive approach and then examined for commonalities that could reflect relevant categories or themes. The results indicate that during the second phase, the designer and client participants altered their behaviours.

Keywords: design, change, sustainability, learning, practices

Procedia PDF Downloads 77
15824 Optimization of Acid Treatments by Assessing Diversion Strategies in Carbonate and Sandstone Formations

Authors: Ragi Poyyara, Vijaya Patnana, Mohammed Alam

Abstract:

When acid is pumped into damaged reservoirs for damage removal/stimulation, distorted inflow of acid into the formation occurs because the acid preferentially travels into highly permeable regions rather than low-permeability regions, or (in general) into the path of least resistance. This can lead to poor zonal coverage and hence warrants diversion to carry out an effective placement of acid. Diversion is, desirably, a reversible technique of temporarily reducing the permeability of high-permeability zones, thereby forcing the acid into lower-permeability zones. The uniqueness of each reservoir can pose several challenges to engineers attempting to devise optimum and effective diversion strategies. Diversion techniques include mechanical placement and/or chemical diversion of treatment fluids, further sub-classified into ball sealers, bridge plugs, packers, particulate diverters, viscous gels, crosslinked gels, relative permeability modifiers (RPMs), foams, and/or the use of placement techniques, such as coiled tubing (CT) and the maximum pressure difference and injection rate (MAPDIR) methodology. It is not always realized that the effectiveness of diverters greatly depends on reservoir properties, such as formation type, temperature, reservoir permeability, heterogeneity, and physical well characteristics (e.g., completion type, well deviation, length of treatment interval, multiple intervals, etc.). This paper reviews the mechanisms by which each variety of diverter functions and discusses the effect of various reservoir properties on the efficiency of diversion techniques. Guidelines are recommended to help enhance productivity from zones of interest by choosing the best methods of diversion while pumping an optimized amount of treatment fluid. The success of an overall acid treatment often depends on the effectiveness of the diverting agents.

Keywords: diversion, reservoir, zonal coverage, carbonate, sandstone

Procedia PDF Downloads 432
15823 Implementation of ADETRAN Language Using Message Passing Interface

Authors: Akiyoshi Wakatani

Abstract:

This paper describes the Message Passing Interface (MPI) implementation of the ADETRAN language and its evaluation on SX-ACE supercomputers. The ADETRAN language includes the pdo statement, which specifies data distribution and parallel computations, and the pass statement, which specifies the redistribution of arrays. Two methods for implementing the pass statement are discussed, and a performance evaluation using the Splitting-Up CG method is presented. The effectiveness of the parallelization is evaluated, and the advantage of one-dimensional distribution is empirically confirmed by the experimental results.
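
The following is a minimal sketch of the kind of array redistribution that a pass statement expresses, written with mpi4py rather than ADETRAN's compiler-generated MPI code; it moves a matrix from a row-block layout to a column-block layout, and the matrix contents are dummy values for illustration.

```python
# Minimal sketch of an MPI array redistribution (row blocks -> column blocks).
# Run with, e.g.: mpiexec -n 4 python redistribute.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 8 * size                       # global matrix is n x n, divisible by the process count
rows = n // size                   # rows owned by this rank in the row-block layout
local_rows = np.arange(rank * rows, (rank + 1) * rows)
A_rowblock = np.outer(local_rows, np.ones(n))   # dummy local data: A[i, j] = global row index i

# Split the local row block column-wise, one piece per destination rank.
pieces = [np.ascontiguousarray(A_rowblock[:, d * rows:(d + 1) * rows])
          for d in range(size)]

# All-to-all exchange: piece d goes to rank d, and size pieces come back.
received = comm.alltoall(pieces)

# Stack the received row pieces to form the local column block (n x rows).
A_colblock = np.vstack(received)
assert A_colblock.shape == (n, rows)
```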

Keywords: iterative methods, array redistribution, translator, distributed memory

Procedia PDF Downloads 269
15822 Comparative Analysis of Reinforcement Learning Algorithms for Autonomous Driving

Authors: Migena Mana, Ahmed Khalid Syed, Abdul Malik, Nikhil Cherian

Abstract:

In recent years, advancements in deep learning have enabled researchers to tackle the problem of self-driving cars. Car companies use huge datasets to train their deep learning models to make autonomous cars a reality. However, this approach has certain drawbacks in that the state space of possible actions for a car is so huge that there cannot be a dataset for every possible road scenario. To overcome this problem, the concept of reinforcement learning (RL) is investigated in this research. Since the problem of autonomous driving can be modeled in a simulation, it lends itself naturally to the domain of reinforcement learning. The advantage of this approach is that we can model different and complex road scenarios in a simulation without having to deploy in the real world. The autonomous agent can learn to drive by finding the optimal policy, and this learned model can then be easily deployed in a real-world setting. In this project, we focus on three RL algorithms: Q-learning, Deep Deterministic Policy Gradient (DDPG), and Proximal Policy Optimization (PPO). To model the environment, we have used TORCS (The Open Racing Car Simulator), which provides us with a strong foundation to test our models. The inputs to the algorithms are the sensor data provided by the simulator, such as velocity, distance from the side pavement, etc. The outcome of this research project is a comparative analysis of these algorithms. Based on the comparison, the PPO algorithm gives the best results: when using PPO, the reward is greater, and the acceleration, steering angle, and braking are more stable compared to the other algorithms, which means that the agent learns to drive in a better and more efficient way in this case. Additionally, we have compiled a dataset taken from training the agent with the DDPG and PPO algorithms. It contains all the steps of the agent during one full training run in the form (all input values, acceleration, steering angle, brake, loss, reward). This study can serve as a base for further complex road scenarios. Furthermore, it can be extended to the field of computer vision, using the images to find the best policy.
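
To ground the simplest of the three algorithms, the following is a minimal sketch of the tabular Q-learning update on a toy discretized driving task; the TORCS sensor inputs are continuous in the paper, and DDPG and PPO use neural policies, none of which is reproduced here.

```python
# Minimal sketch of tabular Q-learning on a toy lane-centering task.
import random

n_states, n_actions = 20, 3              # discretized lane offset x {steer left, straight, steer right}
alpha, gamma, epsilon = 0.1, 0.95, 0.1
Q = [[0.0] * n_actions for _ in range(n_states)]

def toy_env_step(state, action):
    """Stub environment: drifting toward the lane center earns higher reward."""
    next_state = max(0, min(n_states - 1, state + action - 1))
    reward = -abs(next_state - n_states // 2)
    return next_state, reward

state = random.randrange(n_states)
for step in range(5000):
    action = (random.randrange(n_actions) if random.random() < epsilon
              else max(range(n_actions), key=lambda a: Q[state][a]))
    next_state, reward = toy_env_step(state, action)
    # Q-learning update: Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
    Q[state][action] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][action])
    state = next_state

print("greedy action in the center state:", max(range(n_actions), key=lambda a: Q[n_states // 2][a]))
```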

Keywords: autonomous driving, DDPG (deep deterministic policy gradient), PPO (proximal policy optimization), reinforcement learning

Procedia PDF Downloads 148
15821 The Application of Pareto Local Search to the Single-Objective Quadratic Assignment Problem

Authors: Abdullah Alsheddy

Abstract:

This paper presents the employment of Pareto optimality as a strategy to help (single-objective) local search escape local optima. Instead of plain local search, Pareto local search is applied to solve the quadratic assignment problem, which is multi-objectivized by adding a helper objective. The additional objective is defined as a function of the primary one, with augmented penalties that are dynamically updated.
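
The following is a minimal sketch of Pareto local search on a random QAP instance with a helper objective; the helper shown here is a simple static penalty term chosen for illustration, not the paper's dynamically updated augmented penalties.

```python
# Minimal sketch of Pareto local search on a multi-objectivized QAP.
import random

def qap_cost(perm, flow, dist):
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]] for i in range(n) for j in range(n))

def helper_objective(perm, penalties):
    # Illustrative helper: sum of penalties on (facility, location) assignments.
    return sum(penalties[i][perm[i]] for i in range(len(perm)))

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_local_search(flow, dist, penalties, iters=2000):
    n = len(flow)
    start = list(range(n))
    random.shuffle(start)
    archive = [(tuple(start), (qap_cost(start, flow, dist),
                               helper_objective(start, penalties)))]
    for _ in range(iters):
        perm, _ = random.choice(archive)            # pick an archive member to explore
        i, j = random.sample(range(n), 2)
        cand = list(perm)
        cand[i], cand[j] = cand[j], cand[i]         # swap-neighborhood move
        objs = (qap_cost(cand, flow, dist), helper_objective(cand, penalties))
        if any(dominates(a_objs, objs) for _, a_objs in archive):
            continue                                # dominated by the archive: discard
        archive = [(p, o) for p, o in archive if not dominates(objs, o)]
        archive.append((tuple(cand), objs))
    return min(archive, key=lambda e: e[1][0])      # best solution on the primary objective

n = 8
flow = [[random.randint(0, 9) for _ in range(n)] for _ in range(n)]
dist = [[random.randint(0, 9) for _ in range(n)] for _ in range(n)]
penalties = [[random.random() for _ in range(n)] for _ in range(n)]
best_perm, (primary, helper) = pareto_local_search(flow, dist, penalties)
print(best_perm, primary)
```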

Keywords: Pareto optimization, multi-objectivization, quadratic assignment problem, local search

Procedia PDF Downloads 466
15820 Study of Individual Parameters on the Enzymatic Glycosidation of Betulinic Acid by Novozyme-435

Authors: A. U. Adamu, Hamisu Abdu, A. A. Saidu

Abstract:

The enzymatic synthesis of 3-O-β-D-glucopyranoside-betulinic acid using Novozyme-435 as a catalyst was studied. The effects of various parameters, such as the substrate molar ratio, reaction temperature, reaction time, enzyme reuse, and amount of enzyme, were investigated. The optimum reaction conditions for the enzymatic glycosidation of betulinic acid in an organic solvent using Novozyme-435 were found to be a 1:1.2 substrate molar ratio, 55°C, 24 h, and 180 mg of enzyme, with a conversion of 88.69%.

Keywords: betulinic acid, glycosidation, novozyme-435, optimization

Procedia PDF Downloads 426