Search results for: multi item inventory problem
11156 Robust Single/Multi bit Memristor Based Memory
Authors: Ahmed Emara, Maged Ghoneima, Mohamed Dessouky
Abstract:
Demand for low-power, fast memories is increasing with the growing complexity of ICs. In this paper, we introduce a proposal for a compact SRAM based on memristor devices. The compact size of the proposed cell (1T2M, compared to 6T in traditional SRAMs) allows denser memories in the same area. We discuss the proposed memristor memory cell in single- and multi-bit data storage configurations, along with the writing and reading operations. Stored-data stability across successive read operations is illustrated, and operational simulation results are provided, together with a comparison of the proposed design against conventional SRAM and previously proposed memristor cells.
Keywords: memristor, multi-bit, single-bit, circuits, systems
Procedia PDF Downloads 374
11155 An Alternative Framework of Multi-Resolution Nested Weighted Essentially Non-Oscillatory Schemes for Solving Euler Equations with Adaptive Order
Authors: Zhenming Wang, Jun Zhu, Yuchen Yang, Ning Zhao
Abstract:
In the present paper, an alternative framework is proposed to construct a class of finite difference multi-resolution nested weighted essentially non-oscillatory (WENO) schemes with an increasingly higher order of accuracy for solving inviscid Euler equations. These WENO schemes first obtain a set of reconstruction polynomials from a hierarchy of nested central spatial stencils and then recursively achieve a higher-order approximation through the lower-order WENO schemes. The linear weights of such WENO schemes can be set to any positive numbers whose sum equals one; they do not pollute the optimal order of accuracy in smooth regions and simultaneously suppress spurious oscillations near discontinuities. The numerical results obtained indicate that these alternative finite-difference multi-resolution nested WENO schemes with different accuracies are very robust, have low dissipation, and use as few reconstruction stencils as possible while maintaining the same efficiency, achieving the high-resolution property without any equivalent multi-resolution representation. In addition, their finite-volume form is easier to implement on unstructured grids.
Keywords: finite-difference, WENO schemes, high order, inviscid Euler equations, multi-resolution
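As an illustration of the weighting idea referred to above (the classical WENO normalization, not the authors' specific multi-resolution nested construction), the nonlinear weights ω_k are typically obtained from the linear weights d_k > 0 (with Σ_k d_k = 1) and the smoothness indicators β_k as α_k = d_k/(β_k + ε)^2 and ω_k = α_k/Σ_j α_j, so that the final reconstruction is the convex combination Σ_k ω_k u^(k). In smooth regions ω_k ≈ d_k, which preserves the optimal order of accuracy, while near a discontinuity the weight of any stencil crossing it is driven towards zero, which suppresses spurious oscillations.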
Procedia PDF Downloads 146
11154 Multi-Criteria Evaluation of Integrated Renewable Energy Systems for Community-Scale Applications
Authors: Kuanrong Qiu, Sebnem Madrali, Evgueniy Entchev
Abstract:
To achieve satisfactory objectives in deploying integrated renewable energy systems, it is crucial to consider all the related parameters affecting design and decision-making. The multi-criteria evaluation method is a reliable and efficient tool for arriving at the most appropriate solution. The approach considers the influential factors and their relative importance in prioritizing the alternatives. In this paper, a multi-criteria decision framework, based on technical, economic, environmental, and reliability criteria, is developed to evaluate and prioritize renewable energy technologies and configurations of their integrated systems for community applications, identify their viability, and thus support the adoption of clean energy technologies and decision-making regarding energy transitions and transition patterns. Case studies for communities in Canada show that resource availability and the configurations of the integrated systems significantly impact the economic and environmental performance.
Keywords: multi-criteria, renewables, integrated energy systems, decision-making, model
Procedia PDF Downloads 94
11153 Metaheuristics to Solve Tasks Scheduling
Authors: Rachid Ziteuni, Selt Omar
Abstract:
In this paper, we propose a new polynomial metaheuristic (tabu search) for solving scheduling problems. This method allows us to solve the problem of scheduling n tasks on m identical parallel machines with unavailability periods. This problem is NP-complete in the strong sense, and finding an optimal solution appears unlikely. Note that all data in this problem are integer and deterministic. The performance criterion to optimize, which we denote Pm/N-c/Σ(wjCj), is the weighted sum of the task completion dates.
Keywords: scheduling, parallel identical machines, unavailability periods, metaheuristic, tabu search
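As a small illustration of the objective defined above, the following sketch evaluates the weighted sum of completion dates Σ wjCj for a candidate schedule on identical parallel machines, skipping machine unavailability periods. The unavailability handling and the instance data are simplifying assumptions for illustration, not the authors' exact model or the tabu search itself.

# Illustrative evaluation of the P_m / unavailability / sum(w_j * C_j) objective for a
# candidate assignment of tasks to machines (a sketch, not the tabu search itself).
def weighted_completion(schedule, durations, weights, unavailability):
    # schedule[m] = ordered list of task indices assigned to machine m
    # unavailability[m] = list of (start, end) periods when machine m cannot work
    total = 0.0
    for m, tasks in enumerate(schedule):
        t = 0.0
        for j in tasks:
            # push the start past any unavailability period the task would overlap
            for start, end in sorted(unavailability.get(m, [])):
                if t < end and t + durations[j] > start:
                    t = end
            t += durations[j]          # C_j: completion date of task j
            total += weights[j] * t    # accumulate w_j * C_j
    return total

# Hypothetical instance: 4 tasks on 2 machines, machine 0 unavailable in [3, 5)
print(weighted_completion(schedule=[[0, 2], [1, 3]],
                          durations=[2, 4, 3, 1],
                          weights=[1, 2, 1, 3],
                          unavailability={0: [(3, 5)]}))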
Procedia PDF Downloads 331
11152 The Effect of Reflective Thinking on Iranian EFL Learners’ Language Learning Strategy Use, L2 Proficiency, and Beliefs about Second Language Learning and Teaching
Authors: Mohammad Hadi Mahmoodi, Mojtaba Farahani
Abstract:
The present study aimed at investigating whether reflective thinking differentiates Iranian EFL learners regarding language learning strategy use, beliefs about language learning and teaching, and L2 proficiency. To this end, the researcher adopted a mixed-methods approach. First, 94 EFL learners were asked to complete the Reflective Thinking Questionnaire (Kember et al., 2000), the Beliefs about Language Learning and Teaching Inventory (Horwitz, 1985), the Strategy Inventory for Language Learning (Oxford, 1990), and the Oxford Quick Placement Test. The results of three separate one-way ANOVAs indicated that reflective thinking significantly differentiates Iranian EFL learners concerning: (a) language learning strategy use, (b) beliefs about language learning and teaching, and (c) general language proficiency. Furthermore, to see where the differences lay, three separate post-hoc Tukey tests were run, the results of which showed that learners with different levels of reflectivity (high, mid, and low) were significantly different from each other in all three dependent variables. Finally, to increase the validity of the findings, thirty of the participants were interviewed, and the results were analyzed through the template organizing style method (Crabtree & Miller, 1999). The results of the interview analysis supported the results of the quantitative data analysis.
Keywords: reflective thinking, language learning strategy use, beliefs toward language learning and teaching
Procedia PDF Downloads 656
11151 Developing and Testing a Questionnaire of Music Memorization and Practice
Authors: Diana Santiago, Tania Lisboa, Sophie Lee, Alexander P. Demos, Monica C. S. Vasconcelos
Abstract:
Memorization has long been recognized as an arduous and anxiety-evoking task for musicians, and yet, it is an essential aspect of performance. Research shows that musicians are often not taught how to memorize. While the memorization and practice strategies of professionals have been studied, little research has been done to examine how student musicians learn to practice and memorize music in different cultural settings. We present the process of developing and testing a questionnaire of music memorization and musical practice for student musicians in the UK and Brazil. A survey was developed for a cross-cultural research project aiming at examining how young orchestral musicians (aged 7–18 years) in different learning environments and cultures engage in instrumental practice and memorization. The questionnaire development included members of a UK/US/Brazil research team of music educators and performance science researchers. A pool of items was developed for each aspect of practice and memorization identified, based on the literature and personal experience, and adapted from existing questionnaires. Item development took the varying levels of cognitive and social development of the target populations into consideration. It also considered the diverse target learning environments. Items were initially grouped in accordance with a single underlying construct/behavior. The questionnaire comprised three sections: a demographics section, a section on practice (containing 29 items), and a section on memorization (containing 40 items). Next, the response process was considered, and a 5-point Likert scale ranging from ‘always’ to ‘never’, with a verbal label and an image assigned to each response option, was selected, following effective questionnaire design for children and youths. Finally, a pilot study was conducted with young orchestral musicians from diverse learning environments in Brazil and the United Kingdom. Data collection took place in either one-to-one or group settings to accommodate the participants. Cognitive interviews were utilized to establish response process validity by confirming the readability and accurate comprehension of the questionnaire items or highlighting the need for item revision. Internal reliability was investigated by measuring the consistency of the item groups using Cronbach's alpha. The pilot study successfully relied on the questionnaire to generate data about the engagement of young musicians of different levels and instruments, across different learning and cultural environments, in instrumental practice and memorization. Interaction analysis of the cognitive interviews undertaken with these participants, however, exposed the fact that certain items, and the response scale, could be interpreted in multiple ways. The questionnaire text was, therefore, revised accordingly. The low Cronbach's alpha scores of many item groups indicated another issue with the original questionnaire: its low level of internal reliability. Several reasons for this poor reliability can be suggested, including the issues with item interpretation revealed through interaction analysis of the cognitive interviews, the small number of participants (34), and the elusive nature of the construct in question. The revised questionnaire measures 78 specific behaviors or opinions. It can be seen to provide an efficient means of gathering information about the engagement of young musicians in practice and memorization on a large scale.
Keywords: cross-cultural, memorization, practice, questionnaire, young musicians
Procedia PDF Downloads 123
11150 Fear of Childbirth According to Parity
Authors: Ozlem Karabulutlu, Kiymet Yesilcicek Calik, Nazli Akar
Abstract:
Objectives: To examine fear of childbirth according to parity, gestational age, prenatal education, and obstetric history. Methods: The study was performed as a questionnaire design in a State Hospital in Kars, Turkey, with 403 unselected pregnant women recruited from the delivery unit. The data were collected via three questionnaires: the first covering sociodemographic and obstetric features, the second the Wijma Delivery Expectancy/Experience Questionnaire (W-DEQ), and the third the Beck Anxiety Inventory (BAI). Results: The W-DEQ and BAI scores were higher in nulliparous than in multiparous women (W-DEQ 67.08±28.33 vs. 59.87±26.91, P=0.039<0.05; BAI 18.97±9.5 vs. 16.65±11.83, P=0.0009<0.05, respectively). Moreover, the W-DEQ and BAI scores of pregnant women whose gestational week was ≤37 or ≥41, who did not receive training, and who had vaginal delivery were higher than those of women whose gestational week was 38-40, who received prenatal training, and who had cesarean delivery (W-DEQ 67.54±29.20, 56.44±22.59, 69.72±25.53, p<0.05; BAI 21.41±9.07, 15.77±11.20, 18.36±10.57, p<0.05, respectively). In both nulliparous and multiparous women, as the W-DEQ score increases, the BAI score increases too (r=0.256; p=0.000<0.05). Conclusions: Severe fear of childbirth and anxiety were more common in nulliparous women, in preterm and post-term pregnancy, and in women who did not receive prenatal training and had vaginal delivery.
Keywords: Beck Anxiety Inventory (BAI), fear of birth, parity, pregnant women, Wijma Delivery Expectancy/Experience Questionnaire (W-DEQ)
Procedia PDF Downloads 291
11149 Multi-Temporal Urban Land Cover Mapping Using Spectral Indices
Authors: Mst Ilme Faridatul, Bo Wu
Abstract:
Multi-temporal urban land cover mapping is of paramount importance for monitoring urban sprawl and managing the ecological environment. Given the diversity of urban activities, it is challenging to map land covers in a complex urban environment. Spectral indices have proved to be effective for mapping urban land covers. To improve multi-temporal urban land cover classification and mapping, we evaluate the performance of three spectral indices, namely the modified normalized difference bare-land index (MNDBI), the tasseled cap water and vegetation index (TCWVI), and the shadow index (ShDI). The MNDBI is developed and evaluated for its ability to enhance urban impervious areas by separating bare land. A tasseled cap index, the TCWVI, is developed and evaluated for its ability to detect vegetation and water simultaneously. The ShDI is developed to maximize the spectral difference between the shadows of skyscrapers and water and thus enhance water detection. First, this paper presents a comparative analysis of the three spectral indices using Landsat Enhanced Thematic Mapper (ETM), Thematic Mapper (TM), and Operational Land Imager (OLI) data. Second, optimized thresholds of the spectral indices are applied to classify land covers, and finally, their performance in enhancing multi-temporal urban land cover mapping is assessed. The results indicate that the spectral indices are capable of enhancing multi-temporal urban land cover mapping and achieve an overall classification accuracy of 93-96%.
Keywords: land cover, mapping, multi-temporal, spectral indices
Procedia PDF Downloads 153
11148 An Experiential Learning of Ontology-Based Multi-document Summarization by Removal Summarization Techniques
Authors: Pranjali Avinash Yadav-Deshmukh
Abstract:
The remarkable development of the Internet, along with new technological innovations such as high-speed systems and affordable large storage space, has led to a tremendous increase in the amount of and accessibility to digital records. For any person, studying all of these data is tremendously time-intensive, so there is a great need for effective multi-document summarization (MDS) systems, which can successfully reduce the information found in several records into a short, understandable summary or conclusion. As a theoretical design, our system provides a significant structure for the semantic representation of textual information in the ontology area. The suitability of the ontology for addressing multi-document summarization problems in the disaster management sector underpins the proposed design. A saliency ranking is assigned to each sentence, sentences are rated according to this ranking, and the top-rated sentences are chosen as the summary. With regard to summary quality, extensive tests on a selection of media announcements concerning the 2014 Jammu and Kashmir floods are reported. Ontology-centered multi-document summarization methods using NLP-based extraction outperform the other baselines. Our contribution in the recommended component is to apply NLP-based information extraction methods to enhance the results.
Keywords: disaster management, extraction technique, k-means, multi-document summarization, NLP, ontology, sentence extraction
Procedia PDF Downloads 388
11147 NFResNet: Multi-Scale and U-Shaped Networks for Deblurring
Authors: Tanish Mittal, Preyansh Agrawal, Esha Pahwa, Aarya Makwana
Abstract:
Multi-scale and U-shaped networks are widely used in various image restoration problems, including deblurring. Keeping in mind the wide range of applications, we present a comparison of these architectures and their effects on image deblurring. We also introduce a new block called NFResblock. It consists of a Fast Fourier Transformation layer and a series of modified Non-linear Activation Free blocks. Based on these architectures and additions, we introduce NFResnet and NFResnet+, which are modified multi-scale and U-Net architectures, respectively. We also use three different loss functions to train these architectures: Charbonnier loss, edge loss, and frequency reconstruction loss. Extensive experiments on the Deep Video Deblurring dataset, along with ablation studies for each component, are presented in this paper. The proposed architectures achieve a considerable increase in Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM) values.
Keywords: multi-scale, Unet, deblurring, FFT, resblock, NAF-block, nfresnet, charbonnier, edge, frequency reconstruction
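As a rough illustration of the kind of block described above, the following PyTorch-style sketch combines an FFT branch with activation-free, gated convolutions. The layer sizes, the SimpleGate-style gating, and the way the frequency branch is wired are assumptions for illustration only, since the abstract does not specify the exact NFResblock design.

# A minimal, illustrative sketch of an "NFResblock"-style module, assuming a PyTorch
# setting. It only illustrates the idea of combining an FFT branch with
# activation-free (gated) convolutions inside a residual block.
import torch
import torch.nn as nn

class SimpleGate(nn.Module):
    # Activation-free gating: split channels in half and multiply the halves.
    def forward(self, x):
        a, b = x.chunk(2, dim=1)
        return a * b

class NFResBlockSketch(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # Spatial branch: pointwise conv expands channels, SimpleGate halves them back.
        self.conv1 = nn.Conv2d(channels, 2 * channels, kernel_size=1)
        self.gate = SimpleGate()
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        # Frequency branch: a 1x1 conv applied to real/imaginary parts after a 2D FFT.
        self.freq_conv = nn.Conv2d(2 * channels, 2 * channels, kernel_size=1)

    def forward(self, x):
        # Spatial (activation-free) path
        s = self.conv2(self.gate(self.conv1(x)))
        # Frequency path: FFT -> conv on stacked real/imag parts -> inverse FFT
        f = torch.fft.rfft2(x, norm="ortho")
        f = torch.cat([f.real, f.imag], dim=1)
        f = self.freq_conv(f)
        real, imag = f.chunk(2, dim=1)
        f = torch.fft.irfft2(torch.complex(real, imag), s=x.shape[-2:], norm="ortho")
        return x + s + f  # residual connection

# Example usage on a dummy feature map
block = NFResBlockSketch(channels=16)
y = block(torch.randn(1, 16, 64, 64))
print(y.shape)  # torch.Size([1, 16, 64, 64])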
Procedia PDF Downloads 138
11146 A New Approach for Generalized First Derivative of Nonsmooth Functions Using Optimization
Authors: Mohammad Mehdi Mazarei, Ali Asghar Behroozpoor
Abstract:
In this paper, we define an optimization problem corresponding to smooth and nonsmooth functions whose optimal solution is the first derivative of these functions on a domain. For this purpose, a linear programming problem corresponding to the optimization problem is obtained. The optimal solution of this linear programming problem is the approximate generalized first derivative. In fact, we approximate the generalized first derivative of nonsmooth functions as a Taylor series. We show the efficiency of our approach on some smooth and nonsmooth functions in several examples.
Keywords: general derivative, linear programming, optimization problem, smooth and nonsmooth functions
Procedia PDF Downloads 557
11145 A Novel Meta-Heuristic Algorithm Based on Cloud Theory for Redundancy Allocation Problem under Realistic Condition
Authors: H. Mousavi, M. Sharifi, H. Pourvaziri
Abstract:
The Redundancy Allocation Problem (RAP) is a well-known mathematical problem for modeling series-parallel systems. It is a combinatorial optimization problem which focuses on determining an optimal assignment of components in a system design. In this paper, to be more practical, we have considered the problem of redundancy allocation of a series system with interval-valued component reliabilities. Therefore, during the search process, the reliabilities of the components are treated as stochastic variables with lower and upper bounds. In order to optimize the problem, we propose a simulated annealing algorithm based on cloud theory (CBSAA). Also, Monte Carlo simulation (MCS) is embedded in the CBSAA to handle the random component reliabilities. This novel approach has been investigated through numerical examples, and the experimental results have shown that the CBSAA combined with MCS is an efficient tool for solving the RAP of systems with interval-valued component reliabilities.
Keywords: redundancy allocation problem, simulated annealing, cloud theory, monte carlo simulation
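To make the role of the embedded Monte Carlo simulation concrete, the sketch below evaluates a candidate redundancy allocation for a series system of parallel subsystems when component reliabilities are only known as intervals. The series-parallel reliability formula is the standard RAP one, while the interval bounds, the allocation, and the sample count are invented placeholders rather than the authors' data.

# A minimal sketch (not the authors' full CBSAA) of Monte Carlo evaluation of a
# candidate redundancy allocation under interval-valued component reliabilities.
import random

def system_reliability(allocation, sampled_r):
    # allocation[i] = number of redundant components in subsystem i
    # sampled_r[i]  = sampled reliability of one component of subsystem i
    rel = 1.0
    for n_i, r_i in zip(allocation, sampled_r):
        rel *= 1.0 - (1.0 - r_i) ** n_i   # parallel redundancy within the subsystem
    return rel                            # series combination across subsystems

def monte_carlo_reliability(allocation, intervals, samples=10_000):
    total = 0.0
    for _ in range(samples):
        sampled = [random.uniform(lo, hi) for lo, hi in intervals]
        total += system_reliability(allocation, sampled)
    return total / samples                # expected reliability under interval uncertainty

# Hypothetical 3-subsystem instance
intervals = [(0.80, 0.90), (0.70, 0.85), (0.90, 0.95)]
print(monte_carlo_reliability(allocation=[2, 3, 1], intervals=intervals))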
Procedia PDF Downloads 413
11144 Ultra-Reliable Low Latency V2X Communication for Express Way Using Multiuser Scheduling Algorithm
Authors: Vaishali D. Khairnar
Abstract:
The main aim is to provide low-latency and highly reliable communication for vehicles in the automobile industry; vehicle-to-everything (V2X) communication basically intends to increase expressway road safety and effectiveness. The Ultra-Reliable Low-Latency Communications (URLLC) approach and cellular networks are applied in combination with Mobile Broadband (MBB), particularly in expressway safety-driving applications. Expressway vehicle drivers will communicate in V2X systems using sixth-generation (6G) communication systems, which support very high-speed mobility. As a result, we need to determine how to ensure reliable and consistent wireless communication links and improve their quality to increase channel gain, which is a challenge that needs to be addressed. To overcome this challenge, we propose a unique multi-user scheduling algorithm for ultra-massive multiple-input multiple-output (MIMO) systems using 6G. In wideband wireless network access, under both high- and medium-traffic conditions, offering quality-of-service (QoS) to distinct service groups with synchronized concurrent traffic on a highway such as the Mumbai-Pune expressway becomes a critical problem. Opportunistic MAC (OMAC), a way of scheduling communication across a wireless link that changes in space and time, might overcome the above-mentioned challenge. Therefore, a multi-user scheduling algorithm is proposed for MIMO systems using a cross-layered MAC protocol to achieve URLLC and high reliability in V2X communication.
Keywords: ultra-reliable low latency communications, vehicle-to-everything communication, multiple-input multiple-output systems, multi-user scheduling algorithm
Procedia PDF Downloads 90
11143 A Genetic Algorithm Based Permutation and Non-Permutation Scheduling Heuristics for Finite Capacity Material Requirement Planning Problem
Authors: Watchara Songserm, Teeradej Wuttipornpun
Abstract:
This paper presents genetic algorithm based permutation and non-permutation scheduling heuristics (GAPNP) to solve a multi-stage finite capacity material requirement planning (FCMRP) problem in an automotive assembly flow shop with unrelated parallel machines. In the algorithm, the sequences of orders are iteratively improved by the GA characteristics, whereas the required operations are scheduled based on the presented permutation and non-permutation heuristics. Finally, linear programming is applied to minimize the total cost. The presented GAPNP algorithm is evaluated using real datasets from automotive companies. The required parameters for GAPNP are carefully tuned to obtain a common parameter setting for all case studies. The results show that GAPNP significantly outperforms the benchmark algorithm by about 30% on average.
Keywords: capacitated MRP, genetic algorithm, linear programming, automotive industries, flow shop, application in industry
Procedia PDF Downloads 490
11142 Agile Project Management: A Real Application in a Multi-Project Research and Development Center
Authors: Aysegul Sarac
Abstract:
The aim of this study is to analyze the impacts of integrating agile development principles and practices, in particular to reduce project lead time, in a multi-project environment. We analyze the Arçelik Washing Machine R&D Center, in which multiple projects are conducted by shared resources. In the first part of the study, we illustrate the current waterfall model system by using a value stream map. We define all activities, starting from the first idea of the project through to the customer, and measure the process time and lead time of projects. In the second part of the study, we estimate potential improvements and select a set of these improvements to integrate agile principles. We aim to develop a future state map and analyze the impacts of integrating lean principles on project lead time. The main contribution of this study is that we analyze and integrate agile product development principles in a real multi-project system.
Keywords: agile project management, multi project system, project lead time, product development
Procedia PDF Downloads 306
11141 An Approach To Flatten The Gain Of Fiber Raman Amplifiers With Multi-Pumping
Authors: Surinder Singh, Adish Bindal
Abstract:
The effects of the pumping wavelengths and their powers on the gain flattening of a fiber Raman amplifier (FRA) are investigated. A multi-wavelength pumping scheme is utilized to achieve gain flatness in the FRA. It is shown that gain flatness improves with an increase in the number of pumping wavelengths applied. We have achieved flat gain with 0.27 dB fluctuation over a spectral range of 1475-1600 nm for a Raman fiber length of 10 km by using six pumps with wavelengths within the 1385-1495 nm interval. The effect of the multi-wavelength pumping scheme on gain saturation in the FRA is also studied. It is found that the gain saturation condition is improved by using this scheme and that the scheme is more useful for longer spans of Raman fiber.
Keywords: FRA, WDM, pumping, flat gain
Procedia PDF Downloads 479
11140 Toward a Risk Assessment Model Based on Multi-Agent System for Cloud Consumer
Authors: Saadia Drissi
Abstract:
Cloud computing is an innovative paradigm that introduces several technological changes, which have resulted in new ways for cloud providers to deliver their services to cloud consumers, notably in terms of security risk assessment. Adapting current risk assessment tools to cloud computing is therefore a very difficult task, due to several characteristics of the cloud that challenge the effectiveness of existing risk assessment approaches. As a consequence, there is a need for a risk assessment model adapted to cloud computing. This paper proposes a new risk assessment model based on a multi-agent system and the AHP model, as a fundamental step towards the development of a flexible risk assessment approach for cloud consumers.
Keywords: cloud computing, risk assessment model, multi-agent system, AHP model, cloud consumer
Procedia PDF Downloads 545
11139 Diversity of Arachnological Fauna in an Agricultural Environment: Inventory and Effect of Herbicides
Authors: Benslimane Marwa, Benabbas-Sahki Ilham
Abstract:
Spiders play an important role in agroecosystems due to their great abundance. They are considered a valuable group of invertebrates in agricultural land. They are predators of insects harmful to crops, but their use in biological control requires in-depth research on their ecology. During our study, conducted between March 2021 and October of the same year, we counted a total of 768 spiders, which we were able to identify and classify into 14 families. This study aims to compare a station subjected to agricultural practices, including the spreading of herbicides, with another station subjected to the same practices but without the use of phytosanitary products. The inventory shows a strong dominance of the family Gnaphosidae (75.8%). This result suggests that the proliferation of this family is very favorable to the crop, as it limits the populations of aphids infesting the plot, and the family can therefore be proposed for biological control. The comparative study of the spider populations in the stations studied shows the negative effect of agricultural practices on the species richness and abundance of these species, whereas diversity is only slightly affected. Finally, we note that the herbicides did not cause a significant imbalance in this agroecosystem, unlike plowing, which had harmful consequences for spiders.
Keywords: spiders, predator, species richness, herbicides, agricultural practices
Procedia PDF Downloads 92
11138 Space Telemetry Anomaly Detection Based On Statistical PCA Algorithm
Authors: Bassem Nassar, Wessam Hussein, Medhat Mokhtar
Abstract:
The crucial concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can also result in compromised mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, there is continuous improvement of TM monitoring systems in order to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important in order to respond to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle the aforementioned problem coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, in this paper, we present a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between them. Furthermore, the algorithm provides useful information for prediction as well as adding more insight and physical interpretation to the ADCS operation.
Keywords: space telemetry monitoring, multivariate analysis, PCA algorithm, space operations
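As a minimal sketch of this kind of PCA-based monitoring (not the authors' exact algorithm), the snippet below fits a PCA model on telemetry gathered under normal operation and flags new samples whose reconstruction error (the Q/SPE statistic) exceeds an empirical control limit. The synthetic data, the number of components, and the 99th-percentile threshold are illustrative assumptions.

# A minimal sketch of PCA-based telemetry anomaly detection via reconstruction error.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
normal_tm = rng.normal(size=(500, 12))          # 500 samples x 12 telemetry channels
faulty_tm = normal_tm[:50] + rng.normal(3.0, 1.0, size=(50, 12))  # injected fault

pca = PCA(n_components=4).fit(normal_tm)

def spe(x):
    # Squared prediction error between each sample and its PCA reconstruction
    recon = pca.inverse_transform(pca.transform(x))
    return np.sum((x - recon) ** 2, axis=1)

threshold = np.percentile(spe(normal_tm), 99)   # simple empirical control limit
flags = spe(faulty_tm) > threshold
print(f"{flags.mean():.0%} of faulty samples flagged as anomalous")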
Procedia PDF Downloads 416
11137 A Column Generation Based Algorithm for Airline Cabin Crew Rostering Problem
Authors: Nan Xu
Abstract:
In airlines, the crew scheduling problem is usually decomposed into two stages: crew pairing and crew rostering. In the crew pairing stage, pairings are generated such that each flight is covered by exactly one pairing and the overall cost is minimized. In the crew rostering stage, the pairings generated in the crew pairing stage are combined with off days, training, and other breaks to create individual work schedules. This paper focuses on the cabin crew rostering problem, which is challenging due to its extremely large size and the complex working rules involved. In our approach, the objective of rostering consists of two major components. The first is to minimize the number of unassigned pairings, and the second is to ensure fairness to crew members. There are two measures of fairness to crew members: the number of overnight duties and the total fly-hours over a given period. Pairings should be assigned to each crew member so that their actual overnight duties and fly-hours are as close to the expected average as possible. Deviations from the expected average are penalized in the objective function. Since several small deviations are preferred over a single large deviation, the penalization is quadratic. Our model of the airline crew rostering problem is based on column generation. The problem is decomposed into a master problem and subproblems. The master problem is modeled as a set partitioning problem, and exactly one roster is picked for each crew member such that the pairings are covered. The restricted linear master problem (RLMP) is considered. The current subproblem tries to find columns with negative reduced costs and add them to the RLMP for the next iteration. When no column with a negative reduced cost can be found or a stopping criterion is met, the procedure ends. The subproblem is to generate feasible rosters for each crew member. A separate acyclic weighted graph is constructed for each crew member, and the subproblem is modeled as a resource-constrained shortest path problem in this graph. A labeling algorithm is used to solve it. Since the penalization is quadratic, a method to deal with the resulting non-additive shortest path problem using a labeling algorithm is proposed, and the corresponding dominance condition is defined. The major contributions of our model are: 1) we propose a method to deal with the non-additive shortest path problem; 2) our algorithm allows some soft rules to be relaxed, which can improve the coverage rate; 3) multi-threading is used to improve the efficiency of the algorithm when generating the line of work for crew members. In summary, a column generation based algorithm for the airline cabin crew rostering problem is proposed. The objective is to assign a personalized roster to each crew member that minimizes the number of unassigned pairings and ensures fairness to crew members. The algorithm we propose in this paper has been put into production at a major airline in China, and numerical experiments show that it performs well.
Keywords: aircrew rostering, aircrew scheduling, column generation, SPPRC
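As a small illustration of the rostering objective described above, the sketch below scores a candidate solution by penalizing unassigned pairings and adding quadratic penalties for each crew member's deviation from the expected average overnight duties and fly-hours. The penalty weights and the toy numbers are assumptions for illustration, not the authors' actual parameters.

# A minimal sketch of the rostering objective: unassigned-pairing penalty plus
# quadratic fairness penalties on overnight duties and fly-hours.
def roster_cost(unassigned_pairings, overnights, fly_hours,
                avg_overnights, avg_fly_hours,
                w_unassigned=1000.0, w_overnight=1.0, w_flyhour=0.1):
    cost = w_unassigned * unassigned_pairings
    for o, f in zip(overnights, fly_hours):
        cost += w_overnight * (o - avg_overnights) ** 2   # quadratic: several small
        cost += w_flyhour * (f - avg_fly_hours) ** 2      # deviations beat one large one
    return cost

# Hypothetical 3-crew example
print(roster_cost(unassigned_pairings=2,
                  overnights=[4, 5, 6], fly_hours=[80, 85, 95],
                  avg_overnights=5, avg_fly_hours=85))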
Procedia PDF Downloads 147
11136 Approaching the Spatial Multi-Objective Land Use Planning Problems at Mountain Areas by a Hybrid Meta-Heuristic Optimization Technique
Authors: Konstantinos Tolidis
Abstract:
The mountains are amongst the most fragile environments in the world. The world's mountain areas cover 24% of the Earth's land surface and are home to 12% of the global population. A further 14% of the global population is estimated to live in the vicinity of their surrounding areas. As urbanization continues to increase in the world, the mountains are also key centers for recreation and tourism; their attraction is often heightened by their remarkably high levels of biodiversity. Because the features of mountain areas vary spatially (degree of development, human geography, socio-economic reality, relations of dependency and interaction with other areas and regions), spatial planning in these areas is a crucial process for preserving the natural, cultural, and human environment and is one of the major processes of an integrated spatial policy. This research focuses on the spatial decision problem of land use allocation optimization, which is a common planning problem in mountain areas. Such decisions must be made not only on what to do and how much to do, but also on where to do it, which adds a whole extra class of decision variables to the problem when spatial optimization is considered. The utility of optimization as a normative tool for spatial problems is widely recognized. However, it is very difficult for planners to quantify the weights of the objectives, especially when these are related to mountain areas. Furthermore, land use allocation optimization problems in mountain areas must be addressed by taking into account not only the general development objectives but also the spatial objectives (e.g., compactness, compatibility, and accessibility). Therefore, the main research objective was to approach the land use allocation problem by utilizing a hybrid meta-heuristic optimization technique tailored to the spatial characteristics of mountain areas. The results indicate that the proposed methodological approach is very promising and useful both for generating land use alternatives for further consideration in land use allocation decision-making and for supporting spatial management plans in mountain areas.
Keywords: multiobjective land use allocation, mountain areas, spatial planning, spatial decision making, meta-heuristic methods
Procedia PDF Downloads 347
11135 Performance Evaluation of Routing Protocol in Cognitive Radio with Multi Technological Environment
Authors: M. Yosra, A. Mohamed, T. Sami
Abstract:
Over the past few years, mobile communication technologies have seen significant evolution. This has promoted the implementation of many systems in a multi-technological setting. From one system to another, the Quality of Service (QoS) provided to mobile consumers gets better. The growing number of standardized technologies extends the services available to each consumer; moreover, most of the available radio frequencies have already been allocated, for example to 3G, WiFi, WiMAX, and LTE. A study by the Federal Communications Commission (FCC) found that certain frequency bands are only partially occupied in particular locations and at particular times. The idea of Cognitive Radio (CR) is therefore to share the spectrum between a primary user (PU) and a secondary user (SU). The main objective of this spectrum management is to achieve a maximum rate of exploitation of the radio spectrum. In general, CR can greatly improve the quality of service (QoS) and the reliability of the link. The problem resides in proposing a technique to improve the reliability of the wireless link by using CR with certain routing protocols, since users have reported that the links were unreliable and incompatible with QoS requirements. In our case, we choose the QoS parameter "bandwidth" to perform a supervised classification. In this paper, we propose a comparative study between several routing protocols, taking into account the variation of different technologies on the existing spectral bandwidth, namely 3G, WiFi, WiMAX, and LTE. The simulation results show that LTE has significantly higher available bandwidth compared with the other technologies. The performance of the OLSR protocol is better than that of the other routing protocols (DSR, AODV, and DSDV) under LTE, because of proper packet reception, fewer packet drops, and higher throughput. Numerous simulations of the routing protocols have been carried out using simulators such as NS3.
Keywords: cognitive radio, multi technology, network simulator (NS3), routing protocol
Procedia PDF Downloads 63
11134 Impact of Working Capital Management Strategies on Firm's Value and Profitability
Authors: Jonghae Park, Daesung Kim
Abstract:
The impact of aggressive and conservative working capital strategies on firm value and profitability is evaluated by applying panel data regression analysis. The control variables used in the regression models are the natural log of firm size, sales growth, and debt. We collected a panel of 13,988 companies listed on the Korean stock market covering the period 2000-2016. The major findings of this study are as follows: 1) We find a significant negative correlation between firm profitability and the number of days inventory (INV) and days accounts payable (AP); a firm's profitability can thus be improved by reducing the number of days of inventory and days accounts payable. 2) We also find a significant positive correlation between firm profitability and the number of days accounts receivable (AR) and the cash ratio (CR); in other words, cash is associated with high corporate profitability. 3) The Tobin's Q analysis showed that only the number of days accounts receivable (AR) and the cash ratio (CR) had a significant relationship with firm value. In conclusion, companies can increase profitability by reducing INV and increasing AP, but INV and AP did not affect corporate value. In particular, it is necessary to increase CA and decrease AR in order to increase the firm's profitability and value.
Keywords: working capital, working capital management, firm value, profitability
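As an illustration of the kind of regression used here (a pooled OLS sketch rather than the authors' exact panel specification), the snippet below regresses a profitability proxy on the working-capital variables and the stated controls. The variable names and the synthetic data are placeholders for illustration only.

# A minimal sketch of a working-capital profitability regression (pooled OLS).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "roa": rng.normal(0.05, 0.02, n),           # profitability proxy
    "inv_days": rng.normal(60, 15, n),          # days inventory (INV)
    "ap_days": rng.normal(45, 10, n),           # days accounts payable (AP)
    "ar_days": rng.normal(50, 12, n),           # days accounts receivable (AR)
    "cash_ratio": rng.normal(0.2, 0.05, n),     # cash ratio (CR)
    "log_size": rng.normal(12, 1, n),           # control: natural log of firm size
    "sales_growth": rng.normal(0.08, 0.03, n),  # control: sales growth
    "debt": rng.normal(0.4, 0.1, n),            # control: debt ratio
})

model = smf.ols(
    "roa ~ inv_days + ap_days + ar_days + cash_ratio + log_size + sales_growth + debt",
    data=df,
).fit()
print(model.summary())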
Procedia PDF Downloads 192
11133 Rd-PLS Regression: From the Analysis of Two Blocks of Variables to Path Modeling
Authors: E. Tchandao Mangamana, V. Cariou, E. Vigneau, R. Glele Kakai, E. M. Qannari
Abstract:
A new definition of a latent variable associated with a dataset makes it possible to propose variants of the PLS2 regression and the multi-block PLS (MB-PLS). We shall refer to these variants as Rd-PLS regression and Rd-MB-PLS, respectively, because they are inspired by both Redundancy analysis and PLS regression. Usually, a latent variable t associated with a dataset Z is defined as a linear combination of the variables of Z with the constraint that the length of the loading weights vector equals 1. Formally, t=Zw with ‖w‖=1. Denoting by Z' the transpose of Z, we define herein a latent variable by t=ZZ'q with the constraint that the auxiliary variable q has a norm equal to 1. This new definition of a latent variable entails that, as previously, t is a linear combination of the variables in Z and, in addition, the loading vector w=Z'q is constrained to be a linear combination of the rows of Z. More importantly, t could be interpreted as a kind of projection of the auxiliary variable q onto the space generated by the variables in Z, since it is collinear to the first PLS1 component of q onto Z. Consider the situation in which we aim to predict a dataset Y from another dataset X. These two datasets relate to the same individuals and are assumed to be centered. Let us consider a latent variable u=YY'q to which we associate the variable t=XX'YY'q. Rd-PLS consists in seeking q (and therefore u and t) so that the covariance between t and u is maximum. The solution to this problem is straightforward and consists in setting q to the eigenvector of YY'XX'YY' associated with the largest eigenvalue. For the determination of higher-order components, we deflate X and Y with respect to the latent variable t. Extending Rd-PLS to the context of multi-block data is relatively easy. Starting from a latent variable u=YY'q, we consider its ‘projection’ on the space generated by the variables of each block Xk (k=1, ..., K), namely tk=XkXk'YY'q. Thereafter, Rd-MB-PLS seeks q in order to maximize the average of the covariances of u with tk (k=1, ..., K). The solution to this problem is given by q, the eigenvector of YY'XX'YY' associated with the largest eigenvalue, where X is the dataset obtained by horizontally merging the datasets Xk (k=1, ..., K). For the determination of latent variables of order higher than 1, we use a deflation of Y and Xk with respect to the variable t=XX'YY'q. In the same vein, extending Rd-MB-PLS to the path modeling setting is straightforward. The methods are illustrated on the basis of case studies, and the performance of Rd-PLS and Rd-MB-PLS in terms of prediction is compared to that of PLS2 and MB-PLS.
Keywords: multiblock data analysis, partial least squares regression, path modeling, redundancy analysis
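As a small numerical sketch of the first Rd-PLS component exactly as defined above (q is the eigenvector of YY'XX'YY' associated with the largest eigenvalue, u = YY'q and t = XX'YY'q), the snippet below computes these quantities with NumPy. The random centered data are placeholders, and the higher-order components obtained by deflation are omitted.

# First Rd-PLS component following the definitions in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n, p, m = 50, 8, 3
X = rng.normal(size=(n, p)); X -= X.mean(axis=0)   # centered predictor block
Y = rng.normal(size=(n, m)); Y -= Y.mean(axis=0)   # centered response block

YY = Y @ Y.T
XX = X @ X.T
M = YY @ XX @ YY                      # symmetric PSD matrix whose leading eigenvector is q
eigvals, eigvecs = np.linalg.eigh(M)
q = eigvecs[:, -1]                    # eigenvector for the largest eigenvalue (unit norm)

u = YY @ q                            # latent variable of the Y block
t = XX @ YY @ q                       # latent variable of the X block
print("cov(t, u) =", float(t @ u) / (n - 1))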
Procedia PDF Downloads 147
11132 Evaluation of Multi-Sectoral Schistosomiasis Control in Indonesia
Authors: Hayani Anastasia, Junus Widjaja, Anis Nur Widayati
Abstract:
In Indonesia, schistosomiasis is caused by Schistosoma japonicum, with Oncomelania hupensis lindoensis as the intermediate host. Schistosomiasis can infect humans and all species of mammals. In order to achieve schistosomiasis elimination by 2020, schistosomiasis control, including environmental management, has been carried out by multiple sectors. A cross-sectional study was conducted in 2018 to evaluate the multi-sectoral schistosomiasis control program. Data were collected through in-depth interviews with stakeholders, stool surveys, snail surveys, observation, and document reviews. About 53.6% of the control programs in the schistosomiasis control roadmap were not achieved. The number of foci areas found in 2018 was not significantly different from that before the control programs. Moreover, the prevalence of schistosomiasis was 0-5.1% in humans and ranged from 0 to 10% in mammals. In order to overcome these problems, a policy making schistosomiasis a priority program in ministries and agencies other than the Ministry of Health is needed. Innovative health promotion with interactive media also needs to be applied. In addition, the schistosomiasis work team needs to be more active, with the Agency of Regional Development as the leading sector.
Keywords: evaluation, Indonesia, multi-sector, schistosomiasis
Procedia PDF Downloads 133
11131 A Survey on Traditional MAC Layer Protocols in Cognitive Wireless Mesh Networks
Authors: Anusha M., V. Srikanth
Abstract:
Maximizing spectrum usage and the numerous applications of wireless communication networks have led to a strong interest in the available spectrum. A cognitive radio controls its receiver and transmitter features precisely so that it can utilize the vacant licensed spectrum without impacting the functionality of the primary licensed users. The use of various channels helps address interference and thereby improves overall network efficiency. The MAC protocol in a cognitive radio network governs spectrum usage by coordinating multiple channels among the users. In this paper, we study the architecture of cognitive wireless mesh networks and traditional TDMA-based MAC methods for allocating channels dynamically. The majority of the MAC protocols suggested in the literature operate on a Common Control Channel (CCC) to handle the services between cognitive radio secondary users. In this paper, an extensive study of multi-channel, multi-radio frequency-range channel allotment and continually synchronized TDMA scheduling is presented in summarized form.
Keywords: TDMA, MAC, multi-channel, multi-radio, WMNs, cognitive radios
Procedia PDF Downloads 562
11130 Collective Problem Solving: Tackling Obstacles and Unlocking Opportunities for Young People Not in Education, Employment, or Training
Authors: Kalimah Ibrahiim, Israa Elmousa
Abstract:
This study employed the world café method alongside semi-structured interviews within a 'conversation café' setting to engage stakeholders from the public health and primary care sectors. The objective was to collaboratively explore strategies to improve outcomes for young people not in education, employment, or training (NEET). The discussions were aimed at identifying the underlying causes of disparities faced by NEET individuals, exchanging experiences, and formulating community-driven solutions to bolster preventive efforts and shape policy initiatives. A thematic analysis of the qualitative data gathered emphasized the importance of community problem-solving through the exchange of ideas and reflective discussions. Healthcare professionals reflected on their potential roles, pinpointing a significant gap in understanding the specific needs of the NEET population and the unclear distribution of responsibilities among stakeholders. The results underscore the necessity for a unified approach in primary care and the fostering of multi-agency collaborations that focus on addressing social determinants of health. Such strategies are critical not only for the immediate improvement of health outcomes for NEET individuals but also for informing broader policy decisions that can have long-term benefits. Further research is ongoing, delving deeper into the unique challenges faced by this demographic and striving to develop more effective interventions. The study advocates for continued efforts to integrate insights from various sectors to create a more holistic and effective response to the needs of the NEET population, ensuring that future strategies are informed by a comprehensive understanding of their circumstances and challenges.
Keywords: multi-agency working, primary care, public health, social inequalities
Procedia PDF Downloads 41
11129 A Cross-Cultural Investigation of Self-Compassion in Adolescents Across Gender
Authors: H. N. Cheung
Abstract:
Self-compassion encourages one to accept oneself, reduce self-criticism and self-judgment, and see one’s shortcomings and setbacks in a balanced view. Adolescent self-compassion is a crucial protective factor against mental illness. It is, however, affected by gender. Given the scarcity of self-compassion scales for adolescents, the current study evaluates the Self-Compassion Scale for Youth (SCS-Y) in a large cross-cultural sample and investigates how the subscales of the SCS-Y relate to the dimensions of depressive symptoms across gender. Through the internet-based Qualtrics platform, a total of 2881 teenagers aged 12 to 18 years were recruited from Hong Kong (HK), China, and the United Kingdom. A Multiple Indicator Multiple Cause (MIMIC) model was used to evaluate the measurement invariance of the SCS-Y, and differential item functioning (DIF) was checked across gender. Upon the establishment of the best model, a multigroup structural equation model (SEM) was built between the factors of the SCS-Y and the Multidimensional Depression Assessment Scale (MDAS), which assesses four dimensions of depressive symptoms (emotional, cognitive, somatic, and interpersonal). The SCS-Y was shown to have good reliability and validity. The MIMIC model produced a good fit for a hypothetical six-factor model (CFI = 0.980; TLI = 0.974; RMSEA = 0.038), and no item was flagged for DIF across gender. A gender difference was observed between the SCS-Y factors and the depression dimensions. Conclusions: The SCS-Y exhibits good psychometric characteristics, including measurement invariance across gender. The study also highlights the gender difference between self-compassion factors and depression dimensions.
Keywords: self compassion, gender, depression, structural equation modelling, MIMIC model
Procedia PDF Downloads 72
11128 Understanding Cognitive Fatigue From FMRI Scans With Self-supervised Learning
Authors: Ashish Jaiswal, Ashwin Ramesh Babu, Mohammad Zaki Zadeh, Fillia Makedon, Glenn Wylie
Abstract:
Functional magnetic resonance imaging (fMRI) is a neuroimaging technique that records neural activations in the brain by capturing the blood oxygen level in different regions based on the task performed by a subject. Given fMRI data, the problem of predicting the state of cognitive fatigue in a person has not been investigated to its full extent. This paper proposes tackling this issue as a multi-class classification problem by dividing the state of cognitive fatigue into six different levels, ranging from no fatigue to extreme fatigue. We built a spatio-temporal model that uses convolutional neural networks (CNN) for spatial feature extraction and a long short-term memory (LSTM) network for temporal modeling of 4D fMRI scans. We also applied a self-supervised method called MoCo (Momentum Contrast) to pre-train our model on the public BOLD5000 dataset and fine-tuned it on our labeled dataset to predict cognitive fatigue. Our novel dataset contains fMRI scans from Traumatic Brain Injury (TBI) patients and healthy controls (HCs) acquired while performing a series of N-back cognitive tasks. This method establishes a state-of-the-art technique for analyzing cognitive fatigue from fMRI data and outperforms previous approaches to this problem.
Keywords: fMRI, brain imaging, deep learning, self-supervised learning, contrastive learning, cognitive fatigue
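As a compact sketch of the spatio-temporal idea described above (a CNN extracting spatial features per time frame, an LSTM modeling the sequence, and a six-way classification head), the snippet below uses 2D slices and small layer sizes purely for illustration; the authors' actual architecture, input dimensionality, and MoCo pre-training step are not reproduced here.

# A toy CNN + LSTM classifier over a sequence of frames, with six fatigue levels.
import torch
import torch.nn as nn

class CNNLSTMFatigue(nn.Module):
    def __init__(self, n_classes=6, feat_dim=64):
        super().__init__()
        self.cnn = nn.Sequential(                  # spatial feature extractor per frame
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
            nn.Flatten(), nn.Linear(32 * 4 * 4, feat_dim),
        )
        self.lstm = nn.LSTM(feat_dim, 64, batch_first=True)  # temporal model
        self.head = nn.Linear(64, n_classes)                 # six fatigue levels

    def forward(self, x):                    # x: (batch, time, 1, H, W)
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).view(b, t, -1)
        _, (h, _) = self.lstm(feats)
        return self.head(h[-1])

logits = CNNLSTMFatigue()(torch.randn(2, 10, 1, 64, 64))
print(logits.shape)  # torch.Size([2, 6])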
Procedia PDF Downloads 191
11127 Creation of GaxCo1-xZnSe0.4 (x = 0.1, 0.3, 0.5) Nanoparticles Using Pulse Laser Ablation Method
Authors: Yong Pan, Li Wang, Xue Qiong Su, Dong Wen Gao
Abstract:
Nanomaterials have received extensive attention over the years because of their wide applications. Various nanomaterials, such as nanoparticles, nanowires, nanorings, nanostars, and other nanostructures, have begun to be systematically studied. The preparation of these materials by chemical methods is not only costly but also involves long cycles and high toxicity. At the same time, the preparation of nanoparticles of multi-doped composites has been limited due to the special structure of these materials. In order to prepare multi-doped composites with the same structure as the macro-materials and to simplify the preparation method, GaxCo1-xZnSe0.4 (x = 0.1, 0.3, 0.5) nanoparticles are prepared by the Pulse Laser Ablation (PLA) method. The particle composition and structure are systematically investigated by X-ray diffraction (XRD) and Raman spectroscopy, which confirm the success of our preparation and the same concentrations in the nanoparticles (NPs) as in the target. The morphology of the NPs, characterized by transmission electron microscopy (TEM), indicates circular-shaped particles. The fluorescence properties are reflected by the PL spectra, which demonstrate the best performance for the Ga0.3Co0.3ZnSe0.4 concentration. Therefore, all the results suggest that PLA is promising for preparing such multi-doped NPs, since it can modulate the performance of the NPs.
Keywords: PLA, physics, nanoparticles, multi-doped
Procedia PDF Downloads 171