Search results for: SPSS programming
1893 Drawing Building Blocks in Existing Neighborhoods: An Automated Pilot Tool for an Initial Approach Using GIS and Python
Authors: Konstantinos Pikos, Dimitrios Kaimaris
Abstract:
Although designing building blocks is a procedure used by many planners around the world, there is no automated tool that helps planners and designers achieve their goals with less effort. The difficulty of the subject lies in the repetitive process of manually drawing lines while maintaining the desired offset and keeping the impact on the existing building stock as low as possible. In this paper, an automated tool integrated into ArcGIS PRO is presented, built with Geographical Information Systems (GIS) and the Python programming language. Despite its simple environment and the lack of specialized building legislation, due to the complex state of the field, a planner who is aware of such technical information can use the tool to draw an initial approach to the final building blocks in an area with pre-existing buildings, in an attempt to organize the usually sprawling suburbs of a city or any continuously developing area. The tool uses ESRI's ArcPy library to handle the spatial data, while interaction with the user is handled through Tkinter. The main process consists of a modification of the buildings' edge coordinates, using the NumPy library, in an effort to draw the line of best fit, so the user can obtain optimal results for each side of a block. Finally, after the tool runs successfully, a table of primary planning information is shown, such as the area of the building block and its coverage rate. Although the tool is at an early stage of development, it is a solid base in which planners with programming skills can invest so that it adapts to their individual needs. An example of the entire procedure in a test area is provided, highlighting both the strengths and weaknesses of the final results.
Keywords: arcPy, GIS, python, building blocks
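As an illustration of the edge-fitting step described above, a minimal sketch with NumPy (the coordinates and the first-degree fit are assumptions made for illustration, not taken from the paper):

```python
import numpy as np

# Hypothetical building-corner coordinates (x, y) along one block side.
edge_points = np.array([[0.0, 1.1], [2.0, 1.9], [4.1, 3.2], [6.0, 3.9], [8.0, 5.1]])

# Fit a first-degree polynomial (line of best fit) through the corner points.
slope, intercept = np.polyfit(edge_points[:, 0], edge_points[:, 1], deg=1)

# Project the fitted line onto the extent of the block side to obtain the
# candidate building-block boundary for that side.
x_new = np.linspace(edge_points[:, 0].min(), edge_points[:, 0].max(), 2)
y_new = slope * x_new + intercept
print(list(zip(x_new, y_new)))
```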
Procedia PDF Downloads 177
1892 Electronic Payment Recording with Payment History Retrieval Module: A System Software
Authors: Adrian Forca, Simeon Cainday III
Abstract:
The Electronic Payment Recording with Payment History Retrieval Module was developed specifically for the College of Science and Technology. This system software replaces the department's manual process of recording payments, shifting from a slow and time-consuming procedure to a quick, reliable, and accurate way of recording payments, since it immediately generates a receipt for every transaction. As an added feature, the generation of recorded payment reports is integrated, replacing manual reporting with an easier, consolidated report. In addition, all recorded payments of the students can be retrieved immediately, making the system a transparent and reliable payment recording software. Viewing the whole process, the system shifts from a manual procedure to organized software technology, because the information is stored in a logically correct and normalized database. Further, the software is developed using a modern programming language and implements strict programming methods to validate all users accessing the system and to evaluate all data passed into it and information retrieved from it, ensuring data accuracy and reliability. The system also identifies each user and limits their access privileges, establishing boundaries on the specific information they are allowed to store, modify, and update, keeping the information secure against unauthorized manipulation. As a result, the system software eliminates the manual procedure and replaces it with an innovative, modern information technology, making the whole payment recording process fast, secure, accurate, and reliable.
Keywords: collection, information system, manual procedure, payment
Procedia PDF Downloads 162
1891 Integer Programming: Domain Transformation in Nurse Scheduling Problem
Authors: Geetha Baskaran, Andrzej Barjiela, Rong Qu
Abstract:
Motivation: Nurse scheduling is a complex combinatorial optimization problem and is known to be NP-hard. It needs efficient re-scheduling to minimize some trade-off of the measures of violation by reducing selected constraints to soft constraints with measurements of their violations. Problem Statement: In this paper, we extend our novel approach to solving the nurse scheduling problem by transforming it through information granulation. Approach: The approach satisfies the rules of a typical hospital environment based on a standard benchmark problem. Generating good work schedules has a great influence on nurses' working conditions, which are strongly related to the quality of health care. Domain transformation, which combines the strengths of operations research and artificial intelligence, is proposed for the solution of the problem. Compared to conventional methods, our approach involves a judicious grouping (information granulation) of shift types that transforms the original problem into a smaller solution domain. The schedules from the smaller problem domain are later converted back into the original problem domain by taking into account the constraints that could not be represented in the smaller domain. An Integer Programming (IP) package is used to solve the transformed scheduling problem using the branch-and-bound algorithm. We have used GNU Octave for Windows to solve this problem. Results: The scheduling problem has been solved in the proposed formalism, resulting in a high-quality schedule. Conclusion: Domain transformation represents a departure from the conventional one-shift-at-a-time scheduling approach. It offers the advantage of efficient and easily understandable solutions, as well as deterministic reproducibility of the results. We note, however, that it does not guarantee the global optimum.
Keywords: domain transformation, nurse scheduling, information granulation, artificial intelligence, simulation
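A minimal sketch of the kind of 0-1 formulation such an IP package solves, where x(n,d,s) = 1 if nurse n works shift type s on day d (the notation is illustrative only and is not the paper's model):

```latex
\min \sum_{n,d,s} c_{n,d,s}\, x_{n,d,s} + \sum_{d,s} w_{d,s}\, v_{d,s}
\quad \text{s.t.}\quad
\sum_{s} x_{n,d,s} \le 1 \;\; \forall n,d, \qquad
\sum_{n} x_{n,d,s} \ge R_{d,s} - v_{d,s} \;\; \forall d,s, \qquad
x_{n,d,s} \in \{0,1\},\; v_{d,s} \ge 0
```

Here c are preference or contract penalties, R are coverage requirements, and v are soft-constraint violation variables whose weighted sum is minimized; branch and bound explores the 0-1 choices.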
Procedia PDF Downloads 395
1890 Multi-Objective Multi-Period Allocation of Temporary Earthquake Disaster Response Facilities with Multi-Commodities
Authors: Abolghasem Yousefi-Babadi, Ali Bozorgi-Amiri, Aida Kazempour, Reza Tavakkoli-Moghaddam, Maryam Irani
Abstract:
All over the world, natural disasters (e.g., earthquakes, floods, volcanoes and hurricanes) cause a great number of deaths. Earthquakes in particular are catastrophic events triggered by unusual phenomena and leading to heavy losses around the world; such disasters strongly demand long-term help and relief, which can be hard to manage. Supplies and facilities are major challenges after any earthquake and should be prepared for the disaster regions in order to satisfy the demands of the people suffering from the earthquake. This paper formulates the disaster response facility allocation problem for disaster relief operations as a mathematical programming model. Earthquake victims need not only consumable commodities (e.g., food and water) but also non-consumable commodities (e.g., clothes) to protect themselves; paying attention to disaster points and people's demands is therefore essential, and both consumable and non-consumable commodities are considered in the presented model. The paper presents a multi-objective multi-period mathematical programming model that simultaneously minimizes the average weighted response time and the total operational cost plus the penalty costs of unmet demand and unused commodities. Furthermore, a Chebycheff multi-objective solution procedure is applied as a powerful algorithm to solve the proposed model. Finally, to illustrate the model's applicability, a case study of the Tehran earthquake is presented, and a sensitivity analysis is carried out to show model validation.
Keywords: facility location, multi-objective model, disaster response, commodity
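As a sketch of how a weighted Chebycheff procedure scalarizes the two objectives above (the reference point z* and the weights are illustrative assumptions, not values from the paper):

```latex
\min_{x \in X} \; \max \Big\{ w_{1}\big( f_{\text{time}}(x) - z^{*}_{\text{time}} \big),\;
 w_{2}\big( f_{\text{cost}}(x) - z^{*}_{\text{cost}} \big) \Big\},
\qquad w_{1} + w_{2} = 1,\; w_{1}, w_{2} > 0
```

Here f_time is the average weighted response time, f_cost the operational plus penalty cost, and z* the ideal (utopia) point; varying the weights traces different Pareto-optimal allocations.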
Procedia PDF Downloads 257
1889 Connecting the Dots: Bridging Academia and National Community Partnerships When Delivering Healthy Relationships Programming
Authors: Nicole Vlasman, Karamjeet Dhillon
Abstract:
Over the past four years, the Healthy Relationships Program has been delivered in community organizations and schools across Canada. More than 240 groups have been facilitated in collaboration with 33 organizations, and as a result 2157 youth have been engaged in the programming. The purpose and scope of the Healthy Relationships Program are to offer sustainable, evidence-based skills through small-group implementation to prevent violence and promote positive, healthy relationships in youth. The program development has included extensive networking at regional and national levels. The Healthy Relationships Program is currently being implemented, adapted, and researched within the Resilience and Inclusion through Strengthening and Enhancing Relationships (RISE-R) project. Alongside the project's research objectives, the RISE-R team has worked to virtually share the ongoing findings of the project through a slow ontology approach. Slow ontology is a practice integrated into project systems and structures whereby slowing the pace and volume of outputs offers creative opportunities. Creative production reveals different layers of success and complements the project's building blocks for sustainability. As a result of integrating a slow ontology approach, the RISE-R team has developed a Geographic Information System (GIS) that documents local landscapes through a Story Map feature and, more specifically, video installations. Video installations capture the cartography of space and place within the context of singular, diverse community spaces (case studies). By documenting spaces via human connections, the project captures narratives, which further enhance the voices and faces of the community within the larger project scope. This GIS project aims to create a visual and interactive flow of information that complements the project's mixed-method research approach. In conclusion, creative project development in the form of a geographic information system can provide learning and engagement opportunities at many levels (i.e., within community organizations and educational spaces or with the general public). In each of these disconnected spaces, fragmented stories are connected through a visual display of project outputs. A slow ontology practice within the context of the RISE-R project documents activities on the fringes and within internal structures, primarily by documenting project successes as further contributions to the Centre for School Mental Health framework (philosophy, recruitment techniques, allocation of resources and time, and a shared commitment to evidence-based products).
Keywords: community programming, geographic information system, project development, project management, qualitative, slow ontology
Procedia PDF Downloads 155
1888 Relationship Between Reading Comprehension and Achievement in Science Among Grade Eleven Bilingual Students in a Secondary School, Thailand
Authors: Simon Mauma Efange
Abstract:
The main aims of this research were, first, to describe in correlational terms the relationship, if any, between reading comprehension and academic achievement in science studied at the secondary level and, second, to find out possible trends in gender differences, such as whether boys would perform better than girls or vice versa. The research employed a quantitative design. Two instruments were used: the Oxford Online Placement Test and the Local Assessment System Test. The Oxford Online Placement Test assesses students' English level quickly and easily. The results of these tests were subjected to statistical analysis using the statistical software SPSS. Statistical tools such as the mean, standard deviation, percentages, frequencies, t-tests, and Pearson's coefficient of correlation were used for the analysis. Results of the t-test showed that the means are significantly different; the p-value revealed that the results were statistically significant at p < .05. The value of r (the Pearson correlation coefficient) was 0.2868. Although there is technically a positive correlation, the relationship between the variables is weak (the closer the value is to zero, the weaker the relationship). In conclusion, the t-test calculations in SPSS showed that the results were statistically significant at p < .05, confirming a relationship between the two variables: high scores in reading give rise to slightly higher scores in science. The research also revealed that a high score in reading comprehension does not necessarily mean a high score in science, or vice versa. Female subjects performed much better than male subjects in both tests, which is in line with the literature reviewed for this research.
Keywords: achievement in science, achievement in English, bilingual students, relationship
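A minimal sketch of the kind of analysis reported above (the score arrays are placeholder data, not the study's scores; SciPy is assumed to be available):

```python
import numpy as np
from scipy import stats

# Placeholder score vectors; in the study these would be the reading
# comprehension and science achievement scores of the same students.
reading = np.array([52, 61, 48, 70, 66, 55, 74, 59, 63, 68])
science = np.array([58, 64, 50, 69, 71, 57, 72, 60, 61, 73])

# Pearson correlation between reading comprehension and science achievement.
r, p_corr = stats.pearsonr(reading, science)

# Independent-samples t-test, e.g. comparing female vs. male science scores.
female = np.array([64, 69, 71, 72, 73])
male = np.array([58, 50, 57, 60, 61])
t, p_ttest = stats.ttest_ind(female, male)

print(f"r = {r:.4f} (p = {p_corr:.3f}); t = {t:.2f} (p = {p_ttest:.3f})")
```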
Procedia PDF Downloads 46
1887 Effect of Whole Body Vibration on Posture Stability and Plantar Pressure in Patients with Diabetic Neuropathy
Authors: Azza M. Atya, Mahmoud M. Nasser
Abstract:
Background/Significance: Peripheral neuropathy is one of the serious long-term complications of diabetes, and it may contribute to postural instability and alteration of plantar pressure. Whole body vibration (WBV) is a somatosensory-stimulation type of exercise that has emerged in sports training and in the rehabilitation of neuromuscular disorders. Purpose: The aim of this study was to investigate the effect of whole body vibration on anteroposterior (AP) and mediolateral (ML) postural stability and on plantar foot pressure in patients with diabetic neuropathy. Subjects: Forty diabetic patients with moderate peripheral neuropathy, aged 35 to 50 years, were randomly assigned to a WBV group (n=20) and a control group (n=20). Methods and Materials: The WBV intervention consisted of three sessions weekly for 8 weeks (frequency 20 Hz, peak-to-peak displacement 4 mm, acceleration 3.5 g). The Biodex balance system was used for postural stability assessment, and a foot scan plate was used to measure the mean peak pressure under the first and lesser metatarsals. The main outcome measures were the anteroposterior stability index (APSI), the mediolateral stability index (MLSI), the overall stability index (OSI), and the mean peak foot pressure. Analyses: Statistical analysis was performed using the SPSS software package (SPSS for Windows, Release 18.0). T-tests were used to compare pre- and post-treatment values within and between groups. Results: For the 40 study participants (18 male and 22 female), there were no between-group differences at baseline. At the end of 8 weeks, subjects in the WBV group experienced a significant increase in postural stability with a reduction of the mean peak plantar foot pressure (P<0.05) compared with the control group. Conclusion: The results suggest that WBV is an effective therapeutic modality for increasing postural stability and reducing plantar pressure in patients with diabetic neuropathy.
Keywords: whole body vibration, diabetic neuropathy, posture stability, foot pressure
Procedia PDF Downloads 381
1886 Harmonizing Cities: Integrating Land Use Diversity and Multimodal Transit for Social Equity
Authors: Zi-Yan Chao
Abstract:
With the rapid development of urbanization and the increasing demand for efficient transportation systems, the interaction between land use diversity and transportation resource allocation has become a critical issue in urban planning. Achieving a balance of land use types, such as residential, commercial, and industrial areas, plays a crucial role in ensuring social equity and sustainable urban development. Simultaneously, optimizing multimodal transportation networks, including bus, subway, and car routes, is essential for minimizing total travel time and costs while ensuring fairness for all social groups, particularly in meeting the transportation needs of low-income populations. This study develops a bilevel programming model to address these challenges, with land use diversity as the foundation for measuring equity. The upper-level model maximizes land use diversity for a balanced land distribution across regions. The lower-level model optimizes the multimodal transportation network to minimize travel time and costs while maintaining user equilibrium. The model also incorporates constraints to ensure fair resource allocation, such as balancing transportation accessibility and cost differences across social groups. A solution approach is developed to solve the bilevel optimization problem, ensuring efficient exploration of the solution space for land use and transportation resource allocation. The study maximizes social equity by maximizing land use diversity and achieving user equilibrium with an optimal distribution of transportation resources. The proposed method provides a robust framework for addressing urban planning challenges, contributing to sustainable and equitable urban development.
Keywords: bilevel programming model, genetic algorithms, land use diversity, multimodal transportation optimization, social equity
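A schematic of the bilevel structure described above (the entropy-style diversity index and the Beckmann-type equilibrium objective are assumptions used for illustration; the paper's exact functional forms are not given in the abstract):

```latex
% Upper level: choose land-use shares y to maximize diversity (entropy form assumed)
\max_{y}\; D(y) = -\sum_{r}\sum_{k} p_{r,k}(y)\,\ln p_{r,k}(y)
% Lower level: link flows x(y) follow a multimodal user equilibrium (Beckmann form assumed)
\text{s.t.}\quad x(y) \in \arg\min_{x \in \Omega(y)} \sum_{a}\int_{0}^{x_{a}} t_{a}(\omega)\, d\omega
```

Here r indexes zones, k land-use types, a network links, and t_a the link travel-time function; equity constraints on accessibility and cost differences across social groups would be added to the upper level.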
Procedia PDF Downloads 21
1885 Framework for Incorporating Environmental Performance in Network-Level Pavement Maintenance Program
Authors: Jessica Achebe, Susan Tighe
Abstract:
The reduction of material consumption and greenhouse gas emissions when maintaining and rehabilitating road networks can bring added benefits, including improved life-cycle performance of pavements, reduced climate change impacts and human health effects due to less air pollution, improved productivity due to an optimal allocation of resources, and reduced road user costs. This is the essence of incorporating environmental sustainability into pavement management. The functionality of the performance measurement approach has made it one of the most valuable tools in Pavement Management Systems (PMSs) for accounting for different criteria in the decision-making process. However, measuring the environmental performance of a road network is still a far-fetched practice in road network management, and an explicit agency-wide environmental sustainability or sustainable maintenance specification is missing. To address this challenge, the present research focuses on the environmental sustainability performance of network-level pavement management. The ultimate goal is to develop a framework to incorporate environmental sustainability into pavement management systems for network-level maintenance programming. As a first step towards this goal, this paper reviews previous studies that employed environmental performance measures, as well as the suitability of environmental performance indicators for evaluating the sustainability of network-level pavement maintenance strategies. Through an industry practice survey, the paper provides a brief overview of pavement managers' motivations and barriers to making more sustainable decisions, and of the data needed to support network-level environmental sustainability. Trends in network-level sustainable pavement management are also presented, existing gaps are highlighted, and ideas are proposed for network-level sustainable maintenance and rehabilitation programming.
Keywords: pavement management, environment sustainability, network-level evaluation, performance measures
Procedia PDF Downloads 305
1884 Optimisation Model for Maximising Social Sustainability in Construction Scheduling
Authors: Laura Florez
Abstract:
The construction industry is labour intensive, and the behaviour and management of workers have a direct impact on the performance of construction projects. One of the issues the industry currently faces is how to recruit and retain its workers. Construction is known as an industry where workers face short employment durations, frequent layoffs, and periods of unemployment between jobs. These challenges not only create pressure on the workers; project managers also have to constantly train new workers, face skills shortages, and cope with uncertainty about the quality of the workers they will attract. To take workers' needs and project managers' expectations into account, one practice that can be implemented is to schedule construction projects so as to maintain a stable workforce. This paper proposes a mixed-integer programming (MIP) model to schedule projects with the objective of maximising the social sustainability of construction projects, that is, maximising labour stability. Aside from the social objective, the model accounts for the equipment and financial resources required by the projects during the construction phase. To illustrate how the solution strategy works, a construction programme comprised of ten projects is considered. The projects are scheduled to maximise labour stability while simultaneously minimising time and cost. The trade-off between the values of time, cost, and labour stability allows project managers to consider their preferences and identify which solution best suits their needs. Additionally, the model determines the optimal starting times for each project, working patterns for the workers, and labour costs. The model shows that construction projects can be scheduled to benefit not only the project manager but also current workers, and to help attract new workers to the industry. Due to its practicality, it can be a valuable tool to support decision making and assist construction stakeholders when developing schedules that include social sustainability factors.
Keywords: labour stability, mixed-integer programming (MIP), scheduling, workforce management
Procedia PDF Downloads 252
1883 Budgetary Performance Model for Managing Pavement Maintenance
Authors: Vivek Hokam, Vishrut Landge
Abstract:
An ideal maintenance program for an industrial road network is one that would maintain all sections at a sufficiently high level of functional and structural condition. However, due to various constraints such as budget, manpower and equipment, it is not possible to carry out maintenance on all the needy industrial road sections within a given planning period. A rational and systematic priority scheme needs to be employed to select and schedule industrial road sections for maintenance. Priority analysis is a multi-criteria process that determines the best ranking of sections for maintenance based on several factors. In priority setting, difficult decisions are required for the selection of sections: it may be more important to repair a section with poor functional condition (which includes an uncomfortable ride, etc.) or one with poor structural condition, i.e., a section in danger of becoming structurally unsound. Any rational priority-setting approach must therefore consider the relative importance of the functional and structural condition of the section. Maintenance priority indices and pavement performance models tend to focus mainly on pavement condition, traffic criteria, etc.; there is a need to develop a model suited to the limited budget provisions for pavement maintenance. Linear programming is one of the most popular and widely used quantitative techniques. A linear programming model provides an efficient method for determining an optimal decision chosen from a large number of possible decisions; the optimal decision is one that meets a specified objective of management, subject to various constraints and restrictions. The objective here is mainly the minimization of the maintenance cost of roads in an industrial area. In order to formulate the objective function for the analysis of the distress model, realistic data must be cast into the formulation. Each type of repair is quantified over a number of stretches, with 1000 m considered as one stretch; the stretch considered in this study has a length of 3750 m. These quantities enter an objective function for maximizing the number of repairs in a stretch in relation to the quantities involved. The distress types observed in this stretch are potholes, surface cracks, rutting and ravelling. The distress data are measured manually by observing each distress level on a stretch of 1000 m. The maintenance and rehabilitation measures currently followed are based on subjective judgments; hence, there is a need to adopt a scientific approach in order to use the limited resources effectively. It is also necessary to determine the pavement performance and deterioration prediction relationship with greater accuracy, together with the economic benefits of the road network with respect to vehicle operating cost. The infrastructure of the road network should yield the best results expected from the available funds. In this paper, the objective function for the distress model is determined by linear programming, and a deterioration model considering overloading is discussed.
Keywords: budget, maintenance, deterioration, priority
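A minimal sketch of the kind of budget-constrained linear program described above (the unit costs, condition gains, budget and measured distress quantities are invented placeholders; SciPy is assumed):

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: quantity of each repair type carried out on one stretch
# (pothole patching, crack sealing, rut filling, ravelling treatment).
cost_per_unit = np.array([120.0, 45.0, 80.0, 60.0])  # placeholder unit costs
benefit = np.array([5.0, 2.0, 4.0, 3.0])             # placeholder condition gain per unit

budget = 10000.0
measured_distress = np.array([30, 80, 25, 40])        # placeholder quantities per stretch

# Maximize total condition gain (linprog minimizes, so negate the benefits)
# subject to the budget and to not repairing more distress than measured.
res = linprog(
    c=-benefit,
    A_ub=[cost_per_unit],
    b_ub=[budget],
    bounds=list(zip(np.zeros(4), measured_distress)),
    method="highs",
)
print(res.x, -res.fun)
```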
Procedia PDF Downloads 207
1882 Recycling Service Strategy by Considering Demand-Supply Interaction
Authors: Hui-Chieh Li
Abstract:
Circular economy promotes greater resource productivity and avoids pollution through greater recycling and re-use, which brings benefits for both the environment and the economy. The concept contrasts with a linear economy, which follows a 'take, make, dispose' model of production. A well-designed reverse logistics service strategy can enhance users' willingness to recycle and reduce the related logistics cost as well as carbon emissions. Moreover, recycling brings manufacturers major advantages, as it targets components for closed-loop reuse, essentially converting materials and components from worn-out products into inputs for new ones at the right time and place. This study considers demand-supply interaction, time-dependent recycling demand and the time-dependent surplus value of the recycled product, and it constructs models of the recycling service strategy for a recyclable waste collector, who is responsible for collecting waste products for the manufacturer. A crucial factor in optimizing a recycling service strategy is consumer demand; the study therefore considers the relationships between consumer demand for recycling and product characteristics, surplus value and user behaviour. The proposed recycling service strategy differs significantly from the conventional uniform service strategy: periods with considerable demand and large surplus product value suggest frequent and short service cycles. The study explores how to determine the service cycle frequency and duration and the vehicle type for all service cycles by considering the surplus value of the recycled product, time-dependent demand, transportation economies and demand-supply interaction, and it examines the impact of the utilization rate on cost and profit for different vehicle sizes. The binary logit model, an analytical model and mathematical programming methods are applied; the models attempt to maximize the collector's total profit during the study period and, more specifically, to minimize the recycler's total logistics cost while maximizing the manufacturer's recycling benefits. The study relaxes the constant-demand assumption and examines how the service strategy affects consumer demand for waste recycling. The results not only help in understanding how user demand for the recycling service and product surplus value affect the logistics cost and the manufacturer's benefits, but also provide guidance, such as award bonuses and carbon emission regulations, for the government.
Keywords: circular economy, consumer demand, product surplus value, recycle service strategy
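A sketch of the binary logit form used to link recycling demand to such explanatory factors (the particular explanatory variables shown are assumptions for illustration, not the paper's specification):

```latex
P(\text{recycle} \mid \mathbf{x}) = \frac{e^{V(\mathbf{x})}}{1 + e^{V(\mathbf{x})}},
\qquad
V(\mathbf{x}) = \beta_{0} + \beta_{1}\,(\text{surplus value}) + \beta_{2}\,(\text{service frequency}) + \dots
```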
Procedia PDF Downloads 391
1881 Using Google Distance Matrix Application Programming Interface to Reveal and Handle Urban Road Congestion Hot Spots: A Case Study from Budapest
Authors: Peter Baji
Abstract:
In recent years, a growing body of literature has emphasized the increasingly negative impacts of urban road congestion on the everyday life of citizens. Although there are different responses from the public sector to decrease traffic congestion in urban regions, the most effective public intervention is the use of congestion charges. Because travel is an economic asset, its consumption can be controlled effectively by extra taxes or prices, but this demand-side intervention is often unpopular. Measuring traffic flows with different methods has a long history in transport sciences, but until recently there were not sufficient data for evaluating road traffic flow patterns at the scale of the entire road system of a larger urban area. European cities (e.g., London, Stockholm, Milan) in which congestion charges have already been introduced designated a particular zone in their downtown for paying, but this protects only the users and inhabitants of the CBD (Central Business District) area. Through the use of Google Maps data as a resource for revealing urban road traffic flow patterns, this paper aims to provide a solution for a fairer and smarter congestion pricing method in cities. The case study area of the research contains three bordering districts of Budapest which are linked by one main road. The first district (5th) is the original downtown that is affected by the congestion charge plans of the city. The second district (13th) lies in the transition zone and has recently been transformed into a new CBD containing the biggest office zone in Budapest. The third district (4th) is a mainly residential area on the outskirts of the city. The raw data of the research were collected with the help of Google's Distance Matrix API (Application Programming Interface), which provides estimated future traffic data as travel times between freely chosen coordinate pairs. From the difference between free-flow and congested travel time data, the daily congestion patterns and hot spots are detectable on all measured roads within the area. The results suggest that the distribution of congestion peak times and hot spots is uneven in the examined area; however, there are frequently congested areas which lie outside the downtown, and their inhabitants also need some protection. The conclusion of this case study is that cities can develop a real-time and place-based congestion charge system that forces car users to avoid frequently congested roads by changing their routes or travel modes. This would be a fairer solution for decreasing the negative environmental effects of urban road transportation than protecting a very limited downtown area.
Keywords: Budapest, congestion charge, distance matrix API, application programming interface, pilot study
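A minimal sketch of the kind of Distance Matrix API query used to compare free-flow and traffic-aware travel times (the coordinates and API key are placeholders; the parameter and field names follow Google's public API documentation and should be checked against the current version):

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
URL = "https://maps.googleapis.com/maps/api/distancematrix/json"

params = {
    "origins": "47.5636,19.0947",        # placeholder coordinate pair (district 13)
    "destinations": "47.4979,19.0402",   # placeholder coordinate pair (district 5)
    "departure_time": "now",             # request a traffic-aware duration
    "traffic_model": "best_guess",
    "key": API_KEY,
}

element = requests.get(URL, params=params).json()["rows"][0]["elements"][0]
baseline = element["duration"]["value"]              # seconds without live traffic
congested = element["duration_in_traffic"]["value"]  # seconds with current traffic
print("congestion delay (s):", congested - baseline)
```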
Procedia PDF Downloads 194
1880 Utilization of Family Planning Methods and Associated Factors among Women of Reproductive Age Group in Sunsari, Nepal
Authors: Punam Kumari Mandal, Namita Yangden, Bhumika Rai, Achala Niraula, Sabitra Subedi
Abstract:
Introduction: Family planning not only improves women's health but also promotes gender equality, better child health, and improved education outcomes, including poverty reduction. The objective of this study is to assess the utilization of family planning methods and associated factors in Sunsari, Nepal. Methodology: A cross-sectional analytical study was conducted among women of reproductive age (15-49 years) in Sunsari in 2020. Non-probability purposive sampling was used to collect information from 212 respondents through face-to-face interviews using a semi-structured interview schedule in ward no. 1 of Barju rural municipality. Data processing was done using SPSS (Statistics for Windows, version 17.0; SPSS Inc., Chicago, IL, USA). Descriptive analysis and inferential analysis (binary logistic regression) were used to find the association between the utilization of family planning methods and selected demographic variables. All variables with a P-value < 0.1 in the bivariate analysis were included in the multivariate analysis. A P-value < 0.05 was considered to indicate statistical significance at the 5% level. Results: This study showed that the mean age (± standard deviation) of the respondents was 26 ± 7.03 years, and 91.5% of respondents were married before the age of 20. Likewise, 67.5% of respondents use some method of family planning, and 55.2% of respondents obtain family planning services from a government health facility. Furthermore, education (AOR 1.579, CI 1.013-2.462), the husband's occupation (AOR 1.095, CI 0.744-1.610), the type of family (AOR 2.741, CI 1.210-6.210), and the number of living sons (AOR 0.259, CI 0.077-0.872) are factors associated with the utilization of family planning methods. Conclusion: This study concludes that two-thirds of reproductive-age women utilize family planning methods. Furthermore, education, the husband's occupation, the type of family, and the number of living sons are factors associated with utilization. This reflects that awareness through mass media, including behavioural communication, is needed to increase the utilization of family planning methods.
Keywords: family planning methods, utilization, factors, women, community
Procedia PDF Downloads 134
1879 Building Tutor and Tutee Pedagogical Agents to Enhance Learning in Adaptive Educational Games
Authors: Ogar Ofut Tumenayu, Olga Shabalina
Abstract:
This paper describes the application of two types of pedagogical agent technology with different functions in an adaptive educational game, with the sole aim of improving learning and enhancing interactivity in Digital Educational Games (DEG). This idea could help eliminate some problems of DEG, like isolation in game-based learning, by introducing tutor and tutee pedagogical agents. We present an analysis of a learning companion interacting in a peer-tutoring environment as a step toward improving social interactions in the educational game environment. We show that the tutor and tutee agents use different interventions and interactive approaches: the tutor agent tracks the learner's activities and infers the learning state, while the tutee agent initiates interactions with the learner at appropriate times and in appropriate manners. In order to provide motivation, prevent mistakes and clarify a game task, the tutor agent uses the help dialog tool to provide assistance, while the tutee agent provides collaborative assistance using the hint tool. We presented our idea in a prototype game called "Pyramid Programming Game", a 2D game developed using Libgdx. The game's Pyramid component symbolizes a programming task that is presented to the player in the form of a puzzle. During gameplay, the agents can instruct, direct, inspire, and communicate emotions. They can also rapidly alter the instructional pattern in response to the learner's performance and knowledge. The pyramid must be effectively destroyed in order to win the game. The game also teaches and illustrates the advantages of utilizing educational agents such as the TrA and TeA to assist and motivate students. Our findings support the idea that the functionality of a pedagogical agent should be dualized into an instructional agent and a learner's companion agent in order to enhance interactivity in a game-based environment.
Keywords: tutor agent, tutee agent, learner's companion interaction, agent collaboration
Procedia PDF Downloads 65
1878 Preventing Violent Extremism in Mozambique and Tanzania: A Survey to Measure Community Resilience
Authors: L. Freeman, D. Bax, V. K. Sapong
Abstract:
Community-based, preventative approaches to violent extremism may be effective and yet remain underutilised. In a realm where security approaches dominate, with the focus on countering violent extremism and combatting radicalisation, community resilience programming remains sparse. This paper presents a survey tool that aims to measure the risk and protective factors that can lead to violent extremism in Mozambique and Tanzania. Conducted in four districts in the Cabo Delgado region of Mozambique and one district in Pwani, Tanzania, the survey uses a combination of BRAVE-14, Afrocentric and context-specific questions in order to more fully understand community resilience opportunities and challenges in preventing and countering violent extremism. Developed in Australia and Canada to measure radicalisation risks in individuals and communities, BRAVE-14 is a tool not yet applied on the African continent. Given the emerging threat of Islamic extremism in Northern Mozambique and Eastern Tanzania, which both experience a combination of socio-political exclusion, resource marginalisation and religious/ideological motivations, the development of the survey is timely and fills a much-needed information gap in these regions. Not only have these Islamist groups succeeded in tapping into the grievances of communities by radicalising and recruiting individuals, but their presence in these regions has been characterised by extreme forms of violence, leaving isolated communities vulnerable to attack. The findings are expected to facilitate the contextualisation and comparison of the protective and risk factors that inhibit or promote the radicalisation of the youth in these communities. In identifying sources of resilience and vulnerability, this study emphasises the implementation of context-specific intervention programming and provides a strong research tool for understanding youth and community resilience to violent extremism.
Keywords: community resilience, Mozambique, preventing violent extremism, radicalisation, Tanzania
Procedia PDF Downloads 130
1877 A Survey on Compression Methods for Table Constraints
Authors: N. Gharbi
Abstract:
Constraint satisfaction problems are mathematical problems that are often used to model many real-world problems for which we ask whether there exists a solution satisfying all the constraints. Table constraints are important for modeling parts of many problems, since they list all combinations of allowed or forbidden values. However, they have practical limitations because they are sometimes too large to be represented in a direct way. In this paper, we present a survey of the different categories of approaches proposed to compress table constraints in order to reduce both space and time complexity.
Keywords: constraint programming, compression, data mining, table constraints
Procedia PDF Downloads 322
1876 A Mathematical Model to Select Shipbrokers
Authors: Y. Smirlis, G. Koronakos, S. Plitsos
Abstract:
Shipbrokers assist shipping companies in chartering or in selling and buying vessels, acting as intermediaries between them and the market. They facilitate deals by providing their expertise, negotiating skills, and knowledge of ship market bargains. Their role is very important, as it affects the profitability and market position of a shipping company. Due to this significant contribution, shipping companies have to employ systematic procedures to evaluate shipbrokers' services in order to select the best and, consequently, to achieve the best deals. Towards this end, in this paper we consider shipbrokers as financial service providers and formulate the problem of evaluating and selecting shipbrokers' services as a multi-criteria decision making (MCDM) procedure. The proposed methodology comprises a first normalization step, which adjusts the different scales and orientations of the criteria, and a second step that includes the mathematical model used to evaluate the performance of the shipbrokers' services involved in the assessment. The criteria along which the shipbrokers are assessed may refer to their size and reputation, the potential efficiency of the services, the terms and conditions imposed, the expenses (e.g., commission/brokerage), the expected time to accomplish a chartering or selling/buying task, etc., and in our modelling approach these criteria may be assigned different importance. The mathematical programming model performs a comparative assessment and estimates, for the shipbrokers involved in the evaluation, a relative score that ranks them in terms of their potential performance. To illustrate the proposed methodology, we present a case study in which a shipping company evaluates and selects the most suitable among a number of sale and purchase (S&P) brokers. Acknowledgment: This study is supported by the OptiShip project, implemented within the framework of the National Recovery Plan and Resilience "Greece 2.0" and funded by the European Union - NextGenerationEU programme.
Keywords: shipbrokers, multi-criteria decision making, mathematical programming, service-provider selection
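A minimal sketch of the normalization-and-scoring idea described above (the criteria values, weights and orientations are invented placeholders; the paper's actual mathematical programming model is not reproduced here):

```python
import numpy as np

# Rows: candidate shipbrokers; columns: criteria
# (reputation score, expected completion time in days, commission in %).
scores = np.array([[8.0, 30.0, 1.25],
                   [6.5, 20.0, 1.00],
                   [9.0, 45.0, 1.50]])
benefit = np.array([True, False, False])  # higher-is-better vs. lower-is-better
weights = np.array([0.5, 0.3, 0.2])       # e.g., elicited from the decision maker

# Min-max normalization that also aligns criterion orientation.
lo, hi = scores.min(axis=0), scores.max(axis=0)
norm = np.where(benefit, (scores - lo) / (hi - lo), (hi - scores) / (hi - lo))

# Simple weighted aggregate used as a relative performance score.
relative_score = norm @ weights
print(np.argsort(-relative_score))  # ranking of the shipbrokers
```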
Procedia PDF Downloads 87
1875 Using Scilab® as a New Introductory Method in Numerical Calculations and Programming for Computational Fluid Dynamics (CFD)
Authors: Nicoly Coelho, Eduardo Vieira Vilas Boas, Paulo Orestes Formigoni
Abstract:
Faced with the remarkable developments in the various segments of modern engineering brought about by increasing technological development, professionals of all educational areas need to overcome the difficulties faced by those who are starting their academic journey. Aiming to overcome these difficulties, this article provides an introduction to the basic study of numerical methods applied to fluid mechanics and thermodynamics, demonstrating modeling and simulation in substance, together with a detailed explanation of the fundamental numerical solution by the finite difference method, using SCILAB, free software that is easily accessible and can be used by any research center or university, anywhere, in developed and developing countries alike. Computational Fluid Dynamics (CFD) is known to be a necessary tool for engineers and professionals who study fluid mechanics; however, the teaching of this area of knowledge in undergraduate programs has faced difficulties due to software costs and the degree of difficulty of the mathematical problems involved, so the subject is often treated only in postgraduate courses. This work aims to bring low-cost CFD into the teaching of transport phenomena at the undergraduate level by analyzing a small classic case of fundamental thermodynamics with the Scilab® program. The study starts from the basic theory of the partial differential equation governing the heat transfer problem, which students need to master, and covers the discretization process based on Taylor series expansion, which generates a system of equations whose convergence is checked using the Sassenfeld criterion and which is finally solved by the Gauss-Seidel method. We demonstrate both simple problems solved manually and more complex problems that required computer implementation, for which we used a small algorithm of fewer than 200 lines in Scilab® to study heat transfer in a rectangular plate heated with different temperatures on each of its four sides, producing a two-dimensional simulation with colored graphics. With the spread of computer technology, numerous programs have emerged that demand strong programming skills from researchers. Considering that this ability to program CFD is the main problem to be overcome, both by students and by researchers, we present in this article a hint of the use of programs with a less complex interface, thus reducing the difficulty of producing graphical modeling and simulation for CFD and extending the programming experience to undergraduates.
Keywords: numerical methods, finite difference method, heat transfer, Scilab
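A minimal sketch of the heated-plate computation described above, written here in Python rather than Scilab for illustration (the grid size, boundary temperatures and tolerance are placeholder assumptions, not the article's values):

```python
import numpy as np

# Steady-state heat conduction (Laplace equation) on a rectangular plate,
# discretized with central finite differences and solved by Gauss-Seidel.
nx, ny = 40, 30
T = np.zeros((ny, nx))
T[0, :], T[-1, :], T[:, 0], T[:, -1] = 100.0, 0.0, 75.0, 50.0  # placeholder side temperatures

tol, max_iter = 1e-4, 10_000
for _ in range(max_iter):
    max_change = 0.0
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            # Gauss-Seidel: use already-updated neighbours in place.
            new = 0.25 * (T[i + 1, j] + T[i - 1, j] + T[i, j + 1] + T[i, j - 1])
            max_change = max(max_change, abs(new - T[i, j]))
            T[i, j] = new
    if max_change < tol:
        break

print("temperature at plate centre:", T[ny // 2, nx // 2])
```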
Procedia PDF Downloads 385
1874 Optimal Tamping for Railway Tracks, Reducing Railway Maintenance Expenditures by the Use of Integer Programming
Authors: Rui Li, Min Wen, Kim Bang Salling
Abstract:
For modern railways, maintenance is critical for ensuring safety, train punctuality and overall capacity utilization. The cost of railway maintenance in Europe is high, on average between 30,000 and 100,000 Euros per kilometre per year. In order to reduce such maintenance expenditures, this paper presents a mixed 0-1 linear mathematical model designed to optimize predictive railway tamping activities on ballasted track over a planning horizon of three to four years. The objective function minimizes the actual costs of the tamping machine. The approach uses a simple dynamic model of the condition-based tamping process and a solution method for finding the optimal condition-based tamping schedule. Seven technical and practical aspects are taken into account when scheduling tamping: (1) track degradation, expressed as the standard deviation of the longitudinal level over time; (2) track geometrical alignment; (3) track quality thresholds based on the train speed limits; (4) the dependency of track quality recovery on the track quality after the tamping operation; (5) tamping machine operating practices; (6) tamping budgets; and (7) the distinction between open track and station sections. A Danish railway track between Odense and Fredericia, 42.6 km in length, is used as a case study over planning periods of three and four years in the proposed maintenance model. The generated tamping schedule is reasonable and robust. Based on the results for the Danish railway corridor, the total costs can be reduced significantly (by 50%) compared with a previous model based on optimizing the number of tamping operations. Different maintenance strategies are discussed in the paper. The analysis of the results also shows that a longer predictive tamping planning period leads to more optimal scheduling of maintenance actions than continuous short-term preventive maintenance, namely yearly condition-based planning.
Keywords: integer programming, railway tamping, predictive maintenance model, preventive condition-based maintenance
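A sketch of the degradation-and-recovery dynamics such a model typically encodes (the linear degradation rate and the recovery form are assumptions used for illustration, not the paper's calibrated equations):

```latex
\sigma_{s}(t) = \sigma_{s}(t_{0}) + b_{s}\,(t - t_{0}),
\qquad
\sigma_{s}(t^{+}) = \sigma_{s}(t^{-}) - r\big(\sigma_{s}(t^{-})\big)\, x_{s,t},
\qquad x_{s,t} \in \{0,1\}
```

Here sigma_s(t) is the standard deviation of the longitudinal level of section s, b_s its degradation rate, x(s,t) the 0-1 decision to tamp section s in period t, and r(.) the condition-dependent recovery; the schedule must keep sigma_s below the speed-dependent quality threshold while respecting the tamping budget.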
Procedia PDF Downloads 441
1873 Procedure to Optimize the Performance of Chemical Laser Using the Genetic Algorithm Optimizations
Authors: Mohammedi Ferhate
Abstract:
This work presents details of a study of the entire flow inside the facility in which the exothermic chemical reaction process in the chemical laser cavity is analyzed. In the paper we describe the principles of chemical lasers, in which flow reversal is produced by chemical reactions, and we explain the device for converting chemical potential energy into laser energy. We observe that the phenomenon thus has an explosive trend. Finally, the feasibility and effectiveness of the proposed method are demonstrated by computer simulation.
Keywords: genetic, lasers, nozzle, programming
Procedia PDF Downloads 92
1872 Measurement of CES Production Functions Considering Energy as an Input
Authors: Donglan Zha, Jiansong Si
Abstract:
Because of its flexibility, the CES production function attracts much interest in economic growth and programming models and in macroeconomic or micro-macro models. This paper focuses on the development and estimation methods of CES production functions that consider energy as an input. We leave for future research the relaxation of the assumption of constant returns to scale, the introduction of potential input factors, and the generalization of the optimal nested form of multi-factor production functions.
Keywords: bias of technical change, CES production function, elasticity of substitution, energy input
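For reference, a standard three-input CES form with energy as an input, under the constant-returns assumption mentioned above (the notation is the conventional textbook one, not necessarily the paper's):

```latex
Y = A \left[ \delta_{K} K^{-\rho} + \delta_{L} L^{-\rho} + \delta_{E} E^{-\rho} \right]^{-\frac{1}{\rho}},
\qquad \delta_{K} + \delta_{L} + \delta_{E} = 1,
\qquad \sigma = \frac{1}{1+\rho}
```

Here K, L and E denote capital, labour and energy, the delta terms are distribution parameters, and sigma is the common elasticity of substitution; nested forms replace the single rho with different substitution parameters across nests.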
Procedia PDF Downloads 278
1871 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review
Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha
Abstract:
Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is not far-fetched; however, proper classification of this textual information in a given context has proved very difficult. As a result, we conducted a systematic review of previous literature on sentiment classification and the AI-based techniques that have been used, in order to gain a better understanding of how to design and develop a robust and more accurate sentiment classifier that can correctly classify social media text in a given context as hate speech or inverted compliments with a high level of accuracy, by assessing different artificial intelligence techniques. We evaluated over 250 articles from digital sources such as ScienceDirect, ACM, Google Scholar, and IEEE Xplore and whittled the number of studies down to 31. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources such as Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques like CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development due to its simplicity and AI-oriented library support. Based on some of the important findings from this study, we make recommendations for future research.
Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text
Procedia PDF Downloads 115
1870 The Impact of the Atypical Crisis on Educational Migration: Economic and Policy Challenges
Authors: Manana Lobzhanidze, Marine Kobalava, Lali Chikviladze
Abstract:
The global pandemic crisis has had a significant impact on educational migration, substantially limiting young people's access to education abroad. It has therefore become necessary to study the economic, demographic, social, cultural and other factors associated with educational migration, to identify the economic and political challenges of educational migration, and to develop recommendations. The aim of the research is to study the effects of the atypical crisis on educational migration and to make recommendations on effective migration opportunities based on the identification of economic and policy challenges in this area. Bibliographic research is used to assess the effects of the atypical crisis on educational migration as presented in the papers of various scholars. Against the background of the restrictions imposed during the COVID-19 pandemic, migration rates have been analyzed and the endogenous and exogenous factors affecting educational migration have been identified. Quantitative and qualitative research on students and graduates of the TSU Economics and Business Faculty was conducted; the results were processed with SPSS, and the factors hindering educational migration and the related challenges were identified. The Internet and digital technologies have been shown to play a vital role in alleviating the challenges posed by the COVID-19 pandemic; however, lack of Internet access and limited financial resources have played a disruptive role in the educational migration process. The analysis of the quantitative research materials revealed the problems of educational migration caused by the atypical crisis, while some issues were clarified during the focus group meetings. The theoretical and methodological approaches used during the research include bibliographic research, analysis, synthesis, comparison and selection-grouping. The article presents the consequences of the atypical crisis for educational migration, identifies the main economic and policy challenges in the field of educational migration, and develops appropriate recommendations to overcome them.
Keywords: educational migration, atypical crisis, economic-political challenges, educational migration factors
Procedia PDF Downloads 143
1869 Tool for Determining the Similarity between Two Web Applications
Authors: Doru Anastasiu Popescu, Raducanu Dragos Ionut
Abstract:
This paper presents a tool that measures the similarity between two websites. The websites are composed only of webpages created with HTML. The tool uses three ways of calculating the similarity between two websites, based on results already published: the first way compares all the webpages within a website, the second way compares a webpage with all the pages within the second website, and the third way compares two webpages. The Java programming language and technologies such as Spring, Jsoup and log4j were used for the implementation of the tool.
Keywords: Java, Jsoup, HTML, Spring
Procedia PDF Downloads 384
1868 Encephalon-An Implementation of a Handwritten Mathematical Expression Solver
Authors: Shreeyam, Ranjan Kumar Sah, Shivangi
Abstract:
Recognizing and solving handwritten mathematical expressions can be a challenging task, particularly when it comes to segmenting and classifying the individual characters. This project proposes a solution that uses a Convolutional Neural Network (CNN) and image processing techniques to accurately solve various types of equations, including arithmetic, quadratic, and trigonometric equations, as well as logical operations like AND, OR, NOT, NAND, XOR, and NOR. The proposed solution also provides a graphical solution, allowing users to visualize equations and their solutions. In addition to equation solving, the platform, called CNNCalc, offers a comprehensive learning experience for students: it provides educational content, a quiz platform, and a coding platform for practising programming skills in different languages like C, Python, and Java. This all-in-one solution makes the learning process engaging and enjoyable for students. The proposed methodology includes horizontal compact projection analysis for segmentation and binarization, as well as connected component analysis and integrated connected component analysis for character classification. The compact projection algorithm compresses the horizontal projections to remove noise and obtain a clearer image, contributing to the accuracy of character segmentation. Experimental results demonstrate the effectiveness of the proposed solution in solving a wide range of mathematical equations. CNNCalc provides a powerful and user-friendly platform for solving equations, learning, and practising programming skills. The platform utilizes a custom-designed CNN together with image processing techniques to accurately recognize and classify symbols within handwritten equations, and it features a user-friendly interface with a graphical representation of the equations being solved, making it an interactive and engaging learning experience. It also includes tutorials, testing capabilities, and programming features in languages such as C, Python, and Java, so users can track their progress and work towards improving their skills. With its comprehensive features and accurate results, CNNCalc is poised to revolutionize the way students learn and solve mathematical equations.
Keywords: AI, ML, handwritten equation solver, maths, computer, CNNCalc, convolutional neural networks
Procedia PDF Downloads 121
1867 Producing Graphical User Interface from Activity Diagrams
Authors: Ebitisam K. Elberkawi, Mohamed M. Elammari
Abstract:
The Graphical User Interface (GUI) is an essential part of any program, because GUI components provide the fundamental interaction between the user and the software. Greater attention must therefore be paid to the GUI during system design and development, and to the user, who is the cornerstone of any interaction with it. This paper introduces an approach for designing GUIs from one of the models used to describe the workflow behavior of a system: the activity diagram (AD).Keywords: activity diagram, graphical user interface, GUI components, program
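The abstract does not give the mapping rules; purely as a hypothetical illustration of deriving GUI elements from activity-diagram nodes (the paper's actual rules will differ), the sketch below maps assumed AD node types to widget descriptions.

```python
# Hypothetical illustration only: derive a list of GUI widgets from the node
# types of an activity diagram; the node kinds and widget mapping are assumptions.
from dataclasses import dataclass

@dataclass
class ADNode:
    name: str
    kind: str            # 'action', 'decision', 'initial', 'final'

# Assumed mapping from AD node types to GUI components.
WIDGET_FOR_KIND = {
    "action": "Button",          # an action the user triggers
    "decision": "RadioGroup",    # a branch the user chooses
    "initial": "Window",         # entry point of the workflow screen
    "final": "CloseButton",      # end of the workflow
}

def gui_from_activity_diagram(nodes):
    """Return (widget, label) pairs for every node that has a GUI counterpart."""
    return [(WIDGET_FOR_KIND[n.kind], n.name) for n in nodes if n.kind in WIDGET_FOR_KIND]

if __name__ == "__main__":
    diagram = [
        ADNode("Start order", "initial"),
        ADNode("Enter customer data", "action"),
        ADNode("Payment method?", "decision"),
        ADNode("Finish", "final"),
    ]
    for widget, label in gui_from_activity_diagram(diagram):
        print(f"{widget:12s} <- {label}")
```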
Procedia PDF Downloads 4621866 Pavement Management for a Metropolitan Area: A Case Study of Montreal
Authors: Luis Amador Jimenez, Md. Shohel Amin
Abstract:
Pavement performance models are based on projections of observed traffic loads, which makes long-term funding strategies uncertain if history does not repeat itself. Neural networks can be used to estimate deterioration rates, but the learning rate and momentum have not been properly investigated; in addition, economic developments could change traffic flows. This study addresses both issues through a case study of the roads of Montreal that simulates traffic for a period of 50 years and accounts for the measurement error of the pavement deterioration model. Travel demand models are applied to simulate annual average daily traffic (AADT) every 5 years. Accumulated equivalent single axle loads (ESALs) are calculated from the predicted AADT and locally observed truck distributions combined with truck factors. A backpropagation neural network (BPN) with a generalized delta rule (GDR) learning algorithm is applied to estimate pavement deterioration models capable of overcoming measurement errors. Linear programming for lifecycle optimization is then applied to identify M&R strategies that keep the pavement in good condition while minimizing the budget. It was found that CAD 150 million is the minimum annual budget needed to maintain good condition for arterial and local roads in Montreal. Montreal drivers prefer public transportation for work and education trips. Vehicle traffic is expected to double within 50 years, while the number of accumulated ESALs is expected to double every 15 years. Roads on the island of Montreal need to undergo a stabilization period of about 25 years, after which a steady state appears to be reached.Keywords: pavement management system, traffic simulation, backpropagation neural network, performance modeling, measurement errors, linear programming, lifecycle optimization
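As a rough numerical illustration of how accumulated ESALs can be projected from AADT, truck share, and truck factors (the study's travel demand models and actual factors are not reproduced here), the sketch below compounds AADT growth over 50 years and sums the yearly ESALs; all parameter values are illustrative assumptions.

```python
# Simplified sketch of ESAL accumulation from projected AADT; growth rate,
# truck share and truck factor below are illustrative assumptions, not the
# values used in the Montreal case study.
import numpy as np

def accumulated_esals(aadt0, growth_rate, truck_share, truck_factor, years):
    """Sum yearly ESALs: AADT * truck share * truck factor * 365, with compound AADT growth."""
    t = np.arange(years)
    aadt = aadt0 * (1.0 + growth_rate) ** t           # projected AADT per year
    yearly_esals = aadt * truck_share * truck_factor * 365.0
    return yearly_esals.cumsum()                       # accumulated ESALs per year

if __name__ == "__main__":
    esals = accumulated_esals(aadt0=20000, growth_rate=0.014,   # ~doubles in 50 years
                              truck_share=0.08, truck_factor=1.2, years=50)
    print(f"after 15 years: {esals[14]:.2e}, after 50 years: {esals[-1]:.2e}")
```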
Procedia PDF Downloads 4601865 A Sustainable Supplier Selection and Order Allocation Based on Manufacturing Processes and Product Tolerances: A Multi-Criteria Decision Making and Multi-Objective Optimization Approach
Authors: Ravi Patel, Krishna K. Krishnan
Abstract:
In global supply chains, appropriate and sustainable suppliers play a vital role in supply chain development and feasibility. In a large organization with a huge number of suppliers, it is necessary to categorize suppliers based on their past history of quality and delivery for each product category. Since the performance of any organization largely depends on its suppliers, well-evaluated selection criteria and decision-making models lead to improved supplier assessment and development. In this paper, the SCOR® performance evaluation approach and ISO standards are used to determine selection criteria for better supplier assessment, using a hybrid model of the Analytic Hierarchy Process (AHP) and the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS). AHP is used to determine the global weights of the criteria, which FTOPSIS then uses, together with triangular fuzzy set theory, to score each supplier. Both qualitative and quantitative criteria are taken into consideration in the proposed model. In addition, a multi-product, multi-period model is used for order allocation. The optimization model integrates multi-objective integer linear programming (MOILP) for order allocation with the hybrid approach for supplier selection. The proposed MOILP model optimizes order allocation based on manufacturing processes and product tolerances, as required by the manufacturer for product quality. The integrated model and solution approach are tested on different scenarios. The detailed analysis shows the superiority of the proposed model over solutions based on individual decision-making models.Keywords: AHP, fuzzy set theory, multi-criteria decision making, multi-objective integer linear programming, TOPSIS
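To make the AHP–TOPSIS pipeline concrete, the sketch below computes criterion weights from a pairwise comparison matrix with the geometric-mean (approximate eigenvector) method and then ranks suppliers with a crisp TOPSIS score. The paper itself uses triangular fuzzy numbers (FTOPSIS) and an MOILP allocation stage, which this simplification omits; the criteria and supplier data are made up for illustration.

```python
# Simplified sketch: AHP weights (geometric-mean method) + crisp TOPSIS ranking.
# The paper uses fuzzy TOPSIS with triangular fuzzy numbers; this omits the fuzzy step.
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Approximate the principal eigenvector with row geometric means."""
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[1])
    return gm / gm.sum()

def topsis_scores(decision: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
    """Closeness to the ideal solution; benefit[j] is True if higher is better for criterion j."""
    norm = decision / np.linalg.norm(decision, axis=0)         # vector normalization per column
    v = norm * weights                                          # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

if __name__ == "__main__":
    # Illustrative criteria: quality, delivery, cost (cost is a cost criterion).
    pairwise = np.array([[1.0, 3.0, 5.0],
                         [1/3, 1.0, 3.0],
                         [1/5, 1/3, 1.0]])
    w = ahp_weights(pairwise)
    suppliers = np.array([[0.95, 0.90, 120.0],   # supplier A
                          [0.90, 0.95, 100.0],   # supplier B
                          [0.85, 0.80,  90.0]])  # supplier C
    benefit = np.array([True, True, False])
    print("weights:", np.round(w, 3))
    print("scores :", np.round(topsis_scores(suppliers, w, benefit), 3))
```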
Procedia PDF Downloads 1691864 Radiation Safety Factor of Education and Research Institution in Republic of Korea
Authors: Yeo Ryeong Jeon, Pyong Kon Cho, Eun Ok Han, Hyon Chul Jang, Yong Min Kim
Abstract:
This study surveyed the awareness of radiation safety among radiation safety managers and workers employed in education and research institutions in the Republic of Korea. At present, South Korea has no radiation safety guidelines or manuals for education and research institutions; we therefore sought an educational basis for developing them. To assess the level of knowledge, attitude, and behavior regarding radiation safety, we used a questionnaire consisting of 29 questions on knowledge, attitude, and behavior and 4 questions on self-efficacy and expectation, organized around four factors (radiation source, human, organizational, and physical environment) of the Haddon matrix. Responses were collected between May 4 and June 30, 2015. We analyzed the questionnaire with IBM SPSS/WIN 15, a well-known statistical package for the social sciences, comparing the data by means, standard deviations, Pearson's correlation, ANOVA (analysis of variance), and regression analysis. A total of 180 copies of the questionnaire were returned from 60 workplaces. The overall mean behavior level was lower than the knowledge and attitude levels; in particular, the organizational environment factor of radiation safety management showed the lowest behavior level. Most factors were correlated in the Pearson's correlation analysis, especially knowledge and behavior of the human factor (Pearson's correlation coefficient 0.809, P<.01). When the analysis was performed by main radiation source type, institutions using only unsealed radioisotopes (RI) showed the lowest behavior level among all subjects. Finally, knowledge of the radiation source factor (β=0.556, P<.001) and of the human factor (β=0.376, P<.001) had the greatest impact on behavior in practice. Radiation safety managers and workers think positively about radiation safety management but are poorly informed about the organizational environment of their institutions. Thus, each institution needs to make an effort to establish a radiation safety culture, and pedagogical interventions to improve radiation safety knowledge are needed for accident prevention.Keywords: radiation safety management, factor analysis, SPSS, republic of Korea
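The statistical workflow described (descriptives, Pearson's correlation, one-way ANOVA, regression) was run in IBM SPSS; purely as an illustration of the same analyses outside SPSS, with made-up variable names and synthetic data rather than the study's data, the sketch below reproduces them with NumPy and SciPy.

```python
# Illustrative re-creation of the reported SPSS analyses (descriptives, Pearson
# correlation, one-way ANOVA, linear regression) on synthetic data; the variable
# names and values are hypothetical, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 180                                             # questionnaires returned
knowledge = rng.normal(4.0, 0.5, n)                 # 5-point scale scores
attitude  = rng.normal(4.2, 0.4, n)
behavior  = 0.6 * knowledge + 0.2 * attitude + rng.normal(0, 0.3, n)
source_type = rng.integers(0, 3, n)                 # assumed coding of main radiation source type

# Descriptives
print("behavior mean/SD:", behavior.mean().round(2), behavior.std(ddof=1).round(2))

# Pearson's correlation (e.g., knowledge vs. behavior)
r, p = stats.pearsonr(knowledge, behavior)
print(f"Pearson r = {r:.3f}, p = {p:.3g}")

# One-way ANOVA of behavior across main radiation source types
groups = [behavior[source_type == g] for g in range(3)]
f, p = stats.f_oneway(*groups)
print(f"ANOVA F = {f:.2f}, p = {p:.3g}")

# Multiple linear regression: behavior ~ knowledge + attitude
X = np.column_stack([np.ones(n), knowledge, attitude])
beta, *_ = np.linalg.lstsq(X, behavior, rcond=None)
print("regression coefficients (intercept, knowledge, attitude):", beta.round(3))
```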
Procedia PDF Downloads 363