Search results for: map optimization tool
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7850

3710 Lesbian Stereotype Representation in Cinema in Turkey

Authors: Hasan Gürkan, Rengin Ozan

Abstract:

Cinema, as a popular mass medium, shapes society's general perception of sexual identity. By establishing an interactive relationship between cinema and social reality, the study also asks what importance lesbian identity holds in social life as portrayed in films in Turkey. This article focuses on how women characters who identify as lesbian are depicted in Turkish cinema. The study addresses three questions: first, how are lesbian characters represented in films in Turkey? Second, how do the films reflect the reality of lesbian sexual identity? Third, what are the differences and similarities between the lesbian characters in films in Turkey before and after the 2000s? The films are analysed through sociological film interpretation. Comparing the films before 2000 and after 2000, many films contain no lesbian characters at all. In particular, almost all of the films from the 1960s onward (Haremde Dört Kadın, Ver Elini İstanbul, Dul Bir Kadın, Gramofon Avrat, Lola and Billidikid) only hinted indirectly at lesbian sexual identity. Only in the films Düş Gezginleri, İki Genç Kız and Nar are the women characters (identified as lesbian) in the leading roles, with the plots progressing around these characters.

Keywords: cinema in Turkey, lesbian identity, representation, stereotype

Procedia PDF Downloads 331
3709 Optimization of Sequential Thermophilic Bio-Hydrogen/Methane Production from Mono-Ethylene Glycol via Anaerobic Digestion: Impact of Inoculum to Substrate Ratio and N/P Ratio

Authors: Ahmed Elreedy, Ahmed Tawfik

Abstract:

This investigation assesses the effect of the inoculum to substrate ratio (ISR) and the nitrogen to phosphorous balance on simultaneous biohydrogen and methane production from anaerobic decomposition of mono-ethylene glycol (MEG). ISRs in the range of 2.65 to 13.23 gVSS/gCOD were applied, and N/P ratios from 4.6 to 8.5 were tested, both under thermophilic conditions (55°C). The maximum obtained methane and hydrogen yields (MY and HY) of 151.86±10.8 and 22.27±1.1 mL/gCODinitial were recorded at ISRs of 5.29 and 3.78 gVSS/gCOD, respectively. In contrast, the ammonification process, in terms of net ammonia produced, was found to depend on both the ISR and the COD/N ratio, reaching its peak value of 515.5±31.05 mgNH4-N/L at an ISR of 13.23 gVSS/gCOD and a COD/N ratio of 11.56. The optimum HY was enhanced more than 1.45-fold when the N/P ratio declined from 8.5 to 4.6, whereas the MY improved 1.6-fold when the N/P ratio increased from 4.6 to 5.5, with no significant further impact at an N/P ratio of 8.5. The results revealed that methane production was strongly influenced by initial ammonia rather than initial phosphate. Likewise, ammonia generation declined markedly from 535.25±41.5 to 238.33±17.6 mgNH4-N/L as the N/P ratio increased from 4.6 to 8.5. The modified Gompertz equation was successfully fitted to the experimental outputs (R² > 0.9761).
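
The modified Gompertz fit mentioned above can be sketched as follows; the time points, rate and lag values are illustrative assumptions (only the ~151.9 mL/gCOD ultimate methane yield echoes the abstract), not the paper's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def modified_gompertz(t, P, Rm, lam):
    """Cumulative yield at time t: P = ultimate yield (mL/gCOD),
    Rm = maximum production rate, lam = lag time."""
    return P * np.exp(-np.exp(Rm * np.e / P * (lam - t) + 1.0))

# Synthetic "experimental" curve built from assumed kinetic parameters.
t = np.linspace(0.0, 120.0, 25)
y = modified_gompertz(t, 151.9, 4.0, 10.0)

popt, _ = curve_fit(modified_gompertz, t, y, p0=(100.0, 1.0, 5.0))
residuals = y - modified_gompertz(t, *popt)
r2 = 1.0 - np.sum(residuals**2) / np.sum((y - y.mean())**2)
```

On real, noisy yield data the same call recovers P, Rm and lam and an R² directly comparable to the >0.9761 reported above.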

Keywords: mono-ethylene glycol, biohydrogen and methane, inoculum to substrate ratio, nitrogen to phosphorous balance, ammonification

Procedia PDF Downloads 378
3708 Designing an Online Case-Based Library for Technology Integration in Teacher Education

Authors: Mustafa Tevfik Hebebci, Sirin Kucuk, Ismail Celik, A. Oguz Akturk, Ismail Sahin, Fetah Eren

Abstract:

The purpose of this paper is to introduce an interactive online case-study library website developed within the scope of a national project. The design goal of the website is to provide an interactive, enhanced, case-based, online educational resource for educators. The ADDIE instructional design model was used in the development of the website. The library is built on a web-based platform, which is important for the manageability, accessibility, and updateability of its data. Users are able to sort the displayed case studies by title, date, rating, view count, etc. A usability test and expert opinion were used to evaluate the website. The website is a tool for integrating technology into education, and it is believed that it will benefit the professional development of both pre-service and in-service teachers.

Keywords: ADDIE, case-based library, design, technology integration

Procedia PDF Downloads 439
3707 Selection of Solid Waste Landfill Site Using Geographical Information System (GIS)

Authors: Fatih Iscan, Ceren Yagci

Abstract:

Rapid population growth, urbanization and industrialization are among the most important causes of environmental problems, and the disposal and management of solid waste is one of the most pressing of these. A central problem in solid waste management is selecting the best site for waste disposal. Lately, Geographical Information Systems (GIS) have been used to ease the selection of landfill areas. GIS can model the relevant economic, environmental and political constraints and therefore plays an important role as a decision support tool for landfill site selection. In this study, map layers are combined so as to minimize environmental, social and cultural impacts and to maximize engineering and economic suitability, and the use of GIS as a decision support mechanism for landfill site selection is demonstrated in a case study of the Güzelyurt district of Aksaray, Turkey.
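
A GIS-style weighted overlay of map layers, as described above, can be sketched in a few lines; the layer names, weights and the protected-zone mask are hypothetical placeholders, not values from the study:

```python
import numpy as np

# Hypothetical 5x5 criterion rasters scored 0 (unsuitable) to 1 (ideal).
rng = np.random.default_rng(0)
layers = {
    "distance_to_settlements": rng.random((5, 5)),
    "distance_to_water":       rng.random((5, 5)),
    "slope":                   rng.random((5, 5)),
    "road_access":             rng.random((5, 5)),
}
weights = {"distance_to_settlements": 0.35, "distance_to_water": 0.30,
           "slope": 0.20, "road_access": 0.15}

# Hard constraint: cells inside a protected zone are excluded outright.
protected = np.zeros((5, 5), dtype=bool)
protected[0, :] = True

suitability = sum(weights[k] * layers[k] for k in layers)
suitability[protected] = 0.0
best = np.unravel_index(np.argmax(suitability), suitability.shape)
```

In a real GIS workflow each raster would come from a spatial analysis layer (buffers, slope, land cover) rather than random numbers; the weighted-sum-plus-mask logic is the same.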

Keywords: GIS, landfill, solid waste, spatial analysis

Procedia PDF Downloads 354
3706 Creative Thinking through Mindful Practices: A Business Class Case Study

Authors: Malavika Sundararajan

Abstract:

This study introduces the use of mindfulness techniques in the classroom to make individuals aware of how the creative thinking process works, resulting in more constructive learning and application. A case observation method was utilized in a graduate class in the Business School. It entailed briefing the student participants on the use of a template called the dots and depths map, having them complete it for themselves, compare it with their team members' maps, and reflect on the outputs. Finally, they were debriefed about the use of the template and its value to their learning and creative application process. The major finding is the increase in the participants' awareness levels following the use of the template, leading to a subsequent pursuit of diverse knowledge, acquisition of relevant information, and avoidance of jumping directly to solutions, which increased their overall creative outputs for the given assignment. The significant value of this study is that the template can be applied in any classroom on any subject as a powerful mindfulness tool that increases creative problem solving through constructive knowledge building.

Keywords: connecting dots, mindful awareness, constructive knowledge building, learning creatively

Procedia PDF Downloads 142
3705 V0 Physics at LHCb. RIVET Analysis Module for Z Boson Decay to Di-Electron

Authors: A. E. Dumitriu

Abstract:

The LHCb experiment is situated at one of the four collision points around CERN's Large Hadron Collider. It is a single-arm forward spectrometer covering 10 mrad to 300 (250) mrad in the bending (non-bending) plane, designed primarily to study particles containing b and c quarks. Each of LHCb's sub-detectors specializes in measuring a different characteristic of the particles produced by colliding protons; notable detection capabilities include a high-precision tracking system and two ring-imaging Cherenkov detectors for particle identification. The two topics currently addressed are: the RIVET project (Robust Independent Validation of Experiment and Theory), an efficient and portable C++ toolkit providing a large collection of standard experimental analyses, useful for the development, validation, tuning and regression testing of Monte Carlo (MC) event generators; and a V0 analysis of 2013 LHCb NoBias data (trigger on bunch + bunch crossing) at √s = 2.76 TeV.

Keywords: LHCb physics, RIVET plug-in, RIVET, CERN

Procedia PDF Downloads 420
3704 Evaluating the Possibility of Expanding National Health Insurance Funding From Zakat, Sudan

Authors: Fawzia Mohammed Idris

Abstract:

Zakat is an Islamic procedure for wealth redistribution that serves as a social protection mechanism for needy people. This study aimed to assess the possibility of expanding the national health insurance fund's share of the zakat funds allocated to the poor, by comparing the poverty reduction that results from direct payments to the needy with that from covering them under social health insurance. The study used Stata regression as the statistical analysis tool. The findings show no significant relationship between the poverty rate as the main indicator, the number of poor people covered by national health insurance, and the number of poor people benefiting from the distribution of zakat funds. The study encountered many difficulties regarding the quality and consistency of the data, and it suggests a joint mission between the national health insurance fund and the zakat chamber to assess the efficient use of the zakat funds allocated to the poor.

Keywords: health finance, poverty, social health insurance, zakat

Procedia PDF Downloads 140
3703 Effect of Progressive Type-I Right Censoring on Bayesian Statistical Inference of Simple Step–Stress Acceleration Life Testing Plan under Weibull Life Distribution

Authors: Saleem Z. Ramadan

Abstract:

This paper discusses the effects of progressive Type-I right censoring on the design of a simple step-stress accelerated life test, using a Bayesian approach for Weibull life products under the cumulative exposure model. The optimization criterion is to minimize the expected pre-posterior variance of the pth percentile of the time to failure. The model variables are the stress changing time and the stress value for the first step. A comparison between conventional and progressive Type-I right censoring is provided. The results show that progressive Type-I right censoring reduces the cost of testing at the expense of test precision when the sample size is small. Moreover, using strong priors or a large sample size reduces the sensitivity of the test precision to the censoring proportion; in these cases progressive Type-I right censoring is recommended, as it reduces the cost of the test with little loss of precision. The results also show that the choice of direct or indirect priors affects the precision of the test.

Keywords: reliability, accelerated life testing, cumulative exposure model, Bayesian estimation, progressive type-I censoring, Weibull distribution

Procedia PDF Downloads 500
3702 Modeling of Building a Conceptual Scheme for Multimodal Freight Transportation Information System

Authors: Gia Surguladze, Nino Topuria, Lily Petriashvili, Giorgi Surguladze

Abstract:

The modeling of the processes of building a multimodal freight transportation support information system is discussed based on modern CASE technologies. The functional efficiency of ports in the eastern part of the Black Sea is analyzed taking into account their ecological, seasonal and resource-usage parameters. By resources, we mean the capacities of berths, cranes and automotive transport, as well as work crews and neighbouring airports. For designing the database of the computer support system for the managerial (logistics) function, the Object-Role Modeling (ORM) tool NORMA (Natural ORM Architecture) is proposed, from which the Entity Relationship Model (ERM) is generated in an automated process. The software is developed on a process-oriented and service-oriented architecture in the Visual Studio .NET environment.

Keywords: seaport resources, business-processes, multimodal transportation, CASE technology, object-role model, entity relationship model, SOA

Procedia PDF Downloads 425
3701 Chaos Fuzzy Genetic Algorithm

Authors: Mohammad Jalali Varnamkhasti

Abstract:

Genetic algorithms have been very successful in handling difficult optimization problems, but their fundamental problem is premature convergence. This paper presents a new fuzzy genetic algorithm that uses chaotic values instead of random values in the genetic algorithm's processes. In this algorithm, chaotic sequences are used for the initial population, and a new sexual selection is proposed for the selection mechanism. The population is divided so that males and females are selected in an alternating way, and the layout of the male and female chromosomes differs in each generation. A female chromosome is selected by tournament selection from the female group. The male chromosome is then selected, in order of preference, by: the maximum Hamming distance between the male chromosome and the female chromosome; the highest fitness value among male chromosomes (if more than one male chromosome has the maximum Hamming distance); or random selection. The crossover and mutation operators are chosen by fuzzy logic controllers, and the crossover and mutation probabilities are varied on the basis of the phenotype and genotype characteristics of the chromosome population. Computational experiments are conducted on the proposed techniques, and the results are compared with other operators, heuristics and local search algorithms commonly used for solving p-median problems in the literature.
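
The preference order for male selection described above (maximum Hamming distance, then highest fitness, then random) can be sketched as follows; the encoding of chromosomes as bit lists and the `fitness` lookup table are illustrative assumptions:

```python
import random

def hamming(a, b):
    """Number of differing positions between two equal-length chromosomes."""
    return sum(x != y for x, y in zip(a, b))

def select_male(males, fitness, female):
    """Pick the male at maximum Hamming distance from the female;
    break ties by highest fitness, then uniformly at random."""
    dmax = max(hamming(m, female) for m in males)
    candidates = [m for m in males if hamming(m, female) == dmax]
    fmax = max(fitness[tuple(m)] for m in candidates)
    candidates = [m for m in candidates if fitness[tuple(m)] == fmax]
    return random.choice(candidates)
```

Selecting the most distant male keeps mating pairs genetically diverse, which is exactly the mechanism the paper uses against premature convergence.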

Keywords: genetic algorithm, fuzzy system, chaos, sexual selection

Procedia PDF Downloads 379
3700 The Relationship between Transcendence and Psychological Well-Being: A Systematic Scientific Literature Review

Authors: Monir Ahmed

Abstract:

The main purpose of this literature review was to investigate existing quantitative clinical studies on the relationship between transcendence and psychological well-being. The primary objective is to determine whether existing studies adequately demonstrate the relationship between transcendence and psychological well-being, including spiritual well-being. A further objective is to examine whether the 'creatio ex nihilo' doctrine is necessary to understand transcendence and its relationship with psychological well-being. Systematic literature review methods were used: identifying studies through search engines, extracting data from the studies, and assessing their quality for the planned review. The outcome of the review indicates that self-transcendence (STa) and spiritual transcendence (STb) are positively related to psychological well-being. However, such positive relationships offer limited scope for understanding transcendence and its relationship with well-being. The findings support the need for further research in this area and reveal the importance of developing a new transcendence tool for determining an individual's ability to transcend and the relationship between that ability and psychological well-being. The author proposes that including the theological doctrine of 'creatio ex nihilo' in the understanding of transcendence and psychological well-being is crucial, necessary and unavoidable.

Keywords: transcendence, psychological well-being, self-transcendence, spiritual transcendence, ‘creatio ex nihilo’

Procedia PDF Downloads 128
3699 SAR and B₁ Considerations for Multi-Nuclear RF Body Coils

Authors: Ria Forner

Abstract:

Introduction: Due to increases in SNR at 7T and above, it becomes more favourable to make use of X-nuclear imaging. Integrated body coils tuned to 120 MHz for 31P, 79 MHz for 23Na, and 75 MHz for 13C at 7T were simulated with human male, female, and child body models to assess strategies for metabolic MR imaging in the body. Methods: B1 and SAR efficiencies in the heart, liver, spleen, and kidneys were assessed using numerical simulations over the three frequencies with phase shimming. Results: B1+ efficiency is highly variable over the different organs, particularly at the highest frequency; however, local SAR efficiency remains relatively constant over the frequencies in all subjects. Although the optimal phase settings vary, one generic phase setting can be identified for each frequency at which the B1+ penalty is at most 10%. Discussion: The simulations provide practical strategies for power optimization, B1 management, and maintaining safety. As expected, the B1 field is similar at 75 MHz and 79 MHz but reduced at 120 MHz. However, B1 remains relatively constant when normalised by the square root of the peak local SAR. This contradicts the generalized SAR considerations of 1H MRI at different field strengths, which are defined by global SAR instead. Conclusion: Although B1 decreases with frequency, SAR efficiency remains constant throughout the investigated frequency range. It is possible to shim the body coil to obtain up to 10% extra B1+ in a specific organ compared to the generic setting.
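
The SAR-efficiency metric used above, B1+ normalised by the square root of the peak local SAR, can be illustrated for a phase-only shim; the channel count, B1+ maps and SAR value below are made-up placeholders, not simulation results:

```python
import numpy as np

# Hypothetical per-channel complex B1+ maps (arbitrary units) over an organ
# ROI for an 8-channel body coil; values are placeholders, not simulation data.
rng = np.random.default_rng(7)
n_ch, n_vox = 8, 200
b1_maps = rng.random((n_ch, n_vox)) * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, (n_ch, n_vox)))

def shim_metrics(phases, peak_local_sar):
    """ROI-averaged |B1+| for a phase-only shim with unit drive amplitudes,
    and the SAR efficiency B1+ / sqrt(peak local SAR)."""
    w = np.exp(1j * np.asarray(phases))     # per-channel phase weights
    b1 = float(np.abs(w @ b1_maps).mean())
    return b1, b1 / np.sqrt(peak_local_sar)
```

Comparing `shim_metrics` for a generic phase setting against an organ-optimized one is the kind of trade-off the abstract quantifies as a maximum 10% B1+ penalty.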

Keywords: birdcage, multi-nuclear, B1 shimming, 7 Tesla MRI, liver, kidneys, heart, spleen

Procedia PDF Downloads 58
3698 The Use of Learning Management Systems during Emerging the Tacit Knowledge

Authors: Ercan Eker, Muhammer Karaman, Akif Aslan, Hakan Tanrikuluoglu

Abstract:

A deficiency in institutional memory and knowledge management can result in information security breaches, loss of prestige and trustworthiness and, worst of all, the loss of know-how and institutional knowledge. Traditional learning management within organizations is generally handled through personal effort, and that kind of effort mostly depends on personal desire, motivation and institutional belonging. Even if an organization has highly motivated employees at a certain time, the life cycle of its institutional knowledge and memory will generally remain limited to the time those employees spend in the organization. Having a learning management system can sustain an organization's institutional memory, knowledge and know-how. Learning management systems are especially needed in public organizations, where job rotation is frequent and managers are appointed periodically. However, a learning management system should not be seen as merely an organization's website; it is a more comprehensive, interactive and user-friendly knowledge management tool. In this study, the importance of using learning management systems in the process of emerging tacit knowledge is underlined.

Keywords: knowledge management, learning management systems, tacit knowledge, institutional memory

Procedia PDF Downloads 374
3697 Assessing Firm Readiness to Implement Cloud Computing: Toward a Comprehensive Model

Authors: Seyed Mohammadbagher Jafari, Elahe Mahdizadeh, Masomeh Ghahremani

Abstract:

Nowadays almost all organizations depend on information systems to run their businesses. Investment in information systems, and in maintaining them so that they can always best support the firm's business, is a major issue for every organization. The concept of cloud computing was developed as a technical and economic model to address this issue. In cloud computing, the computing resources, including networks, applications, hardware and services, are configured as needed and are available at the moment of request. However, migration to the cloud is not an easy task, and many issues should be taken into account. This study provides a comprehensive model for assessing a firm's readiness to implement cloud computing. Through a systematic literature review, four dimensions of readiness were extracted: technological, human, organizational and environmental. Each dimension comprises various criteria, which are discussed in detail. The model provides a framework for cloud computing readiness assessment, and organizations that intend to migrate to the cloud can use it to assess their readiness before making any decision on cloud implementation.

Keywords: cloud computing, human readiness, organizational readiness, readiness assessment model

Procedia PDF Downloads 390
3696 Multi-Criteria Decision Support System for Modeling of Civic Facilities Using GIS Applications: A Case Study of F-11, Islamabad

Authors: Asma Shaheen Hashmi, Omer Riaz, Khalid Mahmood, Fahad Ullah, Tanveer Ahmad

Abstract:

Urban landscapes are changing with population growth and advances in new technologies, and urban sprawl patterns and land uses are related to local socioeconomic and physical conditions. Urban policy decisions are executed mostly through spatial planning. A decision support system (DSS) is a very powerful tool that provides a flexible, knowledge-based method for urban planning. An application was developed using a geographical information system (GIS) for urban planning. A scenario-based DSS was developed to integrate hierarchical multi-criteria data on different aspects of the urban landscape: the physical environment, the dumping site, the spatial distribution of the road network, gas and water supply lines, urban watershed management, and selection criteria for new residential, recreational, commercial and industrial sites. The model provides a framework to incorporate sustainable future development, and data can be entered dynamically by planners according to the appropriate criteria for the management of urban landscapes.

Keywords: urban, GIS, spatial, criteria

Procedia PDF Downloads 631
3695 Speeding Up Lenia: A Comparative Study Between Existing Implementations and CUDA C++ with OpenGL Interop

Authors: L. Diogo, A. Legrand, J. Nguyen-Cao, J. Rogeau, S. Bornhofen

Abstract:

Lenia is a system of cellular automata with continuous states, space and time, which surprises not only with the emergence of interesting life-like structures but also with its beauty. This paper reports ongoing research on a GPU implementation of Lenia using CUDA C++ and OpenGL Interoperability. We demonstrate how CUDA as a low-level GPU programming paradigm allows optimizing performance and memory usage of the Lenia algorithm. A comparative analysis through experimental runs with existing implementations shows that the CUDA implementation outperforms the others by one order of magnitude or more. Cellular automata hold significant interest due to their ability to model complex phenomena in systems with simple rules and structures. They allow exploring emergent behavior such as self-organization and adaptation, and find applications in various fields, including computer science, physics, biology, and sociology. Unlike classic cellular automata which rely on discrete cells and values, Lenia generalizes the concept of cellular automata to continuous space, time and states, thus providing additional fluidity and richness in emerging phenomena. In the current literature, there are many implementations of Lenia utilizing various programming languages and visualization libraries. However, each implementation also presents certain drawbacks, which serve as motivation for further research and development. In particular, speed is a critical factor when studying Lenia, for several reasons. Rapid simulation allows researchers to observe the emergence of patterns and behaviors in more configurations, on bigger grids and over longer periods without annoying waiting times. Thereby, they enable the exploration and discovery of new species within the Lenia ecosystem more efficiently. Moreover, faster simulations are beneficial when we include additional time-consuming algorithms such as computer vision or machine learning to evolve and optimize specific Lenia configurations. 
We developed a Lenia implementation for GPU using the C++ and CUDA programming languages, and CUDA/OpenGL Interoperability for immediate rendering. The goal of our experiment is to benchmark this implementation against the existing ones in terms of speed, memory usage, configurability and scalability. In our comparison we focus on the most important Lenia implementations, selected for their prominence, accessibility and widespread use in the scientific community. The implementations include MATLAB, JavaScript, ShaderToy GLSL, Jupyter, Rust and R. The list is not exhaustive but provides a broad view of the principal current approaches and their respective strengths and weaknesses. Our comparison primarily considers computational performance and memory efficiency, as these factors are critical for large-scale simulations, but we also investigate the ease of use and configurability. The experimental runs conducted so far demonstrate that the CUDA C++ implementation outperforms the other implementations by one order of magnitude or more. The benefits of using the GPU become apparent especially with larger grids and convolution kernels. However, our research is still ongoing. We are currently exploring the impact of several software design choices and optimization techniques, such as convolution with Fast Fourier Transforms (FFT), various GPU memory management scenarios, and the trade-off between speed and accuracy using single versus double precision floating point arithmetic. The results will give valuable insights into the practice of parallel programming of the Lenia algorithm, and all conclusions will be thoroughly presented in the conference paper. The final version of our CUDA C++ implementation will be published on GitHub and made freely accessible to the ALife community for further development.
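
Lenia's update rule (convolution with a ring kernel followed by a growth mapping) can be sketched on the CPU with NumPy; the kernel shape and growth parameters below are common illustrative choices, not the exact ones benchmarked in this work:

```python
import numpy as np

def ring_kernel(n, radius):
    """Smooth, normalized ring kernel on an n x n torus, returned in
    Fourier space ready for FFT-based convolution."""
    y, x = np.ogrid[-n // 2: n // 2, -n // 2: n // 2]
    r = np.hypot(x, y) / radius
    core = np.clip(r * (1.0 - r), 1e-9, None)   # clip avoids overflow outside the ring
    K = np.where((r > 0.0) & (r < 1.0), np.exp(4.0 - 1.0 / core), 0.0)
    K /= K.sum()
    return np.fft.fft2(np.fft.ifftshift(K))

def lenia_step(A, K_fft, mu=0.15, sigma=0.015, dt=0.1):
    """One Lenia update: convolve the state with the kernel, apply a
    Gaussian growth mapping, integrate and clip back to [0, 1]."""
    U = np.real(np.fft.ifft2(np.fft.fft2(A) * K_fft))
    G = 2.0 * np.exp(-((U - mu) ** 2) / (2.0 * sigma ** 2)) - 1.0
    return np.clip(A + dt * G, 0.0, 1.0)
```

A simulation is just repeated calls to `lenia_step`; this per-step convolution is the part the CUDA implementation moves onto the GPU, where FFT and element-wise kernels parallelize naturally.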

Keywords: artificial life, cellular automaton, GPU optimization, Lenia, comparative analysis

Procedia PDF Downloads 32
3694 A Cloud-Based Spectrum Database Approach for Licensed Shared Spectrum Access

Authors: Hazem Abd El Megeed, Mohamed El-Refaay, Norhan Magdi Osman

Abstract:

Spectrum scarcity is a challenging obstacle for wireless communications systems. It hinders the introduction of innovative wireless services and technologies that require larger bandwidth than legacy technologies. In addition, the current worldwide allocation of radio spectrum bands is already congested and cannot accommodate additional squeezing or optimization for new wireless technologies. This challenge results from the accumulated contributions of different factors, which are discussed later in this paper. One of these factors is the radio spectrum allocation policy currently governed by national regulatory authorities. Under this policy, a specified portion of the radio spectrum is allocated to a particular wireless service provider on an exclusive-utilization basis, according to technical specifications determined by the standards bodies of each Radio Access Technology (RAT). Dynamic spectrum access, by contrast, is a framework for the flexible utilization of radio spectrum resources in which there is no exclusive allocation, and even public safety agencies can share their spectrum bands according to a governing policy and service level agreements. In this paper, we explore different methods for accessing the spectrum dynamically and their associated implementation challenges.

Keywords: licensed shared access, cognitive radio, spectrum sharing, spectrum congestion, dynamic spectrum access, spectrum database, spectrum trading, reconfigurable radio systems, opportunistic spectrum allocation (OSA)

Procedia PDF Downloads 416
3693 A Condition-Based Maintenance Policy for Multi-Unit Systems Subject to Deterioration

Authors: Nooshin Salari, Viliam Makis

Abstract:

In this paper, we propose a condition-based maintenance policy for multi-unit systems that accounts for economic dependency among units. We consider a system composed of N identical units, where each unit deteriorates independently. The deterioration process of each unit is modeled as a three-state continuous-time homogeneous Markov chain with two working states and a failure state. The average production rate of the units varies between working states, and the demand rate of the system is constant. Units are inspected at equidistant time epochs, and the decision to perform maintenance is determined by the number of units in the failure state: if this number exceeds a critical level, maintenance is initiated, failed units are replaced correctively, and units in the deteriorated state are maintained preventively. Our objective is to determine the optimal number of failed units at which to initiate maintenance, minimizing the long-run expected average cost per unit time. The problem is formulated and solved in the semi-Markov decision process (SMDP) framework. A numerical example demonstrates the proposed policy, and a comparison with the corrective maintenance policy is presented.
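
The inspection-and-threshold policy described above can be sketched with a discrete-time approximation of the per-unit Markov chain; the transition probabilities and threshold below are illustrative assumptions, not the paper's parameters:

```python
import random

# Illustrative three-state model per unit: 0 = good, 1 = degraded, 2 = failed.
# P approximates the continuous-time chain over one inspection interval.
P = [[0.90, 0.08, 0.02],
     [0.00, 0.85, 0.15],
     [0.00, 0.00, 1.00]]   # failure is absorbing until maintenance

def step(states):
    """Advance every unit one inspection interval."""
    out = []
    for s in states:
        r, cum = random.random(), 0.0
        for nxt, p in enumerate(P[s]):
            cum += p
            if r < cum:
                out.append(nxt)
                break
    return out

def inspect_and_maintain(states, threshold):
    """If failed units reach the threshold, replace failed units correctively
    and restore degraded units preventively (the proposed joint action)."""
    if sum(s == 2 for s in states) >= threshold:
        return [0] * len(states), True
    return states, False
```

The SMDP optimization in the paper then searches over `threshold` for the value that minimizes the long-run average cost; the simulation above only illustrates the policy's dynamics.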

Keywords: reliability, maintenance optimization, semi-Markov decision process, production

Procedia PDF Downloads 158
3692 Analysis of Influence of Geometrical Set of Nozzles on Aerodynamic Drag Level of a Hero’s Based Steam Turbine

Authors: Mateusz Paszko, Miroslaw Wendeker, Adam Majczak

Abstract:

High-temperature waste energy offers a number of management options. The most common energy recuperation systems currently used to utilize energy from high-temperature sources are steam turbines working in closed high-pressure, high-temperature cycles. Due to the high production costs of energy recuperation systems, especially rotary turbine discs equipped with blades, currently used solutions are of limited use with waste energy sources below 100 °C. This study presents the results of simulating the flow of water vapor in various configurations of flow ducts in a reaction steam turbine based on Hero's steam turbine. The simulation was performed using a numerical model and the ANSYS Fluent software. The computations were conducted with water vapor as the internal agent powering the turbine, which is fully safe for the environment in case of a device failure. The conclusions resulting from the numerical computations should allow the flow duct geometries to be optimized in order to achieve the greatest possible efficiency of the turbine. The obtained results are expected to be useful for further work on the final version of a low-drag steam turbine dedicated to low-cost energy recuperation systems.

Keywords: energy recuperation, CFD analysis, waste energy, steam turbine

Procedia PDF Downloads 204
3691 Leveraging Digital Transformation Initiatives and Artificial Intelligence to Optimize Readiness and Simulate Mission Performance across the Fleet

Authors: Justin Woulfe

Abstract:

Siloed logistics and supply chain management systems throughout the Department of Defense (DoD) have led to disparate approaches to modeling and simulation (M&S), a lack of understanding of how one system impacts the whole, and issues with “optimal” solutions that are good for one organization but have dramatic negative impacts on another. Many different systems have evolved to try to understand and account for uncertainty and to reduce the consequences of the unknown. As the DoD undertakes expansive digital transformation initiatives, there is an opportunity to fuse and leverage traditionally disparate data into a centrally hosted source of truth. With a streamlined process incorporating machine learning (ML) and artificial intelligence (AI), advanced M&S will enable informed decisions guiding program success via optimized operational readiness and improved mission success. One of the current challenges is to leverage the terabytes of data generated by monitored systems to provide actionable information for all levels of users. The implementation of a cloud-based application analyzing data transactions, learning and predicting future states from current and past states in real time, and communicating those anticipated states is an appropriate solution for the purposes of reduced latency and improved confidence in decisions. Decisions made from an ML and AI application combined with advanced optimization algorithms will improve the mission success and performance of systems, which will improve the overall cost and effectiveness of any program. The Systecon team constructs and employs model-based simulations, cutting across traditional silos of data, aggregating maintenance and supply data, incorporating sensor information, and applying optimization and simulation methods to an as-maintained digital twin, with the ability to aggregate results across a system’s lifecycle and across logical and operational groupings of systems.
This coupling of data throughout the enterprise enables tactical, operational, and strategic decision support, detachable and deployable logistics services, and configuration-based automated distribution of digital technical and product data to enhance supply and logistics operations. As a complete solution, this approach significantly reduces program risk by allowing flexible configuration of data, data relationships, business process workflows, and early test and evaluation, especially budget trade-off analyses. A true capability to tie resources (dollars) to weapon system readiness, in alignment with the real-world scenarios a warfighter may experience, has to date been an unrealized objective. By developing and solidifying an organic capability to directly relate dollars to readiness and to inform the digital twin, the decision-maker is empowered with valuable insight and traceability. This type of educated decision-making provides an advantage over adversaries who struggle to maintain system readiness at an affordable cost. The M&S capability developed allows program managers to independently evaluate system design and support decisions by quantifying their impact on operational availability and on operations and support cost, resulting in the ability to simultaneously optimize readiness and cost. This allows stakeholders to make data-driven decisions when trading cost and readiness throughout the life of the program. Finally, sponsors can validate product deliverables efficiently and with much higher accuracy than in previous years.

Keywords: artificial intelligence, digital transformation, machine learning, predictive analytics

Procedia PDF Downloads 154
3690 Technical and Economic Evaluation of Harmonic Mitigation from Offshore Wind Power Plants by Transmission Owners

Authors: A. Prajapati, K. L. Koo, F. Ghassemi, M. Mulimakwenda

Abstract:

In the UK, as the volume of non-linear loads connected to the transmission grid continues to rise steeply, harmonic distortion levels on the transmission network are becoming a serious concern for network owners and system operators. This paper outlines the findings of a study conducted to verify the proposal that harmonic mitigation could be optimized and managed economically and effectively at the transmission network level by the Transmission Owner (TO) instead of by the individual polluters connected to the grid. Harmonic mitigation studies were conducted on selected regions of the transmission network in England for recently connected offshore wind power plants in order to strategize and optimize selected harmonic filter options. The results – filter volume and capacity – were then compared against the mitigation measures adopted by the individual connections. Estimation ratios were developed based on the actual installed and the optimal proposed filters, and were then used to derive harmonic filter requirements for future contracted connections. The study concluded that a saving of 37% in filter volume/capacity could be achieved if the TO centrally manages harmonic mitigation instead of individual polluters installing their own mitigation solutions.

Keywords: C-type filter, harmonics, optimization, offshore wind farms, interconnectors, HVDC, renewable energy, transmission owner

Procedia PDF Downloads 154
3689 Multiresolution Mesh Blending for Surface Detail Reconstruction

Authors: Honorio Salmeron Valdivieso, Andy Keane, David Toal

Abstract:

In the area of mechanical reverse engineering, processes often encounter difficulties capturing small, highly localized surface information. This could be the case if a physical turbine were 3D scanned for lifecycle management or robust design purposes, with interest in eroded areas or scratched coatings. The limitation is partly due to insufficient automated frameworks for handling localized surface information during the reverse engineering pipeline. We have developed a tool for blending surface patches with arbitrary irregularities into a base body (e.g. a CAD solid). The approach aims to transfer small surface features while preserving their shape and relative placement by using a multi-resolution scheme and rigid deformations. Automating this process enables the inclusion of outsourced surface information in CAD models, including samples prepared in mesh handling software or raw scan information discarded in the early stages of reverse engineering reconstruction.

Keywords: application lifecycle management, multiresolution deformation, reverse engineering, robust design, surface blending

Procedia PDF Downloads 136
3688 Estimating Gait Parameter from Digital RGB Camera Using Real Time AlphaPose Learning Architecture

Authors: Murad Almadani, Khalil Abu-Hantash, Xinyu Wang, Herbert Jelinek, Kinda Khalaf

Abstract:

Gait analysis is used by healthcare professionals as a tool to gain a better understanding of movement impairment and to track progress. In most circumstances, it is preferable to monitor patients in their real-life environments with low-cost equipment such as cameras and wearable sensors. Inertial sensors, however, cannot provide enough information on angular dynamics. This research offers a method for tracking 2D joint coordinates using cutting-edge vision algorithms and a single RGB camera. We provide an end-to-end comprehensive deep learning pipeline for marker-less gait parameter estimation, which, to our knowledge, has never been done before. To make our pipeline function in real time for real-world applications, we leverage the AlphaPose human posture prediction model and a deep learning transformer. We tested our approach on the well-known GPJATK dataset and obtained promising results.
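To illustrate how 2D joint coordinates from a pose estimator such as AlphaPose can yield a gait parameter, the sketch below computes the angle at a joint from three keypoints (hip, knee, ankle). The coordinates and function name are illustrative assumptions, not part of the authors' pipeline.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 2D keypoints a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp for safety
    return math.degrees(math.acos(cos_t))

# Hypothetical hip, knee, ankle pixel coordinates from one video frame:
hip, knee, ankle = (320, 240), (330, 330), (325, 420)
angle = joint_angle(hip, knee, ankle)  # near-straight leg, close to 180 deg
```

Tracking such angles frame by frame gives the angular dynamics that, per the abstract, inertial sensors alone cannot fully capture.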

Keywords: gait analysis, human pose estimation, deep learning, real time gait estimation, AlphaPose, transformer

Procedia PDF Downloads 113
3687 Identify Users Behavior from Mobile Web Access Logs Using Automated Log Analyzer

Authors: Bharat P. Modi, Jayesh M. Patel

Abstract:

The mobile Internet is acting as a major source of data. As the number of web pages continues to grow, the mobile web provides data miners with just the right ingredients for extracting information. In order to cater to this growing need, a special term, Mobile Web mining, was coined. Mobile Web mining makes use of data mining techniques to decipher potentially useful information from web data. Web usage mining deals with understanding the behavior of users by making use of mobile web access logs that are generated on the server while the user is accessing the website. A web access log comprises various entries such as the name of the user, his IP address, the number of bytes transferred, a time-stamp, etc. A variety of log analyzer tools exist which help in analyzing such things as users' navigational patterns and the parts of the website users are most interested in. The present paper makes use of one such log analyzer tool, Mobile Web Log Expert, for ascertaining the behavior of users who access an astrology website. It also provides a comparative study of a few available log analyzer tools.
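The log entries described (user, IP address, bytes transferred, time-stamp) follow the shape of the standard Common Log Format. A minimal sketch of how an analyzer extracts those fields from one entry — the sample line and field names are hypothetical, not taken from the paper's dataset:

```python
import re

# Common Log Format: host ident user [timestamp] "request" status bytes
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def parse_entry(line):
    """Return a dict of named fields, or None if the line is malformed."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

# Hypothetical entry from an astrology site's access log:
sample = ('203.0.113.7 - alice [10/Oct/2023:13:55:36 +0000] '
          '"GET /horoscope/aries HTTP/1.1" 200 2326')
entry = parse_entry(sample)
```

Aggregating the `path` field over many parsed entries is what lets a tool report which parts of the site users visit most.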

Keywords: mobile web access logs, web usage mining, web server, log analyzer

Procedia PDF Downloads 356
3686 Forecasting Cancers Cases in Algeria Using Double Exponential Smoothing Method

Authors: Messis A., Adjebli A., Ayeche R., Talbi M., Tighilet K., Louardiane M.

Abstract:

Cancers are the second leading cause of death worldwide. The prevalence and incidence of cancers are increasing with aging and population growth. This study aims to model and predict the evolution of breast, colorectal, lung, bladder, and prostate cancers over the period 2014-2019. Data were analyzed using time series analysis with the double exponential smoothing method to forecast the future pattern. To describe and fit the appropriate models, Minitab statistical software version 17 was used. Between 2014 and 2019, the raw number of new cancer cases registered showed a steadily increasing trend. Our forecast model is validated by its good prediction for 2020; data were not available for 2021 and 2022. Time series analysis showed that double exponential smoothing is an efficient tool to model future data on the raw number of new cancer cases.
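Double exponential smoothing (Holt's linear method) maintains a smoothed level and trend and extrapolates both for the forecast. A minimal sketch, with smoothing constants and yearly case counts that are purely illustrative, not the study's registry data:

```python
def double_exponential_forecast(series, alpha, beta, horizon):
    """Holt's linear (double exponential) smoothing; returns an
    h-step-ahead forecast extrapolating the final level and trend."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

# Illustrative (not real registry) yearly case counts, 2014-2019:
cases = [1100, 1180, 1265, 1350, 1440, 1530]
forecast = double_exponential_forecast(cases, alpha=0.8, beta=0.2, horizon=3)
```

Because the method carries a trend term, it continues the upward pattern in the data, which is why it suits a series of steadily rising case counts.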

Keywords: cancer, time series, prediction, double exponential smoothing

Procedia PDF Downloads 81
3685 Mathematical Models for GMAW and FCAW Welding Processes for Structural Steels Used in the Oil Industry

Authors: Carlos Alberto Carvalho Castro, Nancy Del Ducca Barbedo, Edmilsom Otoni Côrrea

Abstract:

With the increase in oil production and the rapid expansion of gas transmission lines, medium and large transport industries have had to adapt in order to meet manufacturing demand in this fabrication segment. In this context, two welding processes have been used most extensively: GMAW (Gas Metal Arc Welding) and FCAW (Flux Cored Arc Welding). In this work, welds using these processes were carried out in the flat position on ASTM A-36 carbon steel plates in order to make a comparative evaluation of their mechanical and metallurgical properties. A statistical tool based on technical analysis and design of experiments (DOE), from the Minitab software, was adopted. For these analyses, the voltage, current, and welding speed in both processes were varied. As a result, it was observed that the welds from the two processes have different characteristics with respect to metallurgical properties and performance, but both present good weldability and satisfactory mechanical strength, and mathematical models were developed for them.
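A DOE of the kind described — varying voltage, current, and welding speed — is commonly run as a two-level full factorial, which can be enumerated directly. The factor levels below are hypothetical, not the values used in the paper:

```python
from itertools import product

# Hypothetical low/high levels for the three varied welding factors:
factors = {
    "voltage_V": (24, 30),
    "current_A": (180, 220),
    "speed_mm_s": (6, 10),
}

def full_factorial(factors):
    """Enumerate every combination of factor levels (2^k runs here)."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

runs = full_factorial(factors)  # 2 * 2 * 2 = 8 treatment combinations
```

Fitting a regression to the responses measured over such runs is what yields the kind of mathematical models the abstract reports.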

Keywords: Flux Cored Arc Welding (FCAW), Gas Metal Arc Welding (GMAW), Design of Experiments (DOE), mathematical models

Procedia PDF Downloads 555
3684 Finite Volume Method Simulations of GaN Growth Process in MOVPE Reactor

Authors: J. Skibinski, P. Caban, T. Wejrzanowski, K. J. Kurzydlowski

Abstract:

In the present study, numerical simulations of heat and mass transfer during the gallium nitride growth process in the Metal Organic Vapor Phase Epitaxy reactor AIX-200/4RF-S are addressed. Existing knowledge about the phenomena occurring in the MOVPE process allows the production of high quality nitride-based semiconductors; however, the process parameters of MOVPE reactors can vary within certain ranges. The main goal of this study is optimization of the process and improvement of the quality of the obtained crystal. In order to investigate this subject, a series of computer simulations has been performed. Numerical simulations of heat and mass transfer in the GaN epitaxial growth process have been carried out to determine the growth rate for various mass flow rates and pressures of reagents. Since it is impossible to determine experimentally the exact distribution of heat and mass transfer inside the reactor during the process, modeling is the only way to understand the process precisely. The main heat transfer mechanisms during the MOVPE process are convection and radiation. Correlation of the modeling results with experiment allows the determination of optimal process parameters for obtaining crystals of the highest quality.

Keywords: Finite Volume Method, semiconductors, epitaxial growth, metalorganic vapor phase epitaxy, gallium nitride

Procedia PDF Downloads 391
3683 Optimization of Bio-Diesel Production from Rubber Seed Oils

Authors: Pawit Tangviroon, Apichit Svang-Ariyaskul

Abstract:

Rubber seed oil is an attractive alternative feedstock for biodiesel production because it is not part of the food chain. Rubber seed oil contains a large amount of free fatty acids, which causes problems in biodiesel production because free fatty acids can react with the alkaline catalyst. Acid esterification is therefore used as a pre-treatment to convert the unwanted compounds into biodiesel. Phase separation of oil and methanol occurs at low methanol-to-oil ratios and causes low reaction rates and conversion. Acid esterification thus requires a large excess of methanol to increase the miscibility of methanol in the oil, and accordingly involves a more expensive separation process. In this work, the kinetics of the esterification of rubber seed oil with methanol is developed from available experimental results. A reactive distillation process was designed using the Aspen Plus program. The effects of operating parameters such as feed ratio, molar reflux ratio, feed temperature, and feed stage are investigated in order to find the optimum conditions. Results show that the reactive distillation process is better than the conventional process: it consumes less feed methanol and less energy while yielding higher product purity. This work can be used as a guideline for further development of biodiesel production using reactive distillation at industrial scale.

Keywords: biodiesel, reactive distillation, rubber seed oil, transesterification

Procedia PDF Downloads 339
3682 Maximizing Bidirectional Green Waves for Major Road Axes

Authors: Christian Liebchen

Abstract:

Both from an environmental perspective and with respect to road traffic flow quality, planning so-called green waves along major road axes is a well-established target for traffic engineers. For one-way road axes (e.g. the Avenues in Manhattan), this is a trivial downstream task. For bidirectional arterials, the well-known necessary condition for establishing a green wave in both directions is that the driving times between two subsequent crossings must be an integer multiple of half of the cycle time of the signal programs at the nodes. In this paper, we propose an integer linear optimization model to establish fixed-time green waves in both directions that are as long and as wide as possible, even in the situation where the driving time condition is not fulfilled. In particular, we are considering an arterial along whose nodes separate left-turn signal groups are realized. In our computational results, we show that scheduling left-turn phases before or after the straight phases can reduce waiting times along the arterial. Moreover, we show that there is always a solution with green waves in both directions that are as long and as wide as possible, where absolute priority is put on just one direction. Compared to optimizing both directions together, establishing an ideal green wave into one direction can only provide suboptimal quality when considering prioritized parts of a green band (e.g., first few seconds).
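The necessary condition stated above — each inter-crossing driving time an integer multiple of half the cycle time — can be checked directly before any optimization is attempted. A minimal sketch; the cycle time, tolerance, and driving times are hypothetical values, not data from the paper:

```python
def green_wave_feasible(driving_times, cycle_time, tol=1.0):
    """Necessary condition for an ideal bidirectional green wave:
    every driving time between subsequent crossings is (within tol
    seconds) a positive integer multiple of half the cycle time."""
    half = cycle_time / 2.0
    for t in driving_times:
        k = round(t / half)
        if k == 0 or abs(t - k * half) > tol:
            return False
    return True

# Hypothetical arterial: common 90 s cycle, driving times in seconds
# between the four subsequent crossings.
ok = green_wave_feasible([45.0, 90.5, 134.2], cycle_time=90)
```

When this check fails, the condition cannot be met exactly, and an integer programming model of the kind the paper proposes is needed to trade off the length and width of the green bands in the two directions.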

Keywords: traffic light coordination, synchronization, phase sequencing, green waves, integer programming

Procedia PDF Downloads 109
3681 Effects of Various Wavelet Transforms in Dynamic Analysis of Structures

Authors: Seyed Sadegh Naseralavi, Sadegh Balaghi, Ehsan Khojastehfar

Abstract:

Time history dynamic analysis of structures is considered an exact method but is computationally intensive. Filtering earthquake strong ground motions by means of the wavelet transform is one approach to reducing computational effort, particularly in the optimization of structures against seismic effects. Wavelet transforms are categorized into continuous and discrete transforms. Since earthquake strong ground motion is a discrete function, the discrete wavelet transform is applied in the present paper. The wavelet transform reduces analysis time by filtering out non-effective frequencies of the strong ground motion. The filtration process may be repeated several times, although each approximation introduces additional error. In this paper, the strong ground motion has been filtered once with each wavelet. The strong ground motion of the Northridge earthquake is filtered applying various wavelets, and dynamic analysis of sampled shear and moment frames is implemented. The error associated with each wavelet is computed by comparing the dynamic responses of the sampled structures with the exact responses, which are obtained by dynamic analysis of the structures applying the non-filtered strong ground motion.
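As a minimal sketch of the single filtration step described — one level of a discrete wavelet transform, keeping the approximation (low-frequency) coefficients and discarding the details — the example below uses the Haar wavelet, the simplest member of the wavelet families the paper compares. The toy acceleration record is illustrative, not Northridge data:

```python
import math

def haar_lowpass(signal):
    """One-level Haar DWT: compute the approximation coefficients,
    then reconstruct with the detail coefficients zeroed, filtering
    out the highest-frequency half of the record."""
    if len(signal) % 2:
        signal = signal + [signal[-1]]  # pad to even length
    approx = [(a + b) / math.sqrt(2)
              for a, b in zip(signal[::2], signal[1::2])]
    # Inverse transform with all detail coefficients set to zero:
    out = []
    for c in approx:
        v = c / math.sqrt(2)
        out.extend([v, v])
    return out

record = [0.0, 0.2, -0.1, 0.4, 0.3, -0.2, 0.1, 0.0]  # toy samples
filtered = haar_lowpass(record)  # pairwise averages: half the frequency content
```

Repeating this step, as the abstract notes, removes more high-frequency content at each pass and so accumulates approximation error in the computed structural response.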

Keywords: wavelet transform, computational error, computational duration, strong ground motion data

Procedia PDF Downloads 372