Search results for: static analysis tools
30035 Lateral Torsional Buckling of Steel Thin-Walled Beams with Lateral Restraints
Authors: Ivan Balázs, Jindřich Melcher
Abstract:
Metal thin-walled members have been widely used in the building industry. Usually, they are utilized as purlins, girts or ceiling beams. Due to the slenderness of thin-walled cross-sections, these structural members are prone to stability problems (e.g. flexural buckling, lateral torsional buckling). If buckling is not constructionally prevented, their resistance is limited by the buckling strength. In practice, planar members of roof or wall cladding can be attached to thin-walled members. These elements reduce displacements of the thin-walled members and therefore increase their buckling strength. If this effect is taken into account in the static assessment, more economical sections of thin-walled members might be utilized and certain savings of material might be achieved. This paper focuses on the problem of determining the critical load of steel thin-walled beams with continuous lateral restraint, which is crucial for the lateral torsional buckling assessment.
Keywords: beam, buckling, numerical analysis, stability, steel
Procedia PDF Downloads 330
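For orientation, the textbook elastic critical moment for the idealized unrestrained case (doubly symmetric section, fork supports, uniform moment) is the usual reference value in lateral torsional buckling assessments; the continuous lateral restraint studied in the paper raises the critical load above this baseline. The expression below is the standard formula, not the authors' derivation:

```latex
% Elastic critical moment for lateral torsional buckling of a doubly symmetric,
% simply supported (fork support) beam under uniform bending moment.
%   E, G : Young's and shear moduli      I_z : minor-axis second moment of area
%   I_t  : St. Venant torsion constant   I_w : warping constant      L : span
M_{\mathrm{cr}} = \frac{\pi^{2} E I_{z}}{L^{2}}
  \sqrt{\frac{I_{w}}{I_{z}} + \frac{L^{2} G I_{t}}{\pi^{2} E I_{z}}}
```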
30034 Advances in Design Decision Support Tools for Early-stage Energy-Efficient Architectural Design: A Review
Authors: Maryam Mohammadi, Mohammadjavad Mahdavinejad, Mojtaba Ansari
Abstract:
The main driving forces for the increasing movement towards the design of High-Performance Buildings (HPB) are building codes and rating systems that address the various components of the building and their impact on the environment and energy conservation through various methods like prescriptive methods or simulation-based approaches. The methods and tools developed to meet these needs, which are often based on building performance simulation tools (BPST), have limitations in terms of compatibility with the integrated design process (IDP) and HPB design, as well as use by architects in the early stages of design (when the most important decisions are made). To overcome these limitations, efforts have been made in recent years to develop Design Decision Support Systems, which are often based on artificial intelligence. Numerous needs and steps for designing and developing a Decision Support System (DSS) that complies with the early stages of energy-efficient architecture design (consisting of combinations of different methods in an integrated package) have been listed in the literature. While various review studies have been conducted in connection with each of these techniques (such as optimization, sensitivity and uncertainty analysis, etc.) and their integration with specific targets, this article is a critical and holistic review of the research that leads to the development of applicable systems or the introduction of a comprehensive framework for developing models that comply with the IDP. Information resources such as Science Direct and Google Scholar are searched using specific keywords, and the results are divided into two main categories: simulation-based DSSs and meta-simulation-based DSSs. The strengths and limitations of different models are highlighted, two general conceptual models are introduced for each category, and the degree of compliance of these models with the IDP framework is discussed. The research shows a movement towards Multi-Level of Development (MOD) models, well combined with the early stages of integrated design (schematic design stage and design development stage), which are heuristic, hybrid and meta-simulation-based, and which rely on big real data (like Building Energy Management Systems data or web data). Obtaining, using and combining these data with simulation data to create models that account for higher uncertainty and are more dynamic and more sensitive to context and culture, as well as models that can generate economy- and energy-efficient design scenarios using local data (to be more harmonized with circular economy principles), are important research areas in this field. The results of this study are a roadmap for researchers and developers of these tools.
Keywords: integrated design process, design decision support system, meta-simulation based, early stage, big data, energy efficiency
Procedia PDF Downloads 162
30033 Water Management of Polish Agriculture and Adaptation to Climate Change
Authors: Dorota M. Michalak
Abstract:
The agricultural sector, due to the growing demand for food and the over-exploitation of the natural environment, contributes to the deepening of climate change on the one hand, while on the other hand shrinking freshwater resources, as a negative effect of climate change, threaten the food security of every country. Therefore, adaptation measures to climate change should take into account effective water management and seek solutions ensuring food production at an unchanged or higher level, while not burdening the environment and not contributing to the worsening of the negative consequences of climate change. The problems of Poland's water management result not only from relatively small natural water resources but, to a large extent, from the low efficiency of their use. Appropriate agricultural practices and state solutions in this field can contribute to achieving significant benefits in terms of economical water management in agriculture, providing a greater amount of water that could also be used for other purposes, including purposes related to environmental protection. The aim of the article is to determine the level of use of water resources in Polish agriculture and the advancement of measures aimed at adapting Polish agriculture to climate change in the field of water management. The study provides knowledge about Polish legal regulations and water management tools, the shaping of the water policy of Polish agriculture against the background of EU countries and other sources of energy, and measures supporting Polish agricultural holdings in the effective management of water resources run by state budget institutions. In order to achieve the above-mentioned goals, the author used research tools such as the analysis of existing sources and a survey conducted among five groups of entities, i.e. agricultural advisory centers and departments; agricultural, rural and environmental protection departments; regional water management boards; provincial agricultural chambers; and the agency for the restructuring and modernization of agriculture. The main conclusion of the analyses carried out is the low use of water in Polish agriculture in relation to other EU countries, other sources of intake in Poland, as well as irrigation. The analysis also reveals another problem, namely the lack of reporting and data collection, which is extremely important from the point of view of the effectiveness of adaptation measures to climate change. The results obtained from the survey indicate a very low level of support from government institutions for the implementation of adaptation measures to climate change and for the water management of Polish farms. Some of the basic problems of the climate change adaptation policy with regard to water management in Polish agriculture include a lack of knowledge regarding climate change, the possibilities of adapting, the available tools and the ways to rationalize the use of water resources. They also include the lack of orderly procedures and of a clear division of responsibility with the proper territorial units, non-functioning channels of information flow, and low practical effects.
Keywords: water management, adaptation policy, agriculture, climate change
Procedia PDF Downloads 142
30032 Evotrader: Bitcoin Trading Using Evolutionary Algorithms on Technical Analysis and Social Sentiment Data
Authors: Martin Pellon Consunji
Abstract:
Due to the rise in popularity of Bitcoin and other crypto assets as a store of wealth and speculative investment, there is an ever-growing demand for automated trading tools, such as bots, in order to gain an advantage over the market. Traditionally, trading in the stock market was done by professionals with years of training who understood patterns and exploited market opportunities in order to gain a profit. However, nowadays a larger portion of market participants are at minimum aided by market-data processing bots, which can generally generate more stable signals than the average human trader. The rise in trading bot usage can be accredited to the inherent advantages that bots have over humans in terms of processing large amounts of data, lack of emotions of fear or greed, and predicting market prices using past data and artificial intelligence; hence, a growing number of approaches have been brought forward to tackle this task. However, the general limitation of these approaches still comes down to the fact that limited historical data doesn’t always determine the future, and that a lot of market participants are still human, emotion-driven traders. Moreover, developing markets such as those of the cryptocurrency space have even less historical data to interpret than most other well-established markets. Due to this, some human traders have gone back to the tried-and-tested traditional technical analysis tools for exploiting market patterns and simplifying the broader spectrum of data that is involved in making market predictions. This paper proposes a method which uses neuro-evolution techniques on both sentiment data and the more traditionally human-consumed technical analysis data in order to gain a more accurate forecast of future market behavior and account for the way both automated bots and human traders affect the market prices of Bitcoin and other cryptocurrencies. This study’s approach uses evolutionary algorithms to automatically develop increasingly improved populations of bots which, by using the latest inflows of market analysis and sentiment data, evolve to efficiently predict future market price movements. The effectiveness of the approach is validated by testing the system in a simulated historical trading scenario, a real Bitcoin market live trading scenario, and by testing its robustness in other cryptocurrency and stock market scenarios. Experimental results during a 30-day period show that this method outperformed the buy-and-hold strategy by over 260% in terms of net profits, even when taking into consideration standard trading fees.
Keywords: neuro-evolution, Bitcoin, trading bots, artificial neural networks, technical analysis, evolutionary algorithms
Procedia PDF Downloads 123
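A minimal sketch of the kind of neuro-evolution loop described in the Evotrader abstract above, assuming a toy setup: a small feed-forward network maps technical and sentiment features to a trading position, and a population of weight vectors is evolved against simulated returns. Feature meanings, network size, mutation scheme, and the fitness definition are illustrative assumptions, not the authors' implementation; a real backtest would also include trading fees and out-of-sample validation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 4        # e.g. two technical indicators + two sentiment scores (assumed)
HIDDEN = 8
POP_SIZE = 50
GENERATIONS = 30
N_PARENTS = POP_SIZE // 5

def init_genome():
    # Flat weight vector encoding a one-hidden-layer network.
    return rng.normal(0.0, 0.5, size=N_FEATURES * HIDDEN + HIDDEN)

def signal(genome, features):
    # Decode the genome into weights and produce a position in [-1, 1] per time step.
    w1 = genome[: N_FEATURES * HIDDEN].reshape(N_FEATURES, HIDDEN)
    w2 = genome[N_FEATURES * HIDDEN :]
    return np.tanh(np.tanh(features @ w1) @ w2)

def fitness(genome, features, returns):
    # Simulated net profit of holding the signalled position each step (toy fitness).
    positions = signal(genome, features)
    return float(np.sum(positions[:-1] * returns[1:]))

# Synthetic data standing in for technical/sentiment features and asset returns.
T = 500
features = rng.normal(size=(T, N_FEATURES))
returns = 0.002 * features[:, 0] + rng.normal(0.0, 0.01, size=T)

population = [init_genome() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    ranked = sorted(population, key=lambda g: fitness(g, features, returns), reverse=True)
    parents = ranked[:N_PARENTS]                          # truncation selection (elitism)
    idx = rng.integers(0, N_PARENTS, size=POP_SIZE - N_PARENTS)
    children = [parents[i] + rng.normal(0.0, 0.1, size=parents[i].shape) for i in idx]
    population = parents + children                       # Gaussian mutation of parents

best = max(population, key=lambda g: fitness(g, features, returns))
print("best simulated net profit:", round(fitness(best, features, returns), 4))
```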
30031 Fast Generation of High-Performance Driveshafts: A Digital Approach to Automated Linked Topology and Design Optimization
Authors: Willi Zschiebsch, Alrik Dargel, Sebastian Spitzer, Philipp Johst, Robert Böhm, Niels Modler
Abstract:
In this article, we investigate an approach that digitally links individual development process steps by using the drive shaft of an aircraft engine as a representative example of a fiber polymer composite. Such high-performance, lightweight composite structures have many adjustable parameters that influence the mechanical properties. Only a combination of optimal parameter values can lead to energy efficient lightweight structures. The development tools required for the Engineering Design Process (EDP) are often isolated solutions, and their compatibility with each other is limited. A digital framework is presented in this study, which allows individual specialised tools to be linked via the generated data in such a way that automated optimization across programs becomes possible. This is demonstrated using the example of linking geometry generation with numerical structural analysis. The proposed digital framework for automated design optimization demonstrates the feasibility of developing a complete digital approach to design optimization. The methodology shows promising potential for achieving optimal solutions in terms of mass, material utilization, eigenfrequency, and deformation under lateral load with less development effort. The development of such a framework is an important step towards promoting a more efficient design approach that can lead to stable and balanced results.Keywords: digital linked process, composite, CFRP, multi-objective, EDP, NSGA-2, NSGA-3, TPE
Procedia PDF Downloads 76
30030 Control of Hybrid System Using Fuzzy Logic
Authors: Faiza Mahi, Fatima Debbat, Mohamed Fayçal Khelfi
Abstract:
This paper proposes a control approach using a fuzzy logic system. More precisely, the study focuses on the improvement of user service, in terms of analysis and control of a transportation system, with respect to passengers' waiting times at exchange platforms. Many studies have addressed this problem in the literature, and many control tools have been proposed. In this paper, we focus on the use of the fuzzy logic technique to control the system during its evolution in order to minimize the arrival gap of connected transportation means at passenger exchange points. An illustrative example is worked out and the obtained results are reported. An important area of research is the modeling and simulation of the ordering system. We describe an approach to its analysis using fuzzy logic. The hybrid simulator, developed with a Matlab toolbox, consists of the calculation of the waiting time for each transportation mode.
Keywords: Fuzzy logic, Hybrid system, Waiting Time, Transportation system, Control
Procedia PDF Downloads 555
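The controller in the abstract above is built with a Matlab fuzzy toolbox; purely as an illustration of the underlying mechanics, the sketch below implements one Mamdani-style inference step in Python that maps the arrival gap between two connected transport modes to a recommended hold time. The membership functions, rule base, and variable names are assumed for the example, not taken from the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_hold_time(gap_min):
    """Mamdani-style inference: arrival gap (min) -> recommended hold time (min)."""
    # Fuzzify the input with assumed linguistic terms.
    small = tri(gap_min, -1.0, 0.0, 5.0)
    medium = tri(gap_min, 3.0, 7.0, 11.0)
    large = tri(gap_min, 9.0, 14.0, 20.0)

    # Output universe and assumed output term membership functions.
    hold = np.linspace(0.0, 10.0, 101)
    short_hold = tri(hold, -1.0, 0.0, 4.0)
    medium_hold = tri(hold, 2.0, 5.0, 8.0)
    long_hold = tri(hold, 6.0, 10.0, 12.0)

    # Rule base: small gap -> short hold, medium -> medium, large -> long (min-max).
    aggregated = np.maximum.reduce([
        np.minimum(small, short_hold),
        np.minimum(medium, medium_hold),
        np.minimum(large, long_hold),
    ])

    # Centroid defuzzification.
    if aggregated.sum() == 0.0:
        return 0.0
    return float((hold * aggregated).sum() / aggregated.sum())

print(fuzzy_hold_time(3.0))  # a ~3 minute arrival gap yields a short-to-moderate hold
```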
30029 Pose-Dependency of Machine Tool Structures: Appearance, Consequences, and Challenges for Lightweight Large-Scale Machines
Authors: S. Apprich, F. Wulle, A. Lechler, A. Pott, A. Verl
Abstract:
Large-scale machine tools for the manufacturing of large workpieces, e.g. blades, casings or gears for wind turbines, feature pose-dependent dynamic behavior. Small structural damping coefficients lead to long decay times for structural vibrations that have negative impacts on the production process. Typically, these vibrations are handled by increasing the stiffness of the structure by adding mass. That is counterproductive to the needs of sustainable manufacturing, as it leads to higher resource consumption both in material and in energy. Recent research activities have led to higher resource efficiency through radical mass reduction, relying on control-integrated active vibration avoidance and damping methods. These control methods depend on information describing the dynamic behavior of the controlled machine tools in order to tune the avoidance or reduction method parameters according to the current state of the machine. The paper presents the appearance, consequences and challenges of the pose-dependent dynamic behavior of lightweight large-scale machine tool structures in production. The paper starts with the theoretical introduction of the challenges of lightweight machine tool structures resulting from reduced stiffness. The statement of the pose-dependent dynamic behavior is corroborated by the results of the experimental modal analysis of a lightweight test structure. Afterwards, the consequences of the pose-dependent dynamic behavior of lightweight machine tool structures for the use of active control and vibration reduction methods are explained. Based on the state of the art on pose-dependent dynamic machine tool models and the modal investigation of an FE model of the lightweight test structure, the criteria for a pose-dependent model for use in vibration reduction are derived. The paper closes with an outlook on the approach for a general pose-dependent model of the dynamic behavior of large lightweight machine tools that provides the necessary input to the aforementioned vibration avoidance and reduction methods to properly tackle machine vibrations.
Keywords: dynamic behavior, lightweight, machine tool, pose-dependency
Procedia PDF Downloads 459
30028 Effect of Cost Control and Cost Reduction Techniques in Organizational Performance
Authors: Babatunde Akeem Lawal
Abstract:
In any organization, the primary aim is to maximize profit, but a major challenge facing organizations is the increase in the cost of operation. This drives up the cost of production and makes cost control and cost reduction schemes inevitable, which in turn makes it difficult for most organizations to operate at the cost-efficient frontier. The study aims to critically examine and evaluate the application of cost control and cost reduction to organizational performance and also to review the budget as an effective tool of cost control and cost reduction. A descriptive survey research design was adopted. A total of 40 retrieved responses were used for the study. The analysis of the data collected was undertaken by applying appropriate statistical tools. Regression analysis was used to test the hypotheses with the use of SPSS. Based on the findings, it was evident that cost control has a positive impact on organizational performance and that the style of management also has a positive impact on organizational performance.
Keywords: organization, cost reduction, cost control, performance, budget, profit
Procedia PDF Downloads 603
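As a rough analogue of the hypothesis test described above (a regression of organizational performance on cost-control scores, run in SPSS in the study), the same analysis could be reproduced with statsmodels; the column names and data below are placeholders, not the study's survey.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder survey data: Likert-style scores from 40 respondents (illustrative only).
rng = np.random.default_rng(1)
n = 40
df = pd.DataFrame({
    "cost_control": rng.integers(1, 6, n),   # 1-5 agreement score (assumed scale)
    "mgmt_style": rng.integers(1, 6, n),
})
df["performance"] = (0.6 * df["cost_control"] + 0.3 * df["mgmt_style"]
                     + rng.normal(0, 0.8, n))

# OLS: performance ~ cost_control + mgmt_style (analogue of the SPSS regression).
X = sm.add_constant(df[["cost_control", "mgmt_style"]])
model = sm.OLS(df["performance"], X).fit()
print(model.summary())   # coefficients, t-tests and p-values for the hypotheses
```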
30027 Creating Growth and Reducing Inequality in Developing Countries
Authors: Rob Waddle
Abstract:
We study an economy with weak justice and security systems, with weak public policy and regulation or little capacity to implement them, and with high barriers to profitable sectors. We look at growth and development opportunities based on the derived demand. We show that there is hope for such an economy to grow and to generate a win-win situation for all stakeholders if the derived demand is supplied. We then investigate conditions that could stimulate the supply of the derived demand. We show that a little knowledge of public, private and international expenditures in the economy, together with academic tools, is enough to trigger the supply of the derived demand. Our model can serve as guidance to donors and NGOs working in developing countries, and it shows the media that the best way to help is to share information about existing and accessible opportunities. It can also provide direction to vocational schools and universities, which could focus more on providing tools to seize existing opportunities.
Keywords: growth, development, monopoly, oligopoly, inequality
Procedia PDF Downloads 335
30026 Retrofitting of Asymmetric Steel Structure Equipped with Tuned Liquid Column Dampers by Nonlinear Finite Element Modeling
Authors: A. Akbarpour, M. R. Adib Ramezani, M. Zhian, N. Ghorbani Amirabad
Abstract:
One way to improve the performance of structures against earthquakes is passive control, which requires no external power source. In this research, tuned liquid column dampers, which are among the systems capable of transferring energy between various modes of vibration, are used. For the first time, a liquid column damper for structural vibration control is presented. After modeling this structure in building design software, performing the static and dynamic analyses, and obtaining the necessary parameters for the design of the tuned liquid column damper, the whole structure is analyzed in finite element software. The tuned liquid column dampers are installed on the structure, and nonlinear time-history analysis is carried out for two cases: the structure with and without dampers. Finally, the seismic behavior of the building in the two cases is examined. In this study, nonlinear time-history analysis was performed on a twelve-story steel structure equipped with dampers, subjected to earthquake records including Loma Prieta, Northridge, Imperial Valley, Petrolia and Landers. The comparison between the two cases shows that these dampers reduced the lateral displacement and acceleration of the stories by an average of 10%. Roof displacement and acceleration were also reduced by 5% and 12%, respectively. Due to the structural asymmetry in plan, the maximum displacements at the perimeter points of the structure as well as twisting were studied. The results show that the dampers lead to a 10% reduction in the maximum response at the perimeter points of the structure's stories. At the same time, placing the dampers reduced the twisting of the floor plan of the structure, and the base shear of the structure in the different earthquakes was also reduced by an average of 6%.
Keywords: retrofitting, passive control, tuned liquid column damper, finite element analysis
Procedia PDF Downloads 414
30025 Efficient Estimation of Maximum Theoretical Productivity from Batch Cultures via Dynamic Optimization of Flux Balance Models
Authors: Peter C. St. John, Michael F. Crowley, Yannick J. Bomble
Abstract:
Production of chemicals from engineered organisms in a batch culture typically involves a trade-off between productivity, yield, and titer. However, strategies for strain design typically involve designing mutations to achieve the highest yield possible while maintaining growth viability. Such approaches tend to follow the principle of designing static networks with minimum metabolic functionality to achieve desired yields. While these methods are computationally tractable, optimum productivity is likely achieved by a dynamic strategy, in which intracellular fluxes change their distribution over time. One can use multi-stage fermentations to increase either productivity or yield. Such strategies would range from simple manipulations (aerobic growth phase, anaerobic production phase) to more complex genetic toggle switches. Additionally, some computational methods can also be developed to aid in optimizing two-stage fermentation systems. One can assume an initial control strategy (i.e., a single reaction target) in maximizing productivity, but it is unclear how close this productivity would come to a global optimum. The calculation of maximum theoretical yield in metabolic engineering can help guide strain and pathway selection for static strain design efforts. Here, we present a method for the calculation of the maximum theoretical productivity of a batch culture system. This method follows the traditional assumptions of dynamic flux balance analysis: that internal metabolite fluxes are governed by a pseudo-steady state and external metabolite fluxes are represented by a dynamic system including Michaelis-Menten or Hill-type regulation. The productivity optimization is achieved via dynamic programming and accounts explicitly for an arbitrary number of fermentation stages and flux variable changes. We have applied our method to succinate production in two common microbial hosts: E. coli and A. succinogenes. The method can be further extended to calculate the complete productivity versus yield Pareto surface. Our results demonstrate that nearly optimal yields and productivities can indeed be achieved with only two discrete flux stages.
Keywords: A. succinogenes, E. coli, metabolic engineering, metabolite fluxes, multi-stage fermentations, succinate
Procedia PDF Downloads 215
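A highly simplified sketch of the two-stage idea discussed above: a batch culture switches from a growth-oriented flux distribution to a production-oriented one at time t_s, and the switch time is optimized for overall volumetric productivity. The reduced two-mode kinetic model and its parameters are assumptions for illustration; the paper's method operates on genome-scale flux balance models with dynamic programming rather than on this toy ODE.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize_scalar

# Toy two-mode batch model: biomass X, substrate S, product P (all assumed values).
MU_MAX, KS = 0.4, 0.5      # growth-phase kinetics
QP_MAX = 0.3               # specific production rate in production phase
Y_XS, Y_PS = 0.5, 0.8      # biomass and product yields on substrate
T_END = 30.0               # fixed batch time, h

def rhs(t, y, t_switch):
    X, S, P = y
    if S <= 0.0:
        return [0.0, 0.0, 0.0]
    if t < t_switch:                        # stage 1: grow biomass
        mu = MU_MAX * S / (KS + S)
        return [mu * X, -mu * X / Y_XS, 0.0]
    qp = QP_MAX * S / (KS + S)              # stage 2: divert flux to product
    return [0.0, -qp * X / Y_PS, qp * X]

def productivity(t_switch):
    sol = solve_ivp(rhs, (0.0, T_END), [0.05, 20.0, 0.0],
                    args=(t_switch,), max_step=0.1)
    return sol.y[2, -1] / T_END             # volumetric productivity = titer / time

res = minimize_scalar(lambda ts: -productivity(ts),
                      bounds=(0.0, T_END), method="bounded")
print("best switch time (h):", round(res.x, 2), "productivity:", round(-res.fun, 4))
```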
30024 Developing an Effectual Logic through a Visual Mind Mapping
Authors: Alberti Pascal, Mustapha Mouloua
Abstract:
Companies are confronted with complex and competitive markets. The dynamics of these markets are becoming more and more fluid, requiring companies to provide competitive, definite and technological responses within increasingly short timeframes. To meet this demand, companies must rely on the cognitive abilities of actors of creativity to provide tangible answers to current contextual problems. It therefore seems appropriate to provide instruments to support this particular stage of innovation. Various methods and tools can meet this requirement. For a number of years we have been conducting experiments on the use of mind maps in the context of innovation projects with teams of different nationalities. After presenting the main research carried out on this theme, we discuss the possible correlation between the different uses of iconic tools and certain types of innovation. We then provide a link with different cognitive logic. Finally, we conclude by putting our research into perspective.Keywords: creativity, innovation, causal logic, effectual logic, mind mapping
Procedia PDF Downloads 432
30023 Multi-Criteria Decision Making Tool for Assessment of Biorefinery Strategies
Authors: Marzouk Benali, Jawad Jeaidi, Behrang Mansoornejad, Olumoye Ajao, Banafsheh Gilani, Nima Ghavidel Mehr
Abstract:
The Canadian forest industry is seeking to identify and implement transformational strategies for enhanced financial performance through the emerging bioeconomy or, more specifically, through the concept of the biorefinery. For example, processing forest residues or surplus biomass available on mill sites for the production of biofuels, biochemicals and/or biomaterials is one of the attractive strategies, along with traditional wood and paper products and cogenerated energy. There are many possible process-product biorefinery pathways, each associated with specific product portfolios with different levels of risk. Thus, it is not obvious which strategy the forest industry should select and implement. Therefore, there is a need for analytical and design tools that enable the evaluation of biorefinery strategies based on a set of criteria, considering a perspective of sustainability over the short and long terms, while selecting the existing core products as well as the new product portfolio. In addition, it is critical to assess the manufacturing flexibility to internalize the risk from market price volatility of each targeted bio-based product in the product portfolio, prior to investing heavily in any biorefinery strategy. The proposed paper will focus on introducing a systematic methodology for designing integrated biorefineries using process systems engineering tools as well as a multi-criteria decision making framework to put forward the most effective biorefinery strategies that fulfill the needs of the forest industry. Topics to be covered will include market analysis, techno-economic assessment, cost accounting, energy integration analysis, life cycle assessment and supply chain analysis. This will be followed by a description of the vision as well as the key features and functionalities of the I-BIOREF software platform, developed by CanmetENERGY of Natural Resources Canada. Two industrial case studies will be presented to support the robustness and flexibility of the I-BIOREF software platform: i) an integrated Canadian kraft pulp mill with a lignin recovery process (namely, LignoBoost™); ii) a standalone biorefinery based on an ethanol-organosolv process.
Keywords: biorefinery strategies, bioproducts, co-production, multi-criteria decision making, tool
Procedia PDF Downloads 232
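To illustrate only the multi-criteria ranking step (not the I-BIOREF platform itself), the sketch below scores competing biorefinery strategies with a simple weighted sum over normalized criteria. Strategies, criteria, scores, and weights are invented placeholders.

```python
import pandas as pd

# Placeholder criteria matrix: rows are candidate strategies, columns are criteria.
scores = pd.DataFrame(
    {
        "net_present_value": [120.0, 95.0, 150.0],   # higher is better (M$)
        "capital_cost": [80.0, 60.0, 140.0],         # lower is better (M$)
        "ghg_reduction": [0.30, 0.22, 0.45],         # higher is better (fraction)
        "market_risk": [0.4, 0.2, 0.6],              # lower is better (index)
    },
    index=["lignin_recovery", "organosolv_ethanol", "combined"],
)
benefit = {"net_present_value": True, "capital_cost": False,
           "ghg_reduction": True, "market_risk": False}
weights = {"net_present_value": 0.35, "capital_cost": 0.25,
           "ghg_reduction": 0.25, "market_risk": 0.15}

# Min-max normalize each criterion so that 1.0 is always "best".
norm = pd.DataFrame(index=scores.index)
for col in scores.columns:
    lo, hi = scores[col].min(), scores[col].max()
    scaled = (scores[col] - lo) / (hi - lo)
    norm[col] = scaled if benefit[col] else 1.0 - scaled

ranking = (norm * pd.Series(weights)).sum(axis=1).sort_values(ascending=False)
print(ranking)   # weighted score per strategy, highest first
```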
30022 Modern Agriculture and Industrialization Nexus in the Nigerian Context
Authors: Ese Urhie, Olabisi Popoola, Obindah Gershon, Olabanji Ewetan
Abstract:
Modern agriculture involves the use of improved tools and equipment (instead of crude and ineffective tools) like tractors, hand-operated planters, hand-operated fertilizer drills and combine harvesters, which increase agricultural productivity. Farmers in Nigeria still have huge potential to enhance their productivity. The study argues that the increase in agricultural output due to the increased productivity brought about by modern agriculture will promote forward linkages and opportunities in the processing sub-sector: both the manufacturing of machines and the processing of raw materials. Depending on existing incentives, foreign investment could be attracted to augment local investment in the sector. The availability of raw materials in large quantities, at competitive prices, will attract investment in other industries. In addition, potential for backward linkages will also be created. In a nutshell, adopting the unbalanced growth theory in favour of the agricultural sector could engender industrialization in a country with untapped potential. The paper highlights the numerous potentials of modern agriculture that are yet to be tapped in Nigeria and also provides a theoretical analysis of how the realization of such potentials could promote industrialization in the country. The study adopts Lewis's structural-change model and Hirschman's theory of unbalanced growth in the design of the analytical framework. The framework will be useful in empirical studies that will guide policy formulation.
Keywords: modern agriculture, industrialization, structural change model, unbalanced growth
Procedia PDF Downloads 304
30021 Error Analysis in Academic Writing of EFL Learners: A Case Study for Undergraduate Students at Pathein University
Authors: Aye Pa Pa Myo
Abstract:
Writing in English is regarded as a complex process for English as a foreign language (EFL) learners. Moreover, committing errors can be seen as an inevitable part of language learners' writing. Generally, academic writing is quite difficult for most students to manage well enough to get better scores. Students commit common errors in their writing when they attempt academic writing. Error analysis deals with identifying and detecting errors and also explains the reasons for the occurrence of these errors. In this paper, the researcher attempts to examine the common errors of undergraduate students in their academic writing at Pathein University. The purpose of this research is to investigate the errors which students usually commit in academic writing and to find better ways of correcting these errors in EFL classrooms. In this research, fifty third-year non-English specialization students attending Pathein University were selected as participants. The research took one month and was conducted with a mixed methodology. Two mini-tests were used as research tools. Data were collected with a quantitative research method. The findings of this research indicate that most of the students noticed their common errors after getting the necessary input and committed these errors less frequently after taking the mini-tests; hence, the findings will be supportive of further research related to error analysis in academic writing.
Keywords: academic writing, error analysis, EFL learners, mini-tests, mixed methodology
Procedia PDF Downloads 132
30020 Efforts to Revitalize Piipaash Language: An Explorative Study to Develop Culturally Appropriate and Contextually Relevant Teaching Materials for Preschoolers
Authors: Shahzadi Laibah Burq, Gina Scarpete Walters
Abstract:
Piipaash, which belongs to Yuman, one of the large families of North American languages, is reported as one of the seriously endangered languages in the Salt River Pima-Maricopa Indian Community of Arizona. In a collaborative venture between Arizona State University (ASU) and the Salt River Pima-Maricopa Indian Community (SRPMIC), efforts have been made to revitalize and preserve the Piipaash language and its cultural heritage. The present study is one example of several language documentation and revitalization initiatives that the ASU Humanities Lab has taken. This study was approved to receive a “Beyond the lab” grant after the researchers successfully created a teaching guide for an Early Childhood Piipaash storybook during their time working in the Humanities Lab. The current research is an extension of the previous project and focuses on creating customized teaching materials and tools for the teachers and parents of the students of the Early Enrichment Program at SRPMIC. However, to determine and maximize the usefulness of the teaching materials with regard to their reliability, validity, and practicality in the given context, this research aims to conduct an Environmental Analysis and a Need Analysis. The Environmental Analysis seeks to evaluate the situation of the Early Enrichment Program, and the Need Analysis to investigate the specific and situated requirements of the teachers to assist students in building target language skills. The study employs a qualitative methods approach for the collection of the data. Multiple data collection strategies are used concurrently to gather information from the participants. The research tools include semi-structured interviews with the program administrators and teachers, classroom observations, and teacher shadowing. The researchers utilize triangulation of the data to maintain validity in the process of data interpretation. The preliminary results of the study show a need for culturally appropriate materials that can further the students' learning of the target language as well as the culture, i.e., clay pots and basket-making materials. It was found that the course and teachers focus on developing the listening and speaking skills of the students. Moreover, to assist the young learners beyond the classroom, the teachers could make use of send-home teaching materials to reinforce the learning (i.e., coloring books including illustrations of culturally relevant animals, food, and places). Audio language resources are also identified as helpful additional materials for the parents to assist the children's learning.
Keywords: indigenous education, materials development, need analysis, piipaash language revitalization
Procedia PDF Downloads 90
30019 Enhancing Rupture Pressure Prediction for Corroded Pipes Through Finite Element Optimization
Authors: Benkouiten Imene, Chabli Ouerdia, Boutoutaou Hamid, Kadri Nesrine, Bouledroua Omar
Abstract:
Algeria is actively enhancing gas productivity by augmenting the supply flow. However, this effort has led to increased internal pressure, posing a potential risk to the pipeline's integrity, particularly in the presence of corrosion defects. Sonatrach relies on a vast network of pipelines spanning 24,000 kilometers for the transportation of gas and oil. The aging of these pipelines raises the likelihood of corrosion both internally and externally, heightening the risk of ruptures. To address this issue, a comprehensive inspection is imperative, utilizing specialized scraping tools. These advanced tools furnish a detailed assessment of all pipeline defects. It is essential to recalculate the pressure parameters to safeguard the corroded pipeline's integrity while ensuring the continuity of production. In this context, Sonatrach employs symbolic pressure limit calculations, such as ASME B31G (2009) and the modified ASME B31G (2012). The aim of this study is to perform a comparative analysis of various limit pressure calculation methods documented in the literature, namely DNV RP F-101, SHELL, P-CORRC, NETTO, and CSA Z662. This comparative assessment will be based on a dataset comprising 329 burst tests published in the literature. Ultimately, we intend to introduce a novel approach grounded in the finite element method, employing ANSYS software.Keywords: pipeline burst pressure, burst test, corrosion defect, corroded pipeline, finite element method
Procedia PDF Downloads 58
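For context on the kind of limit-pressure calculation being compared above, the sketch below evaluates the modified ASME B31G (0.85 dL) failure-pressure estimate for a single corrosion defect, following the commonly published form of the criterion. Input values are illustrative, and a real assessment should be checked against the current edition of the standard.

```python
import math

def modified_b31g_failure_pressure(D, t, d, L, smys):
    """Estimate failure pressure (MPa) of a corroded pipe per modified B31G (0.85 dL).

    D, t : outside diameter and wall thickness (mm)
    d, L : maximum defect depth and longitudinal defect length (mm)
    smys : specified minimum yield strength (MPa)
    """
    flow_stress = smys + 69.0                      # SMYS + 69 MPa (10 ksi)
    z = L ** 2 / (D * t)
    if z <= 50.0:                                  # Folias bulging factor
        M = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z ** 2)
    else:
        M = 0.032 * z + 3.3
    ratio = d / t
    failure_stress = flow_stress * (1.0 - 0.85 * ratio) / (1.0 - 0.85 * ratio / M)
    return 2.0 * failure_stress * t / D

# Illustrative defect on an X60 line pipe (values assumed, not from the paper's dataset).
print(modified_b31g_failure_pressure(D=610.0, t=9.5, d=3.0, L=200.0, smys=414.0))
```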
30018 Age-Based Interface Design for Children’s CAPT Systems
Authors: Saratu Yusuf Ilu, Mumtaz B. Mustafa, Siti Salwah Salim, Mehdi Malekzadeh
Abstract:
Children today use computer-based applications in various activities, especially for learning and education. Many of these tools and applications, such as the Computer Aided Pronunciation Training (CAPT) system, enable children to explore and experience them with little supervision from adults. In order for these tools and applications to have the maximum effect on children's learning and education, they must be attractive for the children to use. This could be achieved with proper user interface (UI) design. As children grow, so do their abilities, tastes and preferences. They interact differently with these applications as they grow older. This study reviews several articles on how the age factor influences UI design. The review focuses on age-related capabilities such as cognition, literacy, concentration and feedback requirements. We have also evaluated a few existing CAPT systems and determined the influence of age-based factors on their interface design.
Keywords: children, age-based interaction, learning application, age-based capability
Procedia PDF Downloads 424
30017 Study of Crashworthiness Behavior of Thin-Walled Tube under Axial Loading by Using Computational Mechanics
Authors: M. Kamal M. Shah, Noorhifiantylaily Ahmad, O. Irma Wani, J. Sahari
Abstract:
This paper presents a computational mechanics analysis of energy absorption for cylindrical and square thin-walled tube structures using ABAQUS/Explicit. The crashworthiness behavior of AISI 1020 mild steel thin-walled tubes under axial loading has been studied. The effects of different model cross-sections, as well as model length, on the crashworthiness behavior of the thin-walled tube are investigated. The models were placed on a loading platform under axial loading with an impact velocity of 5 m/s to obtain the deformation results of each model under quasi-static loading. The results showed that models undergoing different deformation modes exhibit different energy absorption performance.
Keywords: axial loading, computational mechanics, energy absorption performance, crashworthiness behavior, deformation mode
Procedia PDF Downloads 441
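The crashworthiness indicators behind comparisons like the one above are typically derived from the force-displacement response of the crushed tube. The sketch below, with a synthetic curve standing in for ABAQUS/Explicit output and an assumed tube mass, shows how energy absorption (EA), mean crush force, specific energy absorption (SEA), and crush force efficiency could be computed.

```python
import numpy as np

# Synthetic crush response standing in for an ABAQUS/Explicit history output.
displacement = np.linspace(0.0, 0.12, 400)                 # m
force = (60e3 * np.exp(-displacement / 0.02) + 25e3
         + 8e3 * np.sin(displacement / 0.004))             # N, folding-like oscillation
mass = 0.85                                                 # kg of the crushed tube (assumed)

# Energy absorbed = area under the force-displacement curve (trapezoidal rule).
ea = float(np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(displacement)))   # J
peak_force = float(force.max())                             # N
mean_force = ea / displacement[-1]                          # N
sea = ea / mass                                             # J/kg
cfe = mean_force / peak_force                               # crush force efficiency

print(f"EA = {ea/1e3:.1f} kJ, SEA = {sea/1e3:.2f} kJ/kg, "
      f"Pmax = {peak_force/1e3:.1f} kN, CFE = {cfe:.2f}")
```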
30016 Improvement of Students’ Active Experience through the Provision of Foundational Architecture Pedagogy by Virtual Reality Tools
Authors: Mehdi Khakzand, Flora Fakourian
Abstract:
It has been seen in recent years that architects are using virtual modeling to help them visualize their projects. Research has indicated that virtual media, particularly virtual reality, enhances architects' comprehension of design and spatial perception. Creating a communal experience for active learning is an essential component of the design process in architecture pedagogy. It has been particularly challenging to replicate design principles as a critical teaching function, and this is a complex issue that demands comprehension. Nonetheless, the usage of simulation should be studied and limited as appropriate. In conjunction with extensive technology, 3D geometric illustration can bridge the gap between the real and virtual worlds. This research intends to deliver a pedagogical experience in the architecture basics course to improve the architectural design process utilizing virtual reality tools. This tool seeks to tackle challenges in current methods of architectural illustration by offering building geometry illustration, building information (data from the building information model), and simulation results. These tools were tested over three days in a design workshop with 12 architecture students. This article presents an architectural VR-based course and explores its application in boosting students' active experiences. According to the research, this technology can improve students' cognitive skills in challenging simulations by boosting visual understanding.
Keywords: active experience, architecture pedagogy, virtual reality, spatial perception
Procedia PDF Downloads 87
30015 In Patribus Fidelium Leftist Discourses on Political Violence in Lebanon and Algeria: A Critical Discourse Analysis
Authors: Mehdi Heydari Sanglaji
Abstract:
The dramatic events of 11 September, and their tragic repercussions, catapulted issues of political violence in and from the ‘Muslim world’ onto the political discourse, be it in the patriotic speeches of campaigning politicians or TV and news punditry. Depending on what end of the political spectrum the politician/pundit pledges fealty to, the overall analyses of political violence in West Asia and North Africa (WANA) tend towards two overarching categories: on the Right, the diagnosis has unanimously been, ‘they must hate our freedom.’ On the Left, however, there is the contention that the West has to be counted as the primary cause of such rage, for the years of plundering of lives and resources through colonialism, the Cold War, coups, etc. All these analyses are premised on at least two presuppositions: that violence in and from the WANA region a) is always reactionary, in the sense that it happens only in response to something the West is or does; and b) must always already be condemned, as it is essentially immoral and wrong. It is the aim of this paper to challenge such viewpoints. Through a rigorous study of the historical discourses on political violence in the Leftist organizations active in Algeria and Lebanon, we claim there is a myriad of diverse reasons and justifications presented for advocating political violence in these countries that defy facile categorization. Inspecting such rhetoric for inciting political violence in Leftist discourses, and how some of these reasonings have percolated into other movements in the region (e.g., Islamist ones), will reveal a wealth of indigenous discourses on the subject that has been largely neglected by Western media punditry and even by academia. The indigenous discourses on political violence, much of which overlap with emancipatory projects in the region, partly follow a grammar and logic which may be different from those developed in the West, even by its more critical theories. Understanding such a different epistemology of violence, and the diverse contexts in which political violence might be justifiable in the mind of ‘the other,’ necessitates a historical, materialist, and genealogical study of the discourse already in practice in the WANA region. In that regard, both critical terrorism studies and critical discourse analysis provide exemplary tools of analysis. Capitalizing on such tools, this project will focus on unearthing a history of thought that renders moot the reduction of all instances of violence in the region to an Islamic culture or to imperialism/colonialism. The main argument of our research is that by studying the indigenous discourses on political violence, we will be far better equipped to understand the reasons for, and the possible solutions to, acts of terrorism in and from the region.
Keywords: political violence, terrorism, leftist organizations, West Asia/North Africa
Procedia PDF Downloads 131
30014 Measuring the Resilience of e-Governments Using an Ontology
Authors: Onyekachi Onwudike, Russell Lock, Iain Phillips
Abstract:
The variability that exists across governments, their departments and the provisioning of services has been an area of concern in the E-Government domain. There is a need for reuse and integration across government departments, which is accompanied by varying degrees of risks and threats. There is also the need for assessment, prevention, preparation, response and recovery when dealing with these risks or threats. The ability of a government to cope with the emerging changes that occur within it is known as resilience. In order to forge ahead with concerted efforts to manage reuse- and integration-induced risks or threats to governments, the ambiguities contained within resilience must be addressed. Enhancing resilience in the E-Government domain is synonymous with reducing the risks governments face in the provisioning of services as well as the reuse of components across departments. Therefore, it can be said that resilience is responsible for the reduction in a government's vulnerability to changes. In this paper, we present the use of an ontology to measure the resilience of governments. This ontology is made up of a well-defined construct for the taxonomy of resilience. A specific class known as ‘Resilience Requirements’ is added to the ontology. This class embraces the concept of resilience into the E-Government domain ontology. Considering that the E-Government domain is a highly complex one, made up of different departments offering different services, the reliability and resilience of the E-Government domain have become more complex and critical to understand. We present questions that can help a government assess how prepared it is in the face of risks and what steps can be taken to recover from them. These questions can be asked with the use of queries. The ontology focuses on developing a case study section that is used to explore ways in which government departments can become resilient to the different kinds of risks and threats they may face. A collection of resilience tools and resources has been developed in our ontology to encourage governments to take steps to prepare for emergencies and risks that a government may face with the integration of departments and the reuse of components across government departments. To achieve this, the ontology has been extended with rules. We present two tools for understanding resilience in the E-Government domain as a risk analysis target, and the output of these tools when applied to resilience in the E-Government domain. We introduce the classification of resilience using the defined taxonomy and the modelling of existing relationships based on the defined taxonomy. The ontology is constructed on formal theory, and it provides a semantic reference framework for the concept of resilience. Key terms which fall under the purview of resilience with respect to E-Governments are defined. Terms are made explicit, and the relationships that exist between risks and resilience are made explicit. The overall aim of the ontology is to use it within standards that would be followed by all governments for government-based resilience measures.
Keywords: E-Government, Ontology, Relationships, Resilience, Risks, Threats
Procedia PDF Downloads 337
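A small sketch of how the ‘Resilience Requirements’ style of query described above could look in practice, using rdflib and SPARQL over a toy graph. The namespace, property names, and facts are invented for illustration; the paper's ontology defines its own taxonomy and rules.

```python
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/egov-resilience#")   # hypothetical namespace
g = Graph()

# Toy facts: a department, a risk it faces, and a resilience requirement addressing it.
g.add((EX.TaxDept, RDF.type, EX.Department))
g.add((EX.ServiceOutage, RDF.type, EX.Risk))
g.add((EX.TaxDept, EX.facesRisk, EX.ServiceOutage))
g.add((EX.FailoverPlan, RDF.type, EX.ResilienceRequirement))
g.add((EX.FailoverPlan, EX.mitigates, EX.ServiceOutage))
g.add((EX.FailoverPlan, EX.recoveryTimeHours, Literal(4)))

# "How prepared is each department?": list departments, their risks, and any
# resilience requirement that mitigates each risk, with its recovery target.
query = """
PREFIX ex: <http://example.org/egov-resilience#>
SELECT ?dept ?risk ?req ?rto WHERE {
  ?dept a ex:Department ; ex:facesRisk ?risk .
  OPTIONAL { ?req a ex:ResilienceRequirement ;
                  ex:mitigates ?risk ;
                  ex:recoveryTimeHours ?rto . }
}
"""
for dept, risk, req, rto in g.query(query):
    print(dept, risk, req, rto)
```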
30013 Analysis of Matching Pursuit Features of EEG Signal for Mental Tasks Classification
Authors: Zin Mar Lwin
Abstract:
Brain Computer Interface (BCI) systems have been developed for people who suffer from severe motor disabilities and find it challenging to communicate with their environment. BCI allows them to communicate in a non-muscular way. For communication between human and computer, BCI uses a type of signal called the electroencephalogram (EEG) signal, which is recorded from the human brain by means of electrodes. The EEG signal is an important information source for understanding brain processes for non-invasive BCI. To translate a human's thought, the acquired EEG signal needs to be classified accurately. This paper proposes a typical EEG signal classification system which is evaluated on a dataset from Purdue University. The Independent Component Analysis (ICA) method, via EEGLAB tools, is used for removing artifacts caused by eye blinks. For feature extraction, the time and frequency features of the non-stationary EEG signals are extracted by the Matching Pursuit (MP) algorithm. The classification of one of five mental tasks is performed by a multi-class Support Vector Machine (SVM). For the SVMs, comparisons have been carried out for both 1-against-1 and 1-against-all methods.
Procedia PDF Downloads 278
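The pipeline above ends with a multi-class SVM compared in 1-against-1 and 1-against-all configurations. A compact sketch of that final comparison with scikit-learn is shown below, using random vectors in place of the Matching Pursuit time-frequency features; the feature dimensionality and labels are placeholders, not the Purdue dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: 200 trials x 64 Matching-Pursuit-style features, 5 mental tasks.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))
y = rng.integers(0, 5, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, wrapper in [("1-against-1", OneVsOneClassifier),
                      ("1-against-all", OneVsRestClassifier)]:
    clf = wrapper(make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)))
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))
```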
30012 The Impact of Using Microlearning to Enhance Students' Programming Skills and Learning Motivation
Authors: Ali Alqarni
Abstract:
This study aims to explore the impact of microlearning on the development of the programming skills as well as on the motivation for learning of first-year high schoolers in Jeddah. The sample consists of 78 students, distributed as 40 students in the control group, and 38 students in the treatment group. The quasi-experimental method, which is a type of quantitative method, was used in this study. In addition to the technological tools used to create and deliver the digital content, the study utilized two tools to collect the data: first, an observation card containing a list of programming skills, and second, a tool to measure the student's motivation for learning. The findings indicate that microlearning positively impacts programming skills and learning motivation for students. The study, then, recommends implementing and expanding the use of microlearning in educational contexts both in the general education level and the higher education level.Keywords: educational technology, teaching strategies, online learning, microlearning
Procedia PDF Downloads 128
30011 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure
Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer
Abstract:
The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed based on video data, aimed at uncovering so-called “multimodal gestalts”, patterns of linguistic and embodied conduct that reoccur in specific sequential positions employed for specific purposes. Multimodal analyses (and other disciplines using videos) have so far depended on time- and resource-intensive processes of manually transcribing each component from video materials. Automating these tasks requires advanced programming skills, which are often not within the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data which are suitable for qualitative analysis but not enough for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated in one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data. The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, as well as grammatical annotation for adding morphological and syntactic information to the verbal content. In the ongoing instance of VIAN-DH, we focus on gesture extraction (pointing gestures, in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format and enable the import of the main existing formats of annotated video data and the export to other formats used in the field, while integrating different data source formats in a way that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable the parallel search of many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (which can also extend to other fields using video materials). It will allow the processing of large amounts of data automatically and the implementation of quantitative analyses, combining them with the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken and grammatical information from videos, and to correlate those different levels and perform queries and analyses.
Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition
Procedia PDF Downloads 108
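As an illustration of the kind of cross-layer query VIAN-DH aims to support (for example, which tokens in the transcript overlap a pointing gesture), the sketch below aligns two toy annotation tables on time intervals. Column names and data are assumptions for the example, not the tool's actual data model.

```python
import pandas as pd

# Toy annotation layers: ASR transcript tokens and detected pointing gestures,
# both with start/end times in seconds.
tokens = pd.DataFrame({
    "start": [0.0, 0.6, 1.1, 1.9, 2.6],
    "end":   [0.5, 1.0, 1.8, 2.5, 3.0],
    "token": ["look", "at", "that", "blue", "one"],
    "pos":   ["VERB", "ADP", "DET", "ADJ", "NOUN"],
})
gestures = pd.DataFrame({
    "start": [1.0, 2.4],
    "end":   [2.0, 3.1],
    "label": ["pointing", "pointing"],
})

# Cross-layer query: tokens whose time interval overlaps a pointing gesture.
pairs = tokens.merge(gestures, how="cross", suffixes=("_tok", "_ges"))
overlap = pairs[(pairs["start_tok"] < pairs["end_ges"]) &
                (pairs["start_ges"] < pairs["end_tok"])]
print(overlap[["token", "pos", "label", "start_tok", "end_tok"]])
```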
30010 Improving Law Enforcement Strategies Through Geographic Information Systems: A Spatio-Temporal Analysis of Antisocial Activities in Móstoles (2022)
Authors: Daniel Suarez Alonso
Abstract:
This study focuses on the alternatives offered to police institutions by the implementation of Geographic Information Systems. Providing operational police commanders with effective and efficient tools that give them the analytical capacity to reduce criminal opportunities must be a priority. Given the intimate connection of crimes and infractions to their environment, law enforcement institutions must respond proactively to the changing circumstances of antisocial behaviors. To this end, the spatial distribution of antisocial activities in the city of Móstoles has been analyzed, trying to identify the spatio-temporal patterns that occur in order to anticipate their commission through the planning of dynamic preventive strategies. The application of GIS offers alternative analytical approaches to the different problems that underlie the development of life in society, focusing resources on those places with the highest concentration of incidents.
Keywords: data analysis, police organizations, police prevention, geographic information systems
Procedia PDF Downloads 50
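A minimal sketch of the spatio-temporal aggregation behind such hotspot analysis: incidents with projected coordinates and timestamps are binned into a spatial grid and hour-of-day bands, and the densest cells are reported. Grid size, column names, and records are placeholders, not the Móstoles dataset.

```python
import numpy as np
import pandas as pd

# Placeholder incident records (projected x/y in metres plus a timestamp).
rng = np.random.default_rng(3)
n = 1000
incidents = pd.DataFrame({
    "x": rng.normal(440_000, 800, n),
    "y": rng.normal(4_465_000, 800, n),
    "timestamp": pd.Timestamp("2022-01-01")
                 + pd.to_timedelta(rng.integers(0, 365 * 24 * 60, n), unit="min"),
})

CELL = 250.0  # grid cell size in metres (assumed)
incidents["cell_x"] = (incidents["x"] // CELL).astype(int)
incidents["cell_y"] = (incidents["y"] // CELL).astype(int)
incidents["hour"] = incidents["timestamp"].dt.hour

# Count incidents per (cell, hour) and list the densest spatio-temporal hotspots.
hotspots = (incidents.groupby(["cell_x", "cell_y", "hour"])
            .size().rename("count")
            .sort_values(ascending=False)
            .head(10))
print(hotspots)
```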
30009 Audit Is a Production Performance Tool
Authors: Lattari Samir
Abstract:
The performance of a production process is the result of proper operation, where management tools appear as the key to success through process management, which consists of managing and implementing a quality policy, organizing and planning manufacturing, and thus defining an efficient logic for the main areas covered by production management. To carry out this delicate mission, which requires reconciling often contradictory objectives, the auditor is called upon and must be able to express an opinion on the effectiveness of the operation of the "production" function. To do this, the auditor must structure his mission in three phases: the preparation phase, to assimilate the particularities of this function; the implementation phase; and the conclusion phase. The audit is a systematic and independent examination of all the stages of a manufacturing process intended to determine whether the pre-established arrangements for the combination of production factors are respected, whether their implementation is effective and whether they are relevant in relation to the goals.
Keywords: audit, performance of process, independent examination, management tools, audit of accounts
Procedia PDF Downloads 75
30008 'Systems' and Its Impact on Virtual Teams and Electronic Learning
Authors: Shavindrie Cooray
Abstract:
It is vital that students are supported in having balanced conversations about topics that might be controversial. This process is crucial to the development of critical thinking skills. This can be difficult to attain in e-learning environments, with some research finding students report a perceived loss in the quality of knowledge exchange and performance. This research investigated if Systems Theory could be applied to structure the discussion, improve information sharing, and reduce conflicts when students are working in online environments. This research involved 160 participants across four categories of student groups at a college in the Northeastern US. Each group was provided with a shared problem, and each group was expected to make a proposal for a solution. Two groups worked face-to-face; the first face to face group engaged with the problem and each other with no intervention from a facilitator; a second face to face group worked on the problem using Systems tools to facilitate problem structuring, group discussion, and decision-making. There were two types of virtual teams. The first virtual group also used Systems tools to facilitate problem structuring and group discussion. However, all interactions were conducted in a synchronous virtual environment. The second type of virtual team also met in real time but worked with no intervention. Findings from the study demonstrated that the teams (both virtual and face-to-face) using Systems tools shared more information with each other than the other teams; additionally, these teams reported an increased level of disagreement amongst their members, but also expressed more confidence and satisfaction with the experience and resulting decision compared to the other groups.Keywords: e-learning, virtual teams, systems approach, conflicts
Procedia PDF Downloads 137
30007 AI Applications in Accounting: Transforming Finance with Technology
Authors: Alireza Karimi
Abstract:
Artificial Intelligence (AI) is reshaping various industries, and accounting is no exception. With the ability to process vast amounts of data quickly and accurately, AI is revolutionizing how financial professionals manage, analyze, and report financial information. In this article, we will explore the diverse applications of AI in accounting and its profound impact on the field. Automation of Repetitive Tasks: One of the most significant contributions of AI in accounting is automating repetitive tasks. AI-powered software can handle data entry, invoice processing, and reconciliation with minimal human intervention. This not only saves time but also reduces the risk of errors, leading to more accurate financial records. Pattern Recognition and Anomaly Detection: AI algorithms excel at pattern recognition. In accounting, this capability is leveraged to identify unusual patterns in financial data that might indicate fraud or errors. AI can swiftly detect discrepancies, enabling auditors and accountants to focus on resolving issues rather than hunting for them. Real-Time Financial Insights: AI-driven tools, using natural language processing and computer vision, can process documents faster than ever. This enables organizations to have real-time insights into their financial status, empowering decision-makers with up-to-date information for strategic planning. Fraud Detection and Prevention: AI is a powerful tool in the fight against financial fraud. It can analyze vast transaction datasets, flagging suspicious activities and reducing the likelihood of financial misconduct going unnoticed. This proactive approach safeguards a company's financial integrity. Enhanced Data Analysis and Forecasting: Machine learning, a subset of AI, is used for data analysis and forecasting. By examining historical financial data, AI models can provide forecasts and insights, aiding businesses in making informed financial decisions and optimizing their financial strategies. Artificial Intelligence is fundamentally transforming the accounting profession. From automating mundane tasks to enhancing data analysis and fraud detection, AI is making financial processes more efficient, accurate, and insightful. As AI continues to evolve, its role in accounting will only become more significant, offering accountants and finance professionals powerful tools to navigate the complexities of modern finance. Embracing AI in accounting is not just a trend; it's a necessity for staying competitive in the evolving financial landscape.Keywords: artificial intelligence, accounting automation, financial analysis, fraud detection, machine learning in finance
Procedia PDF Downloads 63
30006 Mastering Test Automation: Bridging Gaps for Seamless QA
Authors: Rohit Khankhoje
Abstract:
The rapid evolution of software development practices has given rise to an increasing demand for efficient and effective test automation. The paper titled "Mastering Test Automation: Bridging Gaps for Seamless QA" delves into the crucial aspects of test automation, addressing the obstacles faced by organizations in achieving flawless quality assurance. The paper highlights the importance of bridging knowledge gaps within organizations, emphasizing the necessity for management to acquire a deeper comprehension of test automation scenarios, coverage, report trends, and the importance of communication. To tackle these challenges, this paper introduces innovative solutions, including the development of an automation framework that seamlessly integrates with test cases and reporting tools like TestRail and Jira. This integration facilitates the automatic recording of bugs in Jira, enhancing bug reporting and communication between manual QA and automation teams, and ensures that TestRail contains all newly added automated test cases as soon as they become part of the automation suite. The paper demonstrates how this framework empowers management by providing clear insights into ongoing automation activities, bug origins, trend analysis, and test case specifics. "Mastering Test Automation" serves as a comprehensive guide for organizations aiming to enhance their quality assurance processes through effective test automation. It not only identifies the common pitfalls and challenges but also offers practical solutions to bridge the gaps, resulting in a more streamlined and efficient QA process.
Keywords: automation framework, API integration, test automation, test management tools
Procedia PDF Downloads 73
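A sketch of the integration idea described above: when an automated test fails, a defect is filed in Jira through its standard REST issue-creation endpoint so that manual QA and automation stay in sync. The instance URL, credentials, project key, and field values are placeholders, and a production framework would also update the linked TestRail result and guard against duplicate tickets.

```python
import requests

JIRA_URL = "https://jira.example.com"        # placeholder instance
AUTH = ("automation-bot", "api-token")       # placeholder credentials

def report_failure(test_name: str, error: str, project_key: str = "QA") -> str:
    """File a bug in Jira for a failed automated test and return the issue key."""
    payload = {
        "fields": {
            "project": {"key": project_key},
            "summary": f"[Automation] {test_name} failed",
            "description": f"Automated test {test_name} failed.\n\n{error}",
            "issuetype": {"name": "Bug"},
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload,
                         auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["key"]

if __name__ == "__main__":
    key = report_failure("test_checkout_flow", "AssertionError: total mismatch")
    print("Filed", key)   # e.g. QA-123; the key can then be attached to the TestRail run
```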