Search results for: classroom friendly approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15764

12614 New Approach for Load Modeling

Authors: Slim Chokri

Abstract:

Load forecasting is one of the central functions in power systems operations. Electricity cannot be stored, so an electric utility must estimate future demand in order to manage production and purchasing in an economically reasonable way. A majority of recently reported approaches are based on neural networks, the attraction being the assumption that neural networks are able to learn properties of the load. However, the development of these methods is not finished, and the lack of comparative results on different model variations is a problem. This paper presents a new approach to predicting Tunisia's daily peak load. The proposed method employs a computational intelligence scheme based on a fuzzy neural network (FNN) and support vector regression (SVR). Experimental results indicate that the proposed FNN-SVR technique gives significantly better prediction accuracy than some classical techniques.
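
The SVR component named above rests on the ε-insensitive loss, which ignores residuals smaller than ε. As an illustration only (this is not the authors' FNN-SVR model; the toy "load" data and hyperparameters are invented), a linear SVR fitted by subgradient descent looks like:

```python
# Minimal linear SVR via subgradient descent on the epsilon-insensitive loss.
# Illustrative sketch only: real load forecasting would use richer features
# and a kernelized or fuzzy-neural front end as in the abstract.

def fit_linear_svr(xs, ys, epsilon=0.1, lr=0.01, c=1.0, epochs=500):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = (w * x + b) - y
            # Subgradient of 0.5 * w^2 + C * max(0, |err| - epsilon)
            g = 0.0 if abs(err) <= epsilon else (1.0 if err > 0 else -1.0)
            w -= lr * (w + c * g * x)
            b -= lr * (c * g)
    return w, b

# Toy "daily peak load": load grows roughly linearly with temperature.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
w, b = fit_linear_svr(xs, ys)
print(round(w * 6.0 + b, 1))  # extrapolated prediction at x = 6
```

The ε-tube is what distinguishes SVR from ordinary least squares: points already within ε of the fit contribute nothing to the loss, so only "support" points shape the model.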

Keywords: neural network, load forecasting, fuzzy inference, machine learning, fuzzy modeling and rule extraction, support vector regression

Procedia PDF Downloads 430
12613 The Impact of Built Environment Design on Users’ Psychology to Foster Pro-Environmental Behavior in University Open Spaces

Authors: Rehab Mahmoud El Sayed, Toka Fahmy Nasr, Dalia M. Rasmi

Abstract:

Environmental psychology studies the interaction between the user and the environment. This field is crucial in understanding how the built environment affects human behaviour, moods, and feelings. Studying the aspects and influences of environmental psychology is key to investigating how design can influence human behaviour to be environmentally friendly. This is known as pro-environmental behaviour, where human actions are sustainable and impact the environment positively. Accordingly, this paper aims to explore the impact of built environment design on environmental psychology to foster pro-environmental behaviour in university campus open spaces. To achieve this, an exploratory research method was adopted: a detailed study of the influences of environmental psychology was carried out and its elements clarified, and the impact of design elements on human psychology was investigated. In addition, an empirical study of the outdoor spaces of the British University in Egypt was conducted, and a survey was distributed to students and staff. The research concluded that the four main psychological aspects are mostly influenced by the following design elements: colours, lighting, and thermal comfort, respectively. Focusing on these design elements in the design process will create a sustainable environment and, as a consequence, foster the pro-environmental behaviour of the user.

Keywords: environmental psychology, pro-environmental behavior, sustainable environment, psychological influences

Procedia PDF Downloads 81
12612 Impact Assessment of Lean Practices on Social Sustainability Indicators: An Approach Using ISM Method

Authors: Aline F. Marcon, Eduardo F. da Silva, Marina Bouzon

Abstract:

The impact of lean management on environmental sustainability is the research line that receives the most attention from academics; the social dimension of sustainable development has so far received less. This paper aims to evaluate the impact of intra-plant lean manufacturing practices on social sustainability indicators extracted from the Global Reporting Initiative (GRI) parameters. The method is two-phased, combining an MCDM approach to uncover the practices most relevant to social performance with the Interpretive Structural Modeling (ISM) method to reveal the structural relationships among lean practices. Professionals from the academic and industrial fields answered the questionnaires. The results show that practices such as “Safety Improvement Programs”, “Total Quality Management” and “Cross-functional Workforce” have the most positive influence on the set of GRI social indicators.
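
The core ISM step described above is computing the final reachability matrix, i.e. the transitive closure of an expert-elicited influence matrix. The 3×3 matrix below is a hypothetical example, not the paper's data:

```python
# Warshall-style transitive closure: the "final reachability matrix" step of
# Interpretive Structural Modeling (ISM). Entry [i][j] = 1 means practice i
# influences practice j, directly or through a chain of other practices.

def reachability(adj):
    n = len(adj)
    r = [row[:] for row in adj]
    for i in range(n):
        r[i][i] = 1  # every practice reaches itself
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if r[i][k] and r[k][j]:
                    r[i][j] = 1
    return r

# Hypothetical initial matrix: 0 -> 1 and 1 -> 2, so 0 reaches 2 transitively.
initial = [[0, 1, 0],
           [0, 0, 1],
           [0, 0, 0]]
final = reachability(initial)
print(final[0][2])  # 1: the transitive influence is recovered
```

ISM then partitions practices into hierarchy levels by comparing each row's reachability set with its antecedent set; the closure above is the prerequisite for that level partitioning.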

Keywords: indicators, ISM, lean, social, sustainability

Procedia PDF Downloads 140
12611 Development of Industry Sector Specific Factory Standards

Authors: Peter Burggräf, Moritz Krunke, Hanno Voet

Abstract:

Due to shortening product and technology lifecycles, many companies use standardization approaches in product development and factory planning to reduce costs and time to market. Unlike large companies, where modular systems are already widely used, small and medium-sized companies often show a much lower degree of standardization due to lower scale effects and missing capacities for the development of these standards. To overcome these challenges, the development of industry-sector-specific standards in cooperative ventures or by third parties is an interesting approach. Using the example of the German industry, this paper analyzes which branches dominated mainly by small or medium-sized companies might be especially interesting for the development of factory standards. For this, a key-performance-indicator-based approach was developed, which is presented in detail together with its specific results for the German industry structure.

Keywords: factory planning, factory standards, industry sector specific standardization, production planning

Procedia PDF Downloads 389
12610 Design Optimization of Miniature Mechanical Drive Systems Using Tolerance Analysis Approach

Authors: Eric Mxolisi Mkhondo

Abstract:

Geometrical deviations and the interaction of mechanical parts influence the performance of miniature systems. These deviations tend to cause costly problems during assembly due to imperfections of components that are invisible to the naked eye, and unsatisfactory performance during operation due to deformation caused by environmental conditions. One of the effective tools to manage the deviations and interaction of parts in a system is tolerance analysis, a quantitative tool for predicting the tolerance variations defined during the design process. Traditional tolerance analysis assumes that the assembly is static and that the deviations come from manufacturing discrepancies, overlooking the functionality of the whole system and the deformation of parts due to environmental conditions. This paper presents an integrated tolerance analysis approach for a miniature system in operation. In this approach, a computer-aided design (CAD) model is developed from the system's specification. The CAD model is then used to specify the geometrical and dimensional tolerance limits (upper and lower) that vary each component's geometry and size while conforming to functional requirements. Worst-case tolerances are analyzed to determine the influence of dimensional changes due to operating temperatures. The method is used to evaluate the nominal condition and the worst-case conditions at the maximum and minimum dimensions of assembled components. These three conditions are evaluated under specific operating temperatures (-40°C, -18°C, 4°C, 26°C, 48°C, and 70°C). A case study on the mechanism of a zoom lens system illustrates the effectiveness of the methodology.
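
A worst-case stack-up with a thermal term of the kind described can be sketched as follows; the part dimensions, tolerances, and expansion coefficients are hypothetical, not the zoom-lens values:

```python
# Worst-case tolerance stack-up for a linear stack of parts, including thermal
# expansion dL = alpha * L * dT. All values are illustrative, not the study's.

def worst_case_stack(parts, d_temp):
    # parts: list of (nominal_mm, tolerance_mm, alpha_per_degC)
    nominal = sum(length for length, _, _ in parts)
    tol = sum(t for _, t, _ in parts)  # tolerances add up in the worst case
    thermal = sum(a * length * d_temp for length, _, a in parts)
    return nominal + thermal - tol, nominal + thermal + tol

# Two hypothetical aluminium spacers operated 46 degC above a 24 degC
# reference (i.e. at 70 degC), with alpha ~ 23e-6 per degC.
parts = [(10.0, 0.05, 23e-6), (15.0, 0.08, 23e-6)]
low, high = worst_case_stack(parts, 46.0)
print(round(high - low, 2))  # total worst-case spread = 2 * (0.05 + 0.08)
```

The midpoint of the returned interval is the thermally shifted nominal length; evaluating the same stack at each of the listed operating temperatures reproduces the three-condition comparison the abstract describes.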

Keywords: geometric dimensioning, tolerance analysis, worst-case analysis, zoom lens mechanism

Procedia PDF Downloads 162
12609 Gender Differences in Adolescent Avatars: Gender Consistency and Masculinity-Femininity of Nicknames and Characters

Authors: Monika Paleczna, Małgorzata Holda

Abstract:

Choosing an avatar's gender in a computer game is one of the key elements in the process of creating an online identity. The selection of a male or female avatar can define the entirety of subsequent decisions regarding both appearance and behavior. However, when the most popular games available for the Nintendo console in 1998 were analyzed, it turned out that 41% of computer games did not have female characters. Nowadays, players create their avatars based mainly on binary gender classification, with male and female characters to choose from. The main aim of the poster is to explore gender differences in adolescent avatars. 130 adolescents aged 15-17 participated in the study. They created their avatars and then played a computer game. The creation of the avatar was based on the choice of gender, then physical and mental characteristics. Data on gender consistency (consistency between participant’s sex and gender selected for the avatar) and masculinity-femininity of avatar nicknames and appearance will be presented. The masculinity-femininity of avatar nicknames and appearance was assessed by expert raters on a very masculine to very feminine scale. Additionally, data on the relationships of the perceived levels of masculinity-femininity with hostility-friendliness and the intelligence of avatars will be shown. The dimensions of hostility-friendliness and intelligence were also assessed by expert raters on scales ranging from very hostile to very friendly and from very low intelligence to very high intelligence.

Keywords: gender, avatar, adolescence, computer games

Procedia PDF Downloads 210
12608 Clustered Regularly Interspaced Short Palindromic Repeats Interference (CRISPRi): An Approach to Inhibit Microbial Biofilm

Authors: Azna Zuberi

Abstract:

Biofilm is a sessile bacterial accretion in which bacteria adopt physiological and morphological behaviour different from the planktonic form. It is the root cause of about 80% of microbial infections in humans. Among them, E. coli biofilms are most prevalent in medical-device-associated nosocomial infections. The objective of this study was to inhibit biofilm formation by targeting the luxS gene, which is involved in quorum sensing, using CRISPRi. LuxS is a synthase involved in the synthesis of autoinducer-2 (AI-2), which in turn guides the initial stage of biofilm formation. To implement the CRISPRi system, we synthesized sgRNA complementary to the target gene sequence and co-expressed it with dCas9. Suppression of luxS was confirmed through qRT-PCR. The effect of the luxS gene on biofilm inhibition was studied through crystal violet assay, XTT reduction assay, and scanning electron microscopy. We conclude that the CRISPRi system could be a potential strategy to inhibit bacterial biofilm through a mechanism-based approach.

Keywords: biofilm, CRISPRi, luxS, microbial

Procedia PDF Downloads 180
12607 Community Forest Management Practice in Nepal: Public Understanding of Forest Benefit

Authors: Chandralal Shrestha

Abstract:

In developing countries like Nepal, the community-based forest management approach has often been glorified as one of the best alternatives for maximizing forest benefits. Though the approach has succeeded in building local-level institutions and conserving forest biodiversity, the question of how local communities perceive forest benefits has remained largely unexamined among researchers and policy makers. The paper aims to explore the understanding of forest benefits from the perspective of the local communities who use the forests, in terms of institutional stability, equity and livelihood opportunity, and ecological stability. The paper reveals that the local communities have a mixed understanding of forest benefits. The institutional and ecological activities carried out by the local communities indicate a good understanding of forest benefits. However, inequality in sharing those benefits, a low pricing strategy with negative consequences for the valuation of forest products, and limited livelihood opportunities indicate a poor one.

Keywords: community based forest management, forest benefits, lowland, Nepal

Procedia PDF Downloads 306
12606 Application to Molecular Electronics of Thin Layers of Organic Materials

Authors: M. I. Benamrani, H. Benamrani

Abstract:

In the search to replace silicon and other thin-film semiconductor technologies with a long-term technology that is environmentally friendly, low-cost, and abundant, growing interest is being given to organic materials. Our objective is to prepare polymeric layers containing metal particles, deposited on the surface of a semiconductor material, which can have better electrical properties and could be applied in nanotechnology as an alternative to existing processes for the design of electronic circuits. This work consists of the development of composite materials by complexation and electroreduction of copper in a film of poly(pyrrole benzoic acid). The polymer film is deposited on a monocrystalline silicon substrate by electrochemical oxidation in an organic medium. Copper particles are incorporated into the polymer by dipping the electrode in a solution of copper sulphate to complex the cupric ions, followed by electroreduction in an aqueous solution to precipitate the copper. In order to prepare the monocrystalline silicon substrate as an electrode for electrodeposition, an in-depth study of its surface state was carried out using photoacoustic spectroscopy, including an analysis of the optical properties and of the effect of pickling with a chemical solution. Transmission-photoacoustic and impedance spectroscopic techniques give results in agreement with those of photoacoustic spectroscopy.

Keywords: photoacoustic, spectroscopy, copper sulphate, chemical solution

Procedia PDF Downloads 80
12605 An Algorithm of Set-Based Particle Swarm Optimization with Status Memory for Traveling Salesman Problem

Authors: Takahiro Hino, Michiharu Maeda

Abstract:

Particle swarm optimization (PSO) is an optimization approach that emulates the social behaviour of bird flocking and fish schooling. PSO works in continuous space and can solve continuous optimization problems with high quality. Set-based particle swarm optimization (SPSO) functions in discrete space by using sets; it can solve combinatorial optimization problems with high quality and has been applied successfully to large-scale problems. In this paper, we present an algorithm of SPSO with status memory (SPSOSM), which decides each position based on the previous position, for solving the traveling salesman problem (TSP). To show the effectiveness of our approach, we compare SPSOSM on TSP instances with existing algorithms.
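
The canonical continuous PSO that SPSO discretizes can be sketched in a few lines; minimizing a one-dimensional quadratic here stands in for the continuous problems mentioned, not for TSP or the SPSOSM algorithm itself:

```python
import random

# Canonical (continuous) PSO minimizing f(x) = x^2 in one dimension.
# SPSO replaces the position/velocity arithmetic with set operations, but the
# personal-best / global-best update scheme below is the shared core.

def pso(f, n_particles=10, iters=100, seed=1):
    rng = random.Random(seed)
    xs = [rng.uniform(-10, 10) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                       # personal best positions
    gbest = min(xs, key=f)              # global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (0.7 * vs[i]                      # inertia
                     + 1.5 * r1 * (pbest[i] - xs[i])  # cognitive pull
                     + 1.5 * r2 * (gbest - xs[i]))    # social pull
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i]
    return gbest

best = pso(lambda x: x * x)
print(round(best, 3))  # near the minimum at 0
```

In the set-based variant, a "velocity" becomes a set of candidate edges with probabilities and a "position" becomes a tour, but the same three pulls (inertia, personal best, global best) drive the search.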

Keywords: combinatorial optimization problems, particle swarm optimization, set-based particle swarm optimization, traveling salesman problem

Procedia PDF Downloads 544
12604 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
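
The k-mer representation can be made concrete with a toy example; the sequences, labels, and nearest-centroid classifier below are fabricated for illustration and are far simpler than the study's actual models:

```python
from collections import Counter

# k-mer featurization of DNA sequences plus a toy nearest-centroid classifier.
# Sequences and phenotype labels are synthetic; the study used 104 MTB genomes.

def kmer_counts(seq, k):
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def distance(a, b):
    keys = set(a) | set(b)
    return sum((a[key] - b[key]) ** 2 for key in keys)

def classify(seq, labelled, k=3):
    profile = kmer_counts(seq, k)
    # Compare against the mean k-mer profile (centroid) of each class.
    centroids = {}
    for label, seqs in labelled.items():
        total = Counter()
        for s in seqs:
            total.update(kmer_counts(s, k))
        centroids[label] = Counter({m: c / len(seqs) for m, c in total.items()})
    return min(centroids, key=lambda lab: distance(profile, centroids[lab]))

labelled = {
    "resistant":   ["ATATATATAT", "ATATATATGG"],
    "susceptible": ["GCGCGCGCGC", "GCGCGCGCAA"],
}
print(classify("ATATATATAA", labelled))  # nearest centroid: "resistant"
```

The study's observation that discrimination sharpens with larger k corresponds here to longer substrings making the class profiles less likely to share entries by chance, at the cost of an exponentially larger feature space.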

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 162
12603 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 150
12602 Looking for a Connection between Oceanic Regions with Trends in Evaporation with Continental Ones with Trends in Precipitation through a Lagrangian Approach

Authors: Raquel Nieto, Marta Vázquez, Anita Drumond, Luis Gimeno

Abstract:

One of the hot spots of climate change is the increase in ocean evaporation. The best available estimate of evaporation, the OAFlux data, shows strong increasing trends in evaporation from the oceans since 1978, with peaks during the hemispheric winter and the strongest trends along the paths of the global western boundary currents and some inner seas. The transport of moisture from oceanic sources to the continents is the connection between evaporation from the ocean and precipitation over the continents. A key question is whether the evaporative source regions over the oceans where trends have occurred in recent decades can be related to their sinks over the continents, to check whether there have also been trends in the precipitation amount or its characteristics. A Lagrangian approach based on FLEXPART and ERA-Interim data is used to establish this connection over the period 1980 to 2012. Results show that there is no general pattern, but significant agreement was found in important areas of climate interest.

Keywords: ocean evaporation, Lagrangian approaches, continental precipitation, Europe

Procedia PDF Downloads 252
12601 Constructivism and Situational Analysis as Background for Researching Complex Phenomena: Example of Inclusion

Authors: Radim Sip, Denisa Denglerova

Abstract:

It is impossible to capture complex phenomena, such as inclusion, with reductionism. The most common form of reductionism is the objectivist approach, where processes and relationships are reduced to entities and clearly outlined phases, with a consequent search for relationships between them. Constructivism as a paradigm and situational analysis as a methodological research portfolio represent a way to avoid the dominant objectivist approach. They work with a situation, i.e., with the essential blending of actors and their environment, where the primary transactions take place between actors and their surroundings. Researchers create constructs based on their need to solve a problem; concepts therefore do not describe reality, but rather a complex of real needs in relation to the available options for meeting those needs. Examining a complex problem requires corresponding methodological tools and an overall research design. Using original research on inclusion in the Czech Republic as an example, this contribution demonstrates that inclusion is not a substance easily described, but rather a relationship field that changes its forms in response to its actors' behaviour and current circumstances. Inclusion consists of a dynamic relationship between an ideal, real circumstances, and ways to achieve that ideal under the given circumstances. Such achievement takes many shapes and thus cannot be captured by a description of objects; it can be expressed only in relationships within a situation defined by time and space. Situational analysis offers tools to examine such phenomena. It understands a situation as a complex of dynamically changing aspects and prefers relationships and positions in the given situation over a clear and final definition of actors, entities, etc. It assumes the creation of constructs as tools for solving the problem at hand, emphasizes the meanings that arise in the process of coordinating human actions and the discourses through which these meanings are negotiated, and offers “cartographic” tools (situational maps, social worlds/arenas maps, positional maps) that are able to capture complexity in other than linear-analytical ways. This approach allows inclusion to be described as a complex of phenomena taking place with a certain historical preference, a complex that can be overlooked with a more traditional approach.

Keywords: constructivism, situational analysis, objective realism, reductionism, inclusion

Procedia PDF Downloads 142
12600 The Development of Packaging to Create Additional Value for Organic Rice Products of Uttaradit Province, Thailand

Authors: Juntima Pokkrong

Abstract:

The objectives of the study were to develop packaging made from the rice straw left after harvest, in order to create additional value for the organic rice products of Uttaradit Province, and to demonstrate the technology of producing straw packaging to the community. The population comprised promoters of organic rice distributors, governmental organizations, consumers, and three groups of organic rice producers: the Agriculturist Group of Khorrum Sub-district, Pichai District; the Agriculturist Group of Wangdin Sub-district, Muang District; and the Agriculturist Group of Wangkapi Sub-district, Muang District, all in Uttaradit Province. The data were collected via group discussions and two types of questionnaires, then analyzed using descriptive statistics (percentage, mean, standard deviation) and content analysis. It was found that primary packaging for one kilogram of rice requires vacuum-sealed bags made from thermoplastic or resin, because they preserve the quality of the rice for a long time and are also very cheap. For the secondary packaging, the making of straw paper was studied and applied. Straw paper can be used for various purposes, and in this study it was used to create secondary packaging models in compliance with the packaging preferences identified in the questionnaires. The models were surveyed among the population using satisfaction questionnaires, and overall satisfaction was high.

Keywords: environmentally friendly, organic rice, packaging, straw paper

Procedia PDF Downloads 242
12599 Research on the Conservation Strategy of Territorial Landscape Based on Characteristics: The Case of Fujian, China

Authors: Tingting Huang, Sha Li, Geoffrey Griffiths, Martin Lukac, Jianning Zhu

Abstract:

Territorial landscapes have experienced a gradual loss of their typical characteristics during long-term human activities. In order to protect the integrity of regional landscapes, it is necessary to characterize, evaluate, and protect them in a graded manner. The study takes Fujian, China, as an example and classifies the landscape characters of the site at the regional, middle, and detailed scales. A multi-scale approach combining parametric and holistic methods is used to classify and partition the landscape character types (LCTs) and landscape character areas (LCAs) at different scales, and a multi-element landscape assessment approach is adopted to explore conservation strategies for the landscape character. First, multiple fields and elements of geography, nature, and the humanities were selected as the basis of assessment according to the scales. Second, the study takes a parametric approach to the classification and partitioning of landscape character, applying Principal Component Analysis and two-stage cluster analysis (k-means and GMM) in MATLAB to obtain the LCTs, combined with the Canny edge-detection algorithm to obtain landscape character contours; the LCTs and LCAs were then corrected through field surveys and manual identification. Finally, the study adopts the landscape sensitivity assessment method to perform landscape character conservation analysis and formulates five strategies for different LCAs: conservation, enhancement, restoration, creation, and combination. This multi-scale identification approach can efficiently integrate multiple types of landscape character elements, reduce the difficulty of broad-scale operations in the process of landscape character conservation, and provide a basis for landscape character conservation strategies.
Based on the natural background and the restoration of regional characteristics, the results of the landscape character assessment are scientific and objective and can provide a strong reference for regional- and national-scale territorial spatial planning.
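
The first stage of the two-stage cluster analysis the study runs in MATLAB is k-means on per-unit landscape attributes; a toy pure-Python equivalent on fabricated 2-D attribute vectors (k = 2) might look like:

```python
import math

# Minimal k-means on fabricated 2-D "landscape attribute" vectors
# (e.g. elevation, forest cover). Not the study's MATLAB pipeline.

def kmeans(points, centroids, iters=10):
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        # Assignment step: each cell joins its nearest centroid.
        for p in points:
            idx = min(range(len(centroids)),
                      key=lambda i: math.dist(p, centroids[i]))
            clusters[idx].append(p)
        # Update step: move each centroid to its cluster mean.
        centroids = [
            tuple(sum(c[d] for c in cl) / len(cl) for d in range(2)) if cl else cen
            for cl, cen in zip(clusters, centroids)
        ]
    return centroids, clusters

# Two obvious attribute groups; initial centroids taken from the data.
points = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centroids, clusters = kmeans(points, [(0.0, 0.1), (5.0, 5.1)])
print([len(c) for c in clusters])  # two clusters of three cells each
```

In the study's second stage, a Gaussian mixture model refines such hard assignments into probabilistic ones before the LCT contours are extracted.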

Keywords: parameterization, multi-scale, landscape character identify, landscape character assessment

Procedia PDF Downloads 92
12598 Virtualization of Biomass Colonization: Potential of Application in Precision Medicine

Authors: Maria Valeria De Bonis, Gianpaolo Ruocco

Abstract:

Nowadays, computational modeling is paving new design and verification ways in a number of industrial sectors. The technology is ripe to take on cases in the bioengineering and medicine frameworks: for example, given the strategic and ethical importance of oncology research, efforts should be made to yield new and powerful resources for tumor knowledge and understanding. With these driving motivations, we approach this gigantic problem using standard engineering tools such as the mathematics behind biomass transfer. We present here some bacterial colonization studies in complex structures. As strong analogies hold with some forms of tumor proliferation, we extend our study to a benchmark case of a solid tumor. By means of commercial software, we model biomass and energy evolution in arbitrary media. The approach will be useful for casting virtualization cases of cancer growth in human organs, while augmented reality tools will be used to provide realistic aid to informed decisions in treatment and surgery.
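
The mathematics behind biomass transfer can be illustrated with its simplest growth law, logistic growth, integrated by explicit Euler; the rate and capacity below are illustrative, not values from the bacterial or tumor studies:

```python
# Explicit-Euler integration of logistic biomass growth dN/dt = r*N*(1 - N/K).
# Full colonization models couple such a growth term with spatial (diffusive)
# transport; the rate r and carrying capacity K here are illustrative.

def logistic_growth(n0, r, k, dt, steps):
    n = n0
    for _ in range(steps):
        n += dt * r * n * (1.0 - n / k)
    return n

# Small colony growing toward carrying capacity K = 1.0 over t = 20 time units.
final = logistic_growth(n0=0.01, r=1.0, k=1.0, dt=0.01, steps=2000)
print(round(final, 3))  # approaches the carrying capacity
```

The same sigmoidal saturation underlies the analogy to solid-tumor proliferation the abstract draws on, with the carrying capacity set by nutrient supply rather than substrate area.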

Keywords: bacteria, simulation, tumor, precision medicine

Procedia PDF Downloads 330
12597 Learning Dynamic Representations of Nodes in Temporally Variant Graphs

Authors: Sandra Mitrovic, Gaurav Singh

Abstract:

In many industries, including telecommunications, churn prediction has been a topic of active research. A lot of attention has been paid to devising the most informative features, and this area of research has gained even more focus with the spread of (social) network analytics. Call detail records (CDRs) have been used to construct customer networks and extract potentially useful features. However, to the best of our knowledge, no studies including network features have yet proposed a generic way of representing network information; instead, ad-hoc and dataset-dependent solutions have been suggested. In this work, we build upon a recently presented method (node2vec) to obtain representations for nodes in an observed network. The proposed approach is generic and applicable to any network and domain. Unlike node2vec, which assumes a static network, we consider a dynamic, time-evolving network. To account for this, we propose an approach that constructs the feature representation of each node by generating its node2vec representations at different timestamps, concatenating them, and finally compressing them using an auto-encoder-like method in order to retain reasonably compact yet informative feature vectors. We test the proposed method on a churn prediction task in the telco domain. To predict churners at timestamp ts+1, we construct training and testing datasets consisting of feature vectors from the time intervals [t1, ts-1] and [t2, ts], respectively, and use traditional supervised classification models such as SVM and logistic regression. The observed results show the effectiveness of the proposed approach compared to ad-hoc feature-selection-based approaches and static node2vec.
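
The per-timestamp embed-and-concatenate idea can be sketched as follows; uniform random walks (the p = q = 1 special case of node2vec's walk sampler) and a visit-count profile stand in for the full node2vec-plus-autoencoder pipeline:

```python
import random

# Uniform random walks per graph snapshot, then concatenation of each node's
# per-timestamp vectors. The "embedding" is a normalized visit-count profile,
# a stand-in for the vectors a skip-gram model would learn from the walks.

def random_walks(adj, walk_len=10, walks_per_node=20, seed=0):
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_len and adj[walk[-1]]:
                walk.append(rng.choice(adj[walk[-1]]))
            walks.append(walk)
    return walks

def visit_profile(adj, node):
    counts = {n: 0 for n in adj}
    for walk in random_walks(adj):
        if walk[0] == node:
            for visited in walk:
                counts[visited] += 1
    total = sum(counts.values()) or 1
    return [counts[n] / total for n in sorted(adj)]

# Two snapshots of a 3-node graph: the edge (a, c) appears at the second one.
t1 = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
t2 = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
dynamic_vec = visit_profile(t1, "a") + visit_profile(t2, "a")  # concatenation
print(len(dynamic_vec))  # 3 dims per timestamp -> 6
```

The auto-encoder-like compression step would then map this growing concatenation back down to a fixed dimensionality before it is fed to the SVM or logistic regression classifier.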

Keywords: churn prediction, dynamic networks, node2vec, auto-encoders

Procedia PDF Downloads 310
12596 Production of New Hadron States in Effective Field Theory

Authors: Qi Wu, Dian-Yong Chen, Feng-Kun Guo, Gang Li

Abstract:

In the past decade, a growing number of new hadron states, dubbed the XYZ states, have been observed in the heavy quarkonium mass regions. In this work, we present our study of the production of some of these new hadron states. In particular, we investigate the processes Υ(5S,6S)→ Zb(10610)/Zb(10650)π, Bc→ Zc(3900)/Zc(4020)π and Λb→ Pc(4312)/Pc(4440)/Pc(4457)K. (1) For the production of Zb(10610)/Zb(10650) from Υ(5S,6S) decay, two types of bottom-meson loops were discussed within a nonrelativistic effective field theory. We found that the loop contributions with all intermediate states being the S-wave ground-state bottom mesons are negligible, while the loops with one bottom meson being the broad B₀* or B₁' resonance could provide the dominant contributions to Υ(5S)→ Zb(')π. (2) For the production of Zc(3900)/Zc(4020) from Bc decay, the branching ratios of Bc⁺→ Zc(3900)⁺π⁰ and Bc⁺→ Zc(4020)⁺π⁰ are estimated to be of order 10⁻⁴ and 10⁻⁷, respectively, in an effective Lagrangian approach. The large production rate of Zc(3900) could provide an important source of the Zc(3900) production from the semi-exclusive decay of b-flavored hadrons reported by the D0 Collaboration, which can be tested by exclusive measurements at LHCb. (3) For the production of Pc(4312), Pc(4440) and Pc(4457) from Λb decay, the ratios of the branching fractions of Λb→ Pc K were predicted in a molecular scenario using an effective Lagrangian approach, and are weakly dependent on our model parameter. We also find that the ratios of the products of the branching fractions of Λb→ Pc K and Pc→ J/ψp can be well interpreted in the molecular scenario. Moreover, the estimated branching fractions of Λb→ Pc K are of order 10⁻⁶, which could be tested by further measurements by the LHCb Collaboration.

Keywords: effective Lagrangian approach, hadron loops, molecular states, new hadron states

Procedia PDF Downloads 129
12595 The Case for Strategic Participation: How Facilitated Engagement Can Be Shown to Reduce Resistance and Improve Outcomes Through the Use of Strategic Models

Authors: Tony Mann

Abstract:

This paper sets out the case for involving and engaging employees, workers, stakeholders and staff in any significant change being considered by the senior executives of an organization. It establishes the rationale, the approach, the methodology of engagement and the benefits of a participative approach. It challenges the new norm of imposing change for fear of resistance and instead suggests that involving people produces better outcomes and a longer-lasting impact. Various strategic models are introduced and illustrated to explain how the process can be most effective. The paper highlights one model in particular (the Process Iceberg® Organizational Change model) that has proven instrumental in developing effective change. Its use is demonstrated in its various forms, explaining why so much change fails to address the key elements and how we can be more productive in managing change. ‘Participation’ in change is too often seen as negative, expensive and unwieldy. The paper aims to show that another model, UIA = O + E, can offset the difficulties and, in fact, produce much more positive and effective change.

Keywords: facilitation, stakeholders, buy-in, digital workshops

Procedia PDF Downloads 98
12594 In-Service High School Teachers’ Experiences of the Blended Teaching Approach to Mathematics

Authors: Lukholo Raxangana

Abstract:

Fourth Industrial Revolution (4IR)-era teaching offers in-service mathematics teachers opportunities to use blended approaches to engage learners while teaching mathematics. This study explores in-service high school teachers' experiences with a blended teaching approach to mathematics. This qualitative case study involved eight in-service teachers from four selected schools in the Sedibeng West District of Gauteng Province. The study used the community of inquiry model as its analytical framework. Data were collected through semi-structured interviews and focus-group discussions to explore in-service teachers' experiences of the influence of blended teaching (BT) on learning mathematics. The study's findings concern the impact of load-shedding, the benefits of BT, and in-service teachers' perceptions of the hindrances to BT. Based on these findings, the study recommends that further research focus on developing data-free BT tools to assist during load-shedding, regardless of location.

Keywords: blended teaching, teachers, in-service, mathematics

Procedia PDF Downloads 56
12593 Adsorption of Basic Dyes Using Activated Carbon Prepared from Date Palm Fibre

Authors: Riham Hazzaa, Mohamed Hussien Abd El Megid

Abstract:

Dyes are toxic and cause severe problems in the aquatic environment. Agricultural solid wastes are considered low-cost and eco-friendly adsorbents for removing dyes from wastewater. Date palm fibre, an abundant agricultural by-product in Egypt, was used to prepare activated carbon by a physical activation method. This study investigates the use of date palm fibre (DPF) and activated carbon (DPFAC) for the removal of a basic dye, methylene blue (MB), from simulated wastewater. The effects of temperature, solution pH, initial dye concentration, adsorbent dosage and contact time were studied. The experimental equilibrium adsorption data were analyzed with the Langmuir, Freundlich, Temkin, Dubinin–Radushkevich and Harkins–Jura isotherms. Adsorption kinetics data were modeled using the pseudo-first-order, pseudo-second-order and Elovich equations. The mechanism of the adsorption process was determined from the intraparticle diffusion model. The results revealed that as the initial dye concentration, amount of adsorbent and temperature increased, the percentage of dye removal increased. The optimum pH for maximum removal was found to be 6. The adsorption of methylene blue was better described by the pseudo-second-order equation. The results indicate that DPFAC and DPF could be alternatives to more costly adsorbents used for dye removal.
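As a rough illustration of the isotherm analysis described above, the linearized Langmuir form Ce/qe = Ce/q_max + 1/(K_L·q_max) can be fitted with an ordinary least-squares line; the equilibrium data and parameters below are invented for the sketch, not the study's measurements:

```python
import numpy as np

# Hypothetical equilibrium data for MB adsorption (Ce in mg/L, qe in mg/g);
# the "true" Langmuir parameters q_max and K_L are illustrative assumptions.
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
q_max_true, K_L_true = 120.0, 0.15
qe = q_max_true * K_L_true * Ce / (1.0 + K_L_true * Ce)

# Linearized Langmuir form: Ce/qe = Ce/q_max + 1/(K_L * q_max),
# so a straight-line fit of Ce/qe against Ce recovers both parameters.
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
q_max_fit = 1.0 / slope
K_L_fit = slope / intercept

print(f"q_max = {q_max_fit:.1f} mg/g, K_L = {K_L_fit:.3f} L/mg")
```

On real data the fit would be imperfect, and the correlation coefficient of this line is one common way to compare the Langmuir model against the other isotherms listed.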

Keywords: adsorption, basic dye, palm fiber, activated carbon

Procedia PDF Downloads 328
12592 Usage of Military Spending, Debt Servicing and Growth for Dealing with Emergency Plan of Indian External Debt

Authors: Sahbi Farhani

Abstract:

This study investigates the relationship between external debt and military spending in the case of India over the period 1970–2012. In doing so, we apply structural break unit root tests to examine the stationarity properties of the variables. The Auto-Regressive Distributed Lag (ARDL) bounds testing approach is used to test whether cointegration exists in the presence of structural breaks in the series. Our results indicate cointegration among external debt, military spending, debt servicing, and economic growth. Moreover, military spending and debt servicing add to external debt, while economic growth helps in lowering it. The Vector Error Correction Model (VECM) analysis and the Granger causality test reveal that military spending and economic growth cause external debt. A feedback effect also exists between external debt and debt servicing in the case of India.
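The cointegration and error-correction logic behind the analysis can be sketched in a few lines of Engle-Granger-style estimation; the simulated series, parameter values, and two-step least-squares procedure below are illustrative assumptions, not the ARDL/VECM machinery or the Indian data used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
# Simulated illustration (not the study's data): "spending" follows a random
# walk, and "debt" error-corrects toward the long-run relation
# debt = 1 + 2 * spending, which makes the two series cointegrated.
spending = np.cumsum(rng.normal(size=n))
debt = np.empty(n)
debt[0] = 1.0
for t in range(1, n):
    equilibrium_gap = debt[t - 1] - (1.0 + 2.0 * spending[t - 1])
    debt[t] = debt[t - 1] - 0.3 * equilibrium_gap + rng.normal(scale=0.5)

# Step 1 (Engle-Granger style): estimate the long-run relation by OLS.
X = np.column_stack([np.ones(n), spending])
beta, *_ = np.linalg.lstsq(X, debt, rcond=None)
resid = debt - X @ beta

# Step 2: regress d(debt) on the lagged residual; a significantly negative
# coefficient signals error correction back to the long-run equilibrium.
Z = np.column_stack([np.ones(n - 1), resid[:-1]])
gamma, *_ = np.linalg.lstsq(Z, np.diff(debt), rcond=None)
print(f"long-run slope = {beta[1]:.2f}, adjustment = {gamma[1]:.2f}")
```

The recovered adjustment coefficient is close to the simulated −0.3, which is the same qualitative signature a VECM reports when cointegration holds.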

Keywords: external debt, military spending, ARDL approach, India

Procedia PDF Downloads 290
12591 An Investigation into the Current Implementation of Design-Build Contracts in the Kingdom of Saudi Arabia

Authors: Ibrahim A. Alhammad, Suleiman A. Al-Otaibi, Khalid S. Al-Gahtani, Naïf Al-Otaibi, Abdulaziz A. Bubshait

Abstract:

In the last decade, the use of the design-build project delivery system for engineering contracts has been increasing in North America, owing to its potential to reduce project duration and minimize costs. The shift from the traditional design-bid-build approach to design-build contracts has been attributed to many factors, such as the evolution of the regulatory and legal frameworks governing engineering contracts and improvements in integrating design and construction. While this form of contracting is well established in North America, the traditional approach to construction contracting remains dominant in Saudi Arabia. The authors believe a number of factors relate to the gaps in the level of sophistication of the engineering and management of construction projects in the two countries. As a step towards improving Saudi construction practice by adopting this trend in construction contracting, this paper identifies the reasons why the design-build form of contracting is not frequently utilized. A field survey, including a questionnaire addressing the research problem, was distributed to the three main parties to construction contracts: clients, consultants, and contractors. The analyzed data were statistically sufficient to identify the reasons for not adopting the design-build approach in Saudi Arabia: (1) a lack of regulation and a legal framework; (2) the absence of clear owner criteria for the trade-off between competing contractors; and (3) a lack of experience, knowledge and skill.

Keywords: design-build projects, Saudi Arabia, GCC, mega projects

Procedia PDF Downloads 216
12590 Message Passing Neural Network (MPNN) Approach to Multiphase Diffusion in Reservoirs for Well Interconnection Assessments

Authors: Margarita Mayoral-Villa, J. Klapp, L. Di G. Sigalotti, J. E. V. Guzmán

Abstract:

Automated learning techniques are widely applied in the energy sector to address challenging problems from a practical point of view. To this end, we discuss the implementation of a Message Passing Neural Network (MPNN) within a Graph Neural Network (GNN) framework to leverage the neighborhood of a set of nodes during the aggregation process. This approach enables the characterization of multiphase diffusion processes in the reservoir, such that the flow paths underlying the interconnections between multiple wells may be inferred from previously available data on flow rates and bottomhole pressures. The results thus obtained compare favorably with the predictions produced by Reduced Order Capacitance-Resistance Models (CRM) and suggest the potential of MPNNs to enhance the robustness of the forecasts while improving computational efficiency.
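A single message-passing round of the kind the abstract describes can be sketched as follows; the well graph, feature dimensions, and weight matrices are all invented for illustration and are not the architecture actually trained in the study:

```python
import numpy as np

rng = np.random.default_rng(42)
# Toy well graph: 4 nodes with 2 features each (e.g. flow rate and
# bottomhole pressure); both the topology and the weights are assumptions.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # undirected adjacency matrix
H = rng.normal(size=(4, 2))                 # node feature matrix
W_msg = rng.normal(size=(2, 2))             # message transform
W_self = rng.normal(size=(2, 2))            # self (update) transform

# One MPNN round: each node aggregates transformed neighbour features
# (sum over neighbours via A), then updates its own hidden state.
messages = A @ (H @ W_msg)
H_next = np.tanh(H @ W_self + messages)
print(H_next.shape)  # (4, 2)
```

Stacking several such rounds lets information from k-hop neighbours reach each well, which is how interconnection structure can be inferred from node-level measurements.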

Keywords: multiphase diffusion, message passing neural network, well interconnection, interwell connectivity, graph neural network, capacitance-resistance models

Procedia PDF Downloads 143
12589 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach

Authors: Jean Berger, Nassirou Lo, Martin Noel

Abstract:

Perfectly suited to natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target, multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from Lagrangean relaxation of the integrality constraints. Should a target eventually be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
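The objective being maximized, the cumulative probability of detection 1 − ∏(1 − p(cell)) along a path, can be illustrated with a toy single-agent brute-force search. The grid and detection probabilities below are invented, repeated visits are treated as independent glimpses, and the study itself solves the multi-agent case with a MIP in CPLEX rather than by enumeration:

```python
from itertools import product

# Hypothetical 2x2 search grid: per-visit probability of detecting the
# target in each cell (independent glimpses on repeated visits).
P = {(0, 0): 0.05, (0, 1): 0.10, (1, 0): 0.20, (1, 1): 0.40}
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

def detection_prob(path):
    """Cumulative detection probability: 1 - prod(1 - p) over visits."""
    miss = 1.0
    for cell in path:
        miss *= 1.0 - P[cell]
    return 1.0 - miss

def best_path(start, steps):
    """Enumerate all move sequences and keep the best feasible path."""
    best, best_p = None, -1.0
    for moves in product(MOVES, repeat=steps):
        path, ok = [start], True
        for dx, dy in moves:
            nxt = (path[-1][0] + dx, path[-1][1] + dy)
            if nxt not in P:          # stepped off the grid
                ok = False
                break
            path.append(nxt)
        if ok and detection_prob(path) > best_p:
            best, best_p = path, detection_prob(path)
    return best, best_p

path, p = best_path((0, 0), 3)
print(path, round(p, 3))
```

Enumeration explodes combinatorially with horizon length and team size, which is exactly why a MIP formulation with a network representation and a Lagrangean bound is valuable.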

Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization

Procedia PDF Downloads 366
12588 Simplified Linear Regression Model to Quantify the Thermal Resilience of Office Buildings in Three Different Power Outage Day Times

Authors: Nagham Ismail, Djamel Ouahrani

Abstract:

Thermal resilience in the built environment reflects the building's capacity to adapt to extreme climate changes. In hot climates, power outages in office buildings pose risks to the health and productivity of workers. Therefore, it is of interest to quantify the thermal resilience of office buildings by developing a user-friendly simplified model. This simplified model begins with creating an assessment metric of thermal resilience that measures the duration between the power outage and the point at which the thermal habitability condition is compromised, considering different power interruption times (morning, noon, and afternoon). In this context, energy simulations of an office building are conducted for Qatar's summer weather by changing different parameters that are related to the (i) wall characteristics, (ii) glazing characteristics, (iii) load, (iv) orientation and (v) air leakage. The simulation results are processed using SPSS to derive linear regression equations, aiding stakeholders in evaluating the performance of commercial buildings during different power interruption times. The findings reveal the significant influence of glazing characteristics on thermal resilience, with the morning power outage scenario posing the most detrimental impact in terms of the shortest duration before compromising thermal resilience.
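The kind of simplified linear model the paper derives can be sketched with ordinary least squares; the predictors, coefficients, and data below are invented placeholders, not the fitted SPSS model from the study:

```python
import numpy as np

# Illustrative regression: predict the habitability duration (hours between a
# power outage and loss of thermal habitability) from envelope parameters.
# All variable names, ranges, and "true" coefficients are assumptions.
rng = np.random.default_rng(1)
n = 200
wwr = rng.uniform(0.2, 0.8, n)       # window-to-wall ratio
u_wall = rng.uniform(0.3, 2.5, n)    # wall U-value, W/m2K
ach = rng.uniform(0.2, 1.5, n)       # air leakage, air changes per hour
hours = 10.0 - 6.0 * wwr - 1.5 * u_wall - 2.0 * ach + rng.normal(0, 0.2, n)

# Fit hours = b0 + b1*wwr + b2*u_wall + b3*ach by least squares.
X = np.column_stack([np.ones(n), wwr, u_wall, ach])
coef, *_ = np.linalg.lstsq(X, hours, rcond=None)
print(np.round(coef, 2))  # ~[10.0, -6.0, -1.5, -2.0]
```

In the paper's setting, a separate equation of this form would be fitted for each power-interruption time (morning, noon, afternoon), so stakeholders can evaluate a building without rerunning full energy simulations.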

Keywords: thermal resilience, thermal envelope, energy modeling, building simulation, thermal comfort, power disruption, extreme weather

Procedia PDF Downloads 66
12587 Transferring Cultural Meanings: A Case of Translation Classroom

Authors: Ramune Kasperaviciene, Jurgita Motiejuniene, Dalia Venckiene

Abstract:

Familiarising students with strategies for transferring cultural meanings (intertextual units, culture-specific idioms, culture-specific items, etc.) should be part of a comprehensive translator training programme. The present paper focuses on strategies for transferring such meanings into other languages and explores possibilities for introducing these methods and practice to translation students. The authors (university translation teachers) analyse the means of transferring cultural meanings from English into Lithuanian in a specific travel book, attribute these means to theoretically grounded strategies, and make calculations related to the frequency of adoption of specific strategies; translation students are familiarised with concepts and methods related to transferring cultural meanings and asked to put their theoretical knowledge into practice, i.e. interpret and translate certain culture-specific items from the same source text, and ground their decisions on theory; the comparison of the strategies employed by the professional translator of the source text (as identified by the authors of this study) and by the students is made. As a result, both students and teachers gain valuable experience, and new practices of conducting translation classes for a specific purpose evolve. Conclusions highlight the differences and similarities of non-professional and professional choices, summarise the possibilities for introducing methods of transferring cultural meanings to students, and round up with specific considerations of the impact of theoretical knowledge and the degree of experience on decisions made in the translation process.

Keywords: cultural meanings, culture-specific items, strategies for transferring cultural meanings, translator training

Procedia PDF Downloads 338
12586 Revolutionizing Legal Drafting: Leveraging Artificial Intelligence for Efficient Legal Work

Authors: Shreya Poddar

Abstract:

Legal drafting and revising are recognized as highly demanding tasks for legal professionals. This paper introduces an approach to automate and refine these processes through the use of advanced Artificial Intelligence (AI). The method employs Large Language Models (LLMs), with a specific focus on 'Chain of Thoughts' (CoT) and knowledge injection via prompt engineering. This approach differs from conventional methods that depend on comprehensive training or fine-tuning of models with extensive legal knowledge bases, which are often expensive and time-consuming. The proposed method incorporates knowledge injection directly into prompts, thereby enabling the AI to generate more accurate and contextually appropriate legal texts. This approach substantially decreases the necessity for thorough model training while preserving high accuracy and relevance in drafting. Additionally, the concept of guardrails is introduced. These are predefined parameters or rules established within the AI system to ensure that the generated content adheres to legal standards and ethical guidelines. The practical implications of this method for legal work are considerable. It has the potential to markedly lessen the time lawyers allocate to document drafting and revision, freeing them to concentrate on more intricate and strategic facets of legal work. Furthermore, this method makes high-quality legal drafting more accessible, possibly reducing costs and expanding the availability of legal services. This paper will elucidate the methodology, providing specific examples and case studies to demonstrate the effectiveness of 'Chain of Thoughts' and knowledge injection in legal drafting. The potential challenges and limitations of this approach will also be discussed, along with future prospects and enhancements that could further advance legal work. The impact of this research on the legal industry is substantial. 
The adoption of AI-driven methods by legal professionals can lead to enhanced efficiency, precision, and consistency in legal drafting, thereby altering the landscape of legal work. This research adds to the expanding field of AI in law, introducing a method that could significantly alter the nature of legal drafting and practice.
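A minimal sketch of prompt-level knowledge injection combined with a guardrail check might look as follows; the prompt template, function names, and banned-term list are all hypothetical, and no specific LLM API is assumed:

```python
# Sketch only: knowledge is injected directly into the prompt (no model
# fine-tuning), and a guardrail screens generated drafts afterwards.
def build_drafting_prompt(task: str, injected_clauses: list) -> str:
    knowledge = "\n".join(f"- {c}" for c in injected_clauses)
    return (
        "You are drafting a legal document.\n"
        f"Relevant clauses (injected knowledge):\n{knowledge}\n"
        "Think step by step (chain of thought): identify the parties, the "
        "obligations, and the governing clauses before drafting.\n"
        f"Task: {task}\n"
    )

# Hypothetical guardrail: terms a firm's policy might forbid in a draft.
BANNED = ["guarantee of outcome", "unlimited liability waiver"]

def passes_guardrails(draft: str) -> bool:
    return not any(term in draft.lower() for term in BANNED)

prompt = build_drafting_prompt(
    "Draft a confidentiality clause for a consulting agreement.",
    ["Definition of Confidential Information", "Term of 3 years"],
)
print(passes_guardrails("The parties agree to keep information confidential."))
```

In practice the prompt would be sent to an LLM and the returned draft screened by the guardrail before a lawyer reviews it; the point of the sketch is that both the chain-of-thought instruction and the injected clauses live in the prompt itself.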

Keywords: AI-driven legal drafting, legal automation, future of legal work, large language models

Procedia PDF Downloads 52
12585 Feasibility of Washing/Extraction Treatment for the Remediation of Deep-Sea Mining Tailings

Authors: Kyoungrean Kim

Abstract:

The importance of deep-sea mineral resources is increasing dramatically as land mineral resources are depleted by growing human economic activity. Korea has acquired exclusive exploration licenses in four areas: the Clarion-Clipperton Fracture Zone in the Pacific Ocean (2002), Tonga (2008), Fiji (2011) and the Indian Ocean (2014). Commercial mining by Nautilus Minerals (Canada) and Lockheed Martin (USA) is expected to be prepared by 2020. The London Protocol 1996 (LP) under the International Maritime Organization (IMO) and the International Seabed Authority (ISA) are expected to set environmental guidelines for deep-sea mining by 2020 to protect the marine environment. In this research, the applicability of washing/extraction treatment for the remediation of deep-sea mining tailings was evaluated in order to present preliminary data for developing practical remediation technology in the near future. Polymetallic nodule samples were collected in the Clarion-Clipperton Fracture Zone in the Pacific Ocean and stored at room temperature. Samples were pulverized using a jaw crusher and a ball mill and then classified into three particle sizes (> 63 µm, 63-20 µm, < 20 µm) using vibratory sieve shakers (Analysette 3 Pro, Fritsch, Germany) with 63 µm and 20 µm sieves. Only the 63-20 µm fraction was used in the investigation, considering the lower limit of the ore dressing process, which is tens of µm to 100 µm. Rhamnolipid and sodium alginate, as biosurfactants, and aluminum sulfate, which is mainly used as a flocculant, were chosen as environmentally friendly additives. Samples were adjusted to 2% with deionized water and then mixed with various concentrations of additives. The mixture was stirred with a magnetic bar for specific reaction times, and the liquid phase was then separated in a centrifugal separator (Thermo Fisher Scientific, USA) at 4,000 rpm for 1 h. The separated liquid was filtered with a syringe and an acrylic-based filter (0.45 µm).
The extracted heavy metals in the filtered liquid were then determined using a UV-Vis spectrometer (DR-5000, Hach, USA) and a heat block (DBR 200, Hach, USA), following US EPA methods (8506, 8009, 10217 and 10220). The polymetallic nodules were mainly composed of manganese (27%), iron (8%), nickel (1.4%), copper (1.3%), cobalt (1.3%) and molybdenum (0.04%). Based on the remediation standards of various countries, nickel (Ni), copper (Cu), cadmium (Cd) and zinc (Zn) were selected as the primary target materials. Throughout this research, the use of rhamnolipid was shown to be an effective approach for removing heavy metals from samples originating from manganese nodules. Sodium alginate might also be an effective additive for the remediation of deep-sea mining tailings such as polymetallic nodules. Compared to rhamnolipid and sodium alginate, aluminum sulfate was the more effective additive at short reaction times (within 4 h). Based on these results, a sequence of particle separation, selective extraction/washing, advanced filtration of the liquid phase, water treatment without dewatering, and solidification/stabilization may be considered candidate technologies for the remediation of deep-sea mining tailings.

Keywords: deep-sea mining tailings, heavy metals, remediation, extraction, additives

Procedia PDF Downloads 151