Search results for: artificial potential approach

21494 Unlocking the Potential of Neglected Cereal Resources Waste: Exploring Functional Properties of Algerian Pearl Millet Starch via Wet Milling and Ultrasound Techniques

Authors: Sarra Bouhallel, Sara Legbedj, Rima Messaoud, Sofia Saffarbatti

Abstract:

In the context of global waste management and sustainable resource utilization, millets emerge as a vital yet underutilized cereal resource. Despite their exceptional nutritional profile and resilience to harsh environmental conditions, their potential remains largely untapped. This study aims to contribute to the valorization of seven Algerian pearl millet landraces (Pennisetum glaucum (L.) R. Br.) from the southern region by focusing on the characterization of their starches. Utilizing both conventional wet milling, incorporating sodium azide as a microbial growth inhibitor, and a novel green technology, ultrasound-assisted isolation, we explore avenues for enhancing the functional properties of these starches. Analysis of key functional properties such as swelling power and water solubility index reveals significant enhancements, particularly during heat treatment near the gelatinization temperature (70-80 °C). Furthermore, our investigation into the influence of pre-treatment methods on isolated starches highlights the potential of ultrasound-assisted isolation in reducing absorbency and water solubility compared to conventional methods. Through rigorous data analysis using SPSS software (Version 23), we ascertain the efficiency of ultrasound-assisted isolation, underscoring its promising role in the valorization of pearl millet waste. This research not only sheds light on the functional properties of pearl millet starch but also underscores the imperative of sustainable waste management in harnessing the full potential of underutilized cereal resources.
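
The swelling power and water solubility index discussed above are conventionally derived from simple gravimetric measurements. As an illustration only, the sketch below follows the common gravimetric convention; the formulas and sample weights are not taken from the paper itself.

```python
def water_solubility_index(dry_sample_g: float, dissolved_solids_g: float) -> float:
    """WSI (%) = dissolved solids recovered from the supernatant / dry sample weight * 100."""
    return 100.0 * dissolved_solids_g / dry_sample_g

def swelling_power(dry_sample_g: float, dissolved_solids_g: float, wet_sediment_g: float) -> float:
    """Swelling power (g/g) = wet sediment weight / weight of sample that did not dissolve."""
    return wet_sediment_g / (dry_sample_g - dissolved_solids_g)

# Hypothetical numbers for a starch suspension heated near gelatinization (~75 °C)
print(water_solubility_index(1.0, 0.08))   # ~8 % WSI
print(swelling_power(1.0, 0.08, 12.5))     # ~13.6 g/g
```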

Keywords: isolation, solubility, starch, swelling, ultrasound

Procedia PDF Downloads 47
21493 Worm Gearing Design Improvement by Considering Varying Mesh Stiffness

Authors: A. H. Elkholy, A. H. Falah

Abstract:

A new approach has been developed to estimate the load share and stress distribution of worm gear sets. The approach is based upon considering the instantaneous tooth meshing stiffness, where the worm gear drive was modelled as a series of spur gear slices, and each slice was analyzed separately using the well-established formulae of spur gears. By combining the results obtained for all slices, the loading and stressing of the entire involute worm gear set was obtained. The geometric modelling method presented allows tooth elastic deformation and tooth root stresses of worm gear drives under different load conditions to be investigated. On the basis of the method introduced in this study, the instantaneous meshing stiffness and load share were obtained. In comparison with existing methods, this approach offers both good analysis accuracy and less computing time.
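
The slicing idea in the abstract lends itself to a simple numerical illustration: if each spur-gear slice is assigned an instantaneous mesh stiffness, the load share of a slice follows from treating the slices as springs in parallel under a common deflection. The sketch below is a schematic of that reasoning only; the stiffness values and slice count are hypothetical, not the authors' data.

```python
import numpy as np

def load_share(slice_stiffness_n_per_mm: np.ndarray, total_load_n: float) -> np.ndarray:
    """Slices act like parallel springs under a common mesh deflection,
    so each slice carries load in proportion to its stiffness."""
    k = np.asarray(slice_stiffness_n_per_mm, dtype=float)
    return total_load_n * k / k.sum()

# Hypothetical stiffness distribution along the contact line (5 slices)
k_slices = np.array([120.0, 180.0, 210.0, 175.0, 115.0])
shares = load_share(k_slices, total_load_n=1000.0)
deflection_mm = 1000.0 / k_slices.sum()      # common mesh deflection
print(shares, deflection_mm)
```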

Keywords: gear, load/stress distribution, worm, wheel, tooth stiffness, contact line

Procedia PDF Downloads 340
21492 Polymer Mediated Interaction between Grafted Nanosheets

Authors: Supriya Gupta, Paresh Chokshi

Abstract:

Polymer-particle interactions can be effectively utilized to produce composites that possess physicochemical properties superior to those of the neat polymer. The incorporation of fillers with dimensions comparable to the polymer chain size produces composites with extraordinary properties owing to the very high surface-to-volume ratio. The dispersion of nanoparticles is achieved by inducing steric repulsion realized by grafting particles with polymeric chains. A comprehensive understanding of the interparticle interaction between these functionalized nanoparticles plays an important role in the synthesis of a stable polymer nanocomposite. With the focus on incorporation of clay sheets in a polymer matrix, we theoretically construct the polymer-mediated interparticle potential for two nanosheets grafted with polymeric chains. The self-consistent field theory (SCFT) is employed to obtain the inhomogeneous composition field under equilibrium. Unlike the continuum models, SCFT is built from a microscopic description, taking into account the molecular interactions contributed by both intra- and inter-chain potentials. We present the results of SCFT calculations of the interaction potential curve for two grafted nanosheets immersed in a matrix of polymeric chains chemically dissimilar to the grafted chains. The interaction potential is repulsive at short separation and shows depletion attraction at moderate separations induced by high grafting density. It is found that the strength of the attraction well can be tuned by altering the compatibility between the grafted and the mobile chains. Further, we construct the interaction potential between two nanosheets grafted with diblock copolymers with one of the blocks being chemically identical to the free polymeric chains. The interplay between the enthalpic interaction between the dissimilar species and the entropy of the free chains gives rise to a rich behavior in the interaction potential curves obtained for two separate cases: free chains chemically similar to either the grafted block or the free block of the grafted diblock chains.
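
In practice, an interaction potential curve of this kind is assembled point by point: for each plate separation D, the equilibrium free energy F(D) is computed and referenced against the free energy at large separation, U(D) = F(D) - F(D_ref). The sketch below shows only that bookkeeping step; the `scft_free_energy` function is a toy stand-in for a full SCFT solver and is not part of the paper.

```python
import numpy as np

def scft_free_energy(separation: float) -> float:
    """Placeholder for a self-consistent field calculation at a given
    nanosheet separation; a real solver would return the equilibrium
    free energy of the grafted-chain/matrix system."""
    # Toy functional form used purely to exercise the bookkeeping below.
    return 1.0 / separation**9 - 0.5 * np.exp(-(separation - 2.0) ** 2)

def interaction_potential(separations: np.ndarray, d_ref: float = 50.0) -> np.ndarray:
    """U(D) = F(D) - F(D_ref): repulsive where positive, attractive where negative."""
    f_ref = scft_free_energy(d_ref)
    return np.array([scft_free_energy(d) - f_ref for d in separations])

d = np.linspace(0.8, 6.0, 30)
print(interaction_potential(d))   # short-range repulsion, attraction well at moderate D
```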

Keywords: clay nanosheets, polymer brush, polymer nanocomposites, self-consistent field theory

Procedia PDF Downloads 248
21491 Data-Driven Dynamic Overbooking Model for Tour Operators

Authors: Kannapha Amaruchkul

Abstract:

We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic- and service-based. In the economic-based policy, we minimize the expected oversold and underused cost, whereas, in the service-based policy, we ensure that the probability of an oversold situation does not exceed a pre-specified threshold. To illustrate the applicability of our approach, we use tour package data from 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we tested the proposed overbooking policy on 2019 data. We also compare the data-driven approach to the conventional approach of fitting data to a probability distribution.
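
A minimal Monte Carlo sketch of the two policy ideas described above is given here; the numbers (group sizes, cancellation probabilities, costs, capacity) are hypothetical and only illustrate how an overbooking limit could be screened against the expected oversold/underused cost and the oversell-probability constraint. It is not the paper's robust optimization model.

```python
import numpy as np

rng = np.random.default_rng(0)
CAPACITY, C_OVERSOLD, C_UNDERUSED = 40, 150.0, 60.0   # hypothetical values

def simulate_shows(booking_limit, n_sims=20_000):
    """Reservations arrive in groups of 2-4; each group cancels with a
    size-dependent probability (illustrative assumption)."""
    sizes = rng.choice([2, 3, 4], size=(n_sims, booking_limit), p=[0.5, 0.3, 0.2])
    p_cancel = {2: 0.10, 3: 0.15, 4: 0.20}
    cancel = rng.random((n_sims, booking_limit)) < np.vectorize(p_cancel.get)(sizes)
    return (sizes * ~cancel).sum(axis=1)          # people who actually show up

def evaluate(booking_limit, service_threshold=0.05):
    shows = simulate_shows(booking_limit)
    cost = np.where(shows > CAPACITY, C_OVERSOLD * (shows - CAPACITY),
                    C_UNDERUSED * (CAPACITY - shows)).mean()     # economic-based criterion
    p_oversold = (shows > CAPACITY).mean()                        # service-based criterion
    return cost, p_oversold, p_oversold <= service_threshold

for limit in range(12, 20):
    print(limit, evaluate(limit))
```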

Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator

Procedia PDF Downloads 125
21490 An Effective Approach to Knowledge Capture in Whole Life Costing in Construction Projects

Authors: Ndibarafinia Young Tobin, Simon Burnett

Abstract:

Whole life costing is a valuable technique for comparing alternative building designs, allowing operational cost benefits to be evaluated against any initial cost increases, and it also forms part of procurement in the construction industry. In spite of these benefits, its adoption has been relatively slow due to the lack of tangible evidence, 'know-how' skills and knowledge of the practice, i.e. the lack of professionals in many establishments with knowledge and training on the use of the whole life costing technique. This situation is compounded by the absence of available data on whole life costing from relevant projects, the lack of data collection mechanisms, and so on. This has proved very challenging to those who showed some willingness to employ the technique in a construction project. The knowledge generated from a project can be considered as best practices learned on how to carry out tasks in a more efficient way, or as negative lessons which have led to losses and slowed down the progress and performance of the project. Knowledge management in whole life costing practice can enhance whole life costing analysis execution in a construction project, as lessons learned from one project can be carried on to future projects, resulting in continuous improvement and providing knowledge that can be used in the operation and maintenance phases of an asset's life span. Purpose: The purpose of this paper is to report an effective approach which can be utilised in capturing knowledge in whole life costing practice in a construction project. Design/methodology/approach: An extensive literature review was first conducted on the concepts of knowledge management and whole life costing. This was followed by a semi-structured interview to explore existing and good-practice knowledge management in whole life costing practice in a construction project. The data gathered from the semi-structured interview were analyzed using content analysis and used to structure an effective knowledge-capturing approach. Findings: The results of the study show that project review is the common method used for capturing knowledge and should be undertaken in an organized and accurate manner, with results presented in the form of instructions or in a checklist format, forming short and precise insights. The approach developed advises that irrespective of how effective the approach to knowledge capture is, the absence of an environment for sharing knowledge would render the approach ineffective. An open culture and resources are critical for providing a knowledge-sharing setting, and leadership has to sustain whole life costing knowledge capture, giving full support for its implementation. The knowledge-capturing approach has been evaluated by practitioners who are experts in the area of whole life costing practice. The results have indicated that the approach to knowledge capture is suitable and efficient.

Keywords: whole life costing, knowledge capture, project review, construction industry, knowledge management

Procedia PDF Downloads 256
21489 The Potential Threat of Cyberterrorism to the National Security: Theoretical Framework

Authors: Abdulrahman S. Alqahtani

Abstract:

The revolution in computing and networks could revolutionise terrorism in the same way that it has brought about changes in other aspects of life. The modern technological era has presented countries with a new set of security challenges. Many states and potential adversaries have the potential and capacity in cyberspace to carry out cyber-attacks in the future. Some of them are currently conducting surveillance, gathering and analysing technical information, and mapping the networks, nodes and infrastructure of opponents, which may be exploited in future conflicts. This poster presents the results of a quantitative study (survey) to test the validity of the proposed theoretical framework for cyber terrorist threats. This theoretical framework will help to develop an in-depth understanding of these new digital terrorist threats. It may also be a practical guide for managers and technicians in critical infrastructure, to understand and assess the threats they face. It might also be the foundation for building a national strategy to counter cyberterrorism. The paper first provides basic information about the data. To purify the data, reliability analysis and exploratory factor analysis, as well as confirmatory factor analysis (CFA), were performed. Then, Structural Equation Modelling (SEM) was utilised to test the final model of the theory and to assess the overall goodness-of-fit between the proposed model and the collected data set.
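
The data-purification steps mentioned (reliability analysis, factor analysis, then SEM) are typically run in packages such as SPSS/AMOS. Purely as an illustration of the first step, a minimal Cronbach's alpha computation in Python is shown below; the item data and scale name are hypothetical, not the survey's.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of the total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-item Likert scale measuring "perceived cyberterrorism threat"
rng = np.random.default_rng(1)
latent = rng.normal(size=200)
survey = pd.DataFrame({f"item_{i}": latent + rng.normal(scale=0.8, size=200)
                       for i in range(1, 6)})
print(round(cronbach_alpha(survey), 3))   # values above ~0.7 are usually taken as acceptable
```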

Keywords: cyberterrorism, critical infrastructure, national security, theoretical framework, terrorism

Procedia PDF Downloads 392
21488 An Approach to Manage and Evaluate Asset Performance

Authors: Mohammed Saif Al-Saidi, John P. T. Mo

Abstract:

Modern engineering assets are complex and very high in value. They are expected to function for years to come, with the ability to handle changes in technology and ageing-related modification. The ageing of an engineering asset and the continuous increase in the number of vendors and contractors force the asset operation management (or owner) to design an asset system which can capture these changes. Furthermore, accurate performance measurement and risk evaluation processes are highly needed. Therefore, this paper explores the nature of asset management system performance evaluation for an engineering asset based on System Support Engineering (SSE) principles. The research work explores the asset support system from a range of perspectives, interviewing managers from across a refinery organisation. The factors contributing to the complexity of an asset management system are described in a context which clusters them into several key areas. It is proposed that the SSE framework may then be used as a tool for the analysis and management of an asset. The paper concludes with a discussion of potential applications of the framework and opportunities for future research.

Keywords: asset management, performance, evaluation, modern engineering, System Support Engineering (SSE)

Procedia PDF Downloads 673
21487 The Computational Psycholinguistic Situational-Fuzzy Self-Controlled Brain and Mind System Under Uncertainty

Authors: Ben Khayut, Lina Fabri, Maya Avikhana

Abstract:

The models of modern Artificial Narrow Intelligence (ANI) cannot: a) function independently and continuously without human intelligence, which is used for retraining and reprogramming the ANI models, and b) think, understand, be conscious, cognize, or infer in a state of uncertainty and under changes in situations and environmental objects. To eliminate these shortcomings and build a new generation of Artificial Intelligence systems, the paper proposes a conception, model, and method of a Computational Psycholinguistic Cognitive Situational-Fuzzy Self-Controlled Brain and Mind System Under Uncertainty (CPCSFSCBMSUU), which uses a neural network as its computational memory, operates under uncertainty, and activates its functions through perception, identification of real objects, fuzzy situational control, and the forming of images of these objects, modeling their psychological, linguistic, cognitive, and neural values of properties and features, the meanings of which are identified, interpreted, generated, and formed, taking into account the identified subject area and using the data, information, knowledge, and images accumulated in the memory. The functioning of the CPCSFSCBMSUU is carried out by its subsystems for: fuzzy situational control of all processes; computational perception; identification of reactions and actions; psycholinguistic cognitive fuzzy logical inference; decision making; reasoning; systems thinking; planning; awareness; consciousness; cognition; intuition; wisdom; analysis and processing of psycholinguistic, subject, visual, signal, sound and other objects; accumulation and use of data, information and knowledge in the memory; and communication and interaction with other computing systems, robots and humans in order to solve joint tasks. To investigate the functional processes of the proposed system, the principles of situational control, fuzzy logic, psycholinguistics, informatics, and the modern possibilities of data science were applied. The proposed self-controlled Brain and Mind System is oriented towards use as a plug-in in multilingual subject applications.
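
The fuzzy situational control component can be illustrated, in a much-reduced form, by a classic Mamdani-style inference step: fuzzify an input, fire rules with min, aggregate with max, and defuzzify by centroid. The variables, membership functions, and rules below are invented for illustration and are not the CPCSFSCBMSUU design.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function on the universe x (a < b < c)."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

universe = np.linspace(0.0, 10.0, 101)                  # output universe: "response intensity"
low = trimf(universe, 0.0, 2.5, 5.0)
high = trimf(universe, 5.0, 7.5, 10.0)

def infer(uncertainty: float) -> float:
    """Two toy rules: IF uncertainty is LOW THEN response is LOW;
    IF uncertainty is HIGH THEN response is HIGH."""
    mu_low = trimf(np.array([uncertainty]), 0.0, 2.5, 5.0)[0]
    mu_high = trimf(np.array([uncertainty]), 5.0, 7.5, 10.0)[0]
    aggregated = np.maximum(np.minimum(mu_low, low), np.minimum(mu_high, high))
    return float((aggregated * universe).sum() / (aggregated.sum() + 1e-12))  # centroid

print(infer(2.0), infer(8.5))   # low uncertainty -> mild response, high -> strong response
```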

Keywords: computational brain, mind, psycholinguistic, system, under uncertainty

Procedia PDF Downloads 167
21486 Model-Based Software Regression Test Suite Reduction

Authors: Shiwei Deng, Yang Bao

Abstract:

In this paper, we present a model-based regression test suite reduction approach that uses EFSM model dependence analysis and a probability-driven greedy algorithm to reduce software regression test suites. The approach automatically identifies the difference between the original model and the modified model as a set of elementary model modifications. EFSM dependence analysis is performed for each elementary modification to reduce the regression test suite, and then the probability-driven greedy algorithm is adopted to select the minimum set of test cases from the reduced regression test suite that covers all interaction patterns. Our initial experience shows that the approach may significantly reduce the size of regression test suites.
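
A stripped-down sketch of the selection step only: given a candidate suite, the interaction patterns each test covers, and a selection probability per test, a probability-driven greedy loop repeatedly picks the test with the best probability-weighted coverage of the still-uncovered patterns. The test names, patterns, and probabilities are hypothetical, and the EFSM dependence analysis that would produce them is not shown.

```python
def probability_driven_greedy(coverage: dict[str, set[str]], prob: dict[str, float]) -> list[str]:
    """Select a small set of tests covering all interaction patterns,
    preferring tests with a high (probability * newly-covered-patterns) score."""
    uncovered = set().union(*coverage.values())
    selected = []
    while uncovered:
        best = max(coverage, key=lambda t: prob[t] * len(coverage[t] & uncovered))
        if not coverage[best] & uncovered:
            break                       # remaining patterns are uncoverable
        selected.append(best)
        uncovered -= coverage[best]
    return selected

# Hypothetical output of the EFSM dependence analysis
coverage = {"t1": {"p1", "p2"}, "t2": {"p2", "p3"}, "t3": {"p1", "p3", "p4"}, "t4": {"p4"}}
prob = {"t1": 0.9, "t2": 0.6, "t3": 0.8, "t4": 0.5}
print(probability_driven_greedy(coverage, prob))   # e.g. ['t3', 't1'] covers p1-p4
```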

Keywords: dependence analysis, EFSM model, greedy algorithm, regression test

Procedia PDF Downloads 419
21485 Energy Efficient Massive Data Dissemination Through Vehicle Mobility in Smart Cities

Authors: Salman Naseer

Abstract:

One of the main challenges of operating a smart city (SC) is collecting the massive data generated from multiple data sources (DS) and transmitting them to the control units (CU) for further data processing and analysis. These ever-increasing data demands not only require more and more capacity in the transmission channels but also result in resource over-provision to meet the resilience requirements, and thus in unavoidable waste because of the data fluctuations throughout the day. In addition, the high energy consumption (EC) and carbon discharges from these data transmissions pose serious issues to the environment we live in. Therefore, to overcome the issues of intensive EC and carbon emissions (CE) of massive data dissemination in smart cities, we propose an energy-efficient and carbon-reducing approach that utilizes the daily mobility of existing vehicles as an alternative communication channel to accommodate data dissemination in smart cities. To illustrate the effectiveness and efficiency of our approach, we take Auckland City in New Zealand as an example, assuming massive data generated by various sources geographically scattered throughout the Auckland region must reach control centres located in the city centre. The numerical results show that our proposed approach can provide up to 5 times lower delay when transferring a large volume of data by utilizing the existing daily mobility of vehicles than the conventional transmission network. Moreover, our proposed approach offers about 30% less EC and CE than the conventional network transmission approach.

Keywords: smart city, delay tolerant network, infrastructure offloading, opportunistic network, vehicular mobility, energy consumption, carbon emission

Procedia PDF Downloads 136
21484 A Framework for Auditing Multilevel Models Using Explainability Methods

Authors: Debarati Bhaumik, Diptish Dey

Abstract:

Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for the technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions & statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, for each of these three aspects. A traffic-light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems using multilevel binomial classification models at businesses. It will also benefit businesses deploying multilevel models to be future-proof and aligned with the European Commission's proposed Regulation on Artificial Intelligence.
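
The kind of check implied by the PoCE idea can be sketched as follows: fit a simple (single-level) logistic model, obtain SHAP values, and compare the SHAP-based feature ranking against the ranking implied by the model's own coefficients. This is only a schematic on synthetic data; it assumes the `shap` package and does not reproduce the multilevel setting or the paper's KPI definitions.

```python
import numpy as np
import shap
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
true_coefs = np.array([2.0, -1.0, 0.5, 0.0])
y = (X @ true_coefs + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)
explainer = shap.LinearExplainer(model, X)        # assumed shap usage for a linear model
shap_values = np.asarray(explainer.shap_values(X))

intrinsic_rank = np.argsort(-np.abs(model.coef_[0]))          # ranking from coefficients
shap_rank = np.argsort(-np.abs(shap_values).mean(axis=0))     # ranking from mean |SHAP|
agreement = (intrinsic_rank == shap_rank).mean()              # crude "correct explanation" rate
print(intrinsic_rank, shap_rank, agreement)
```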

Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics

Procedia PDF Downloads 87
21483 Cognitive Science Based Scheduling in Grid Environment

Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya

Abstract:

A grid is an infrastructure that allows distributed data of large size to be deployed from multiple locations to reach a common goal. Scheduling data-intensive applications becomes challenging as the data sets involved are very large. Only two solutions exist to tackle this challenging issue. First, the computation which requires huge data sets to be processed can be transferred to the data site. Second, the required data sets can be transferred to the computation site. In the former scenario, the computation cannot be transferred since the servers are storage/data servers with little or no computational capability. Hence, the second scenario can be considered for further exploration. During scheduling, transferring huge data sets from one site to another requires more network bandwidth. In order to mitigate this issue, this work focuses on incorporating cognitive science in scheduling. Cognitive science is the study of the human brain and its related activities. Current research is mainly focused on incorporating cognitive science in various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. Here, a cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents present in the CE help in analyzing the request and creating the knowledge base. Depending upon the link capacity, a decision is taken on whether to transfer the data sets or to partition them, as sketched below. The agents also predict the next request in order to serve the requesting site with data sets in advance. This reduces the data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist in the decision-making process.
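
The link-capacity decision can be caricatured in a few lines: estimate the transfer time of the full data set over the available bandwidth and, if it exceeds a tolerance, switch to partitioning. The threshold, sizes, and bandwidths are hypothetical; the agent-based knowledge base and request prediction are not modelled here.

```python
def plan_transfer(dataset_gb: float, link_gbps: float, max_seconds: float = 600.0,
                  partitions: int = 4) -> dict:
    """Decide whether to move the whole data set to the computation site or
    to partition it, based on the estimated transfer time over the link."""
    transfer_s = dataset_gb * 8.0 / link_gbps          # GB -> gigabits / (gigabits per second)
    if transfer_s <= max_seconds:
        return {"action": "transfer_whole", "estimated_seconds": transfer_s}
    return {"action": "partition", "parts": partitions,
            "estimated_seconds_per_part": transfer_s / partitions}

print(plan_transfer(dataset_gb=50, link_gbps=1.0))     # ~400 s: ship it whole
print(plan_transfer(dataset_gb=500, link_gbps=1.0))    # ~4000 s: partition instead
```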

Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence

Procedia PDF Downloads 390
21482 Isolation and Identification of Microorganisms from Marine-Associated Samples under Laboratory Conditions

Authors: Sameen Tariq, Saira Bano, Sayyada Ghufrana Nadeem

Abstract:

The ocean, which covers over 70% of the world's surface, is rich in biodiversity and is a rich source of microorganisms with huge potential. The marine environment is home to a broad range of plants, animals, and microorganisms. Marine microbial communities, which include bacteria, viruses, and other microorganisms, offer diverse benefits in biotechnological processes. Samples were collected from marine environments, including soil and water samples, to cultivate uncultured marine organisms using Zobell's medium, Sabouraud's dextrose agar, and casein media. Following isolation, we conducted microscopy and biochemical tests, including gelatin, starch, glucose, casein, catalase, and carbohydrate hydrolysis, for further identification. The results show gram-positive as well as gram-negative bacteria. The isolation process of marine organisms is essential for understanding their ecological roles, unraveling their biological secrets, and harnessing their potential for various applications. Marine organisms exhibit remarkable adaptations to thrive in the diverse and challenging marine environment, offering vast potential for scientific, medical, and industrial applications. The isolation process plays a crucial role in unlocking the secrets of marine organisms, understanding their biological functions, and harnessing their valuable properties. They offer a rich source of bioactive compounds with pharmaceutical potential, including antibiotics, anticancer agents, and novel therapeutics. This study is an attempt to explore the diversity and dynamics of marine microflora and their role in biofilm formation.

Keywords: marine microorganisms, ecosystem, fungi, biofilm, gram-positive, gram-negative

Procedia PDF Downloads 37
21481 A Case Study on English Camp in UNISSA: An Approach towards Interactive Learning Outside the Classroom

Authors: Liza Mariah Hj. Azahari

Abstract:

This paper will look at a case study on English Camp, an activity coordinated at the Sultan Sharif Ali Islamic University (UNISSA) in 2011. English Camp is a fun and motivation-filled activity which brings students and teachers together outside of the classroom setting into a more diverse environment. It also enables teachers and students to spend concentrated time together for a mutual purpose, which is to explore the language in a more dynamic and relaxed way. First of all, the study will look into the background of English Camp and how it has been introduced and implemented in different contexts. Thereafter, it will explain the objectives of the English Camp coordinated at our university, UNISSA, and what types of activities were conducted. It will then evaluate the effectiveness of the camp as to the extent to which it managed to meet its motto, which was to foster dynamic interactive learning of the English language. To conclude, the paper presents the potential for further research on the topic as well as a guideline for educators who wish to coordinate the activity. A proposal for collaboration in this activity is further highlighted and encouraged within the paper for future implementation and endeavour.

Keywords: English camp, UNISSA, interactive learning, outside

Procedia PDF Downloads 562
21480 First-Principles Investigation of the Structural and Electronic Properties of Mg1-xBixO

Authors: G. P. Abdel Rahim, M. María Guadalupe Moreno Armenta, Jairo Arbey Rodriguez

Abstract:

We investigated the structural and electronic properties of the compound Mg1-xBixO with bismuth concentrations x = 0, 1/4, 1/2, and 3/4 in the NaCl (rock-salt) and WZ (wurtzite) phases. The calculations were performed using the first-principles pseudo-potential method within the framework of spin density functional theory (DFT). Our calculations predict that for Bi concentrations greater than ~70%, the WZ structure is more favorable than the NaCl one, and that for x = 0 (pure MgO), x = 0.25, and x = 0.50 of Bi concentration the NaCl structure is more favorable than the WZ one. For x = 0.75 of Bi, a transition from wurtzite towards NaCl is possible when the pressure is about 22 GPa. It has also been observed that the crystal lattice constant closely follows Vegard's law and that the bulk modulus and the cohesion energy decrease with the concentration x of Bi.
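
Vegard's law, mentioned in the results, is simply a linear interpolation of the lattice constant between the end members, a(x) = (1 - x)*a(MgO) + x*a(BiO). A minimal sketch is shown below; the rock-salt MgO value is the commonly quoted experimental lattice constant, while the BiO end-member value is a placeholder, not a result from the paper.

```python
def vegard_lattice_constant(x_bi: float, a_mgo: float = 4.21, a_bio: float = 5.20) -> float:
    """Linear (Vegard) interpolation of the alloy lattice constant in angstroms.
    a_mgo is the commonly quoted rock-salt MgO value; a_bio is a placeholder."""
    return (1.0 - x_bi) * a_mgo + x_bi * a_bio

for x in (0.0, 0.25, 0.50, 0.75):
    print(x, round(vegard_lattice_constant(x), 3))
```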

Keywords: DFT, Mg1-xBixO, pseudo-potential, rock-salt, wurtzite

Procedia PDF Downloads 519
21479 A Review of the Major Factors of Cost Overrun in Construction Projects

Authors: Hassan Abdelgadir

Abstract:

Cost overruns have harmed the economies and reputations of several construction companies around the world. Many project management systems have been developed to keep track of a project's budget. However, due to several cost overrun difficulties in the construction industry, cost management is still deemed inadequate. As a result, the goal of this paper is to identify and group prospective construction project cost overrun reasons based on their origin groups. All potential cost overrun elements were rigorously checked through literature analysis before being divided into seven (7) groups of originating components, including project, contract, client, contractor, consultant, labour, and external. Each potential factor was completely defined with examples.

Keywords: construction projects, cost overruns, construction company, factors affecting cost overruns

Procedia PDF Downloads 65
21478 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence

Authors: Gus Calderon, Richard McCreight, Tammy Schwartz

Abstract:

Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high spatial resolution (5” ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community is divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners’ understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements which can be mitigated. Geospatial data from FireWatch’s defensible space maps was combined with Black Swan’s patented approach using 39 other risk characteristics into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education.
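
Spectral vegetation index maps of the kind described are typically derived band by band from the orthomosaic; an NDVI-style computation is sketched below as an illustration. The band arrays are synthetic and the choice of NDVI is an assumption; FireWatch's proprietary indices and classification algorithms are not reproduced here.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Synthetic 3x3 reflectance tiles standing in for multispectral orthomosaic bands
nir = np.array([[0.45, 0.50, 0.10], [0.42, 0.08, 0.47], [0.40, 0.44, 0.46]])
red = np.array([[0.10, 0.08, 0.09], [0.11, 0.07, 0.09], [0.12, 0.10, 0.08]])
index = ndvi(nir, red)
vegetated = index > 0.3        # crude threshold flagging potentially flammable vegetation
print(index.round(2), vegetated, sep="\n")
```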

Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk

Procedia PDF Downloads 101
21477 Some New Bounds for a Real Power of the Normalized Laplacian Eigenvalues

Authors: Ayşe Dilek Maden

Abstract:

For a given simple connected graph, we present some new bounds, via a new approach, for a special topological index given by the sum of a real-number power of the non-zero normalized Laplacian eigenvalues. This approach offers the advantage not only of deriving old and new bounds on this topic but also of giving an idea of how some previous results in similar areas can be developed.
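
The quantity bounded in the paper, the sum of a real power α of the non-zero normalized Laplacian eigenvalues, can be computed directly for small graphs; the sketch below does this with networkx/NumPy as a numerical companion to the bounds. The example graph and the values of α are arbitrary choices, not the paper's.

```python
import networkx as nx
import numpy as np

def normalized_laplacian_power_sum(G: nx.Graph, alpha: float) -> float:
    """Sum of lambda_i**alpha over the non-zero eigenvalues of the
    normalized Laplacian of a simple connected graph."""
    L = nx.normalized_laplacian_matrix(G).toarray()
    eigenvalues = np.linalg.eigvalsh(L)
    nonzero = eigenvalues[eigenvalues > 1e-10]
    return float(np.sum(nonzero ** alpha))

G = nx.petersen_graph()                     # arbitrary simple connected graph
for alpha in (-1.0, 0.5, 2.0):              # alpha = -1 relates to the degree Kirchhoff index
    print(alpha, round(normalized_laplacian_power_sum(G, alpha), 4))
```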

Keywords: degree Kirchhoff index, normalized Laplacian eigenvalue, spanning tree, simple connected graph

Procedia PDF Downloads 361
21476 Analyzing Political Cartoons in Arabic-Language Media after Trump's Jerusalem Move: A Multimodal Discourse Perspective

Authors: Inas Hussein

Abstract:

Communication in the modern world is increasingly becoming multimodal due to globalization and the digital space we live in, which have remarkably affected how people communicate. Accordingly, Multimodal Discourse Analysis (MDA) is an emerging paradigm in discourse studies with the underlying assumption that other semiotic resources such as images, colours, scientific symbolism, gestures, actions, music and sound, etc. combine with language in order to communicate meaning. Political cartoons are one of the effective multimodal media that combine verbal and non-verbal elements to create meaning. Furthermore, since political and social issues are mirrored in political cartoons, these are regarded as potential objects of discourse analysis since they not only reflect the thoughts of the public but also have the power to influence them. The aim of this paper is to analyze some selected cartoons on the recognition of Jerusalem as Israel's capital by the American President, Donald Trump, adopting a multimodal approach. More specifically, the present research examines how the various semiotic tools and resources utilized by the cartoonists function in projecting the intended meaning. Ten political cartoons, among a surge of editorial cartoons highlighted by the Anti-Defamation League (ADL) - an international Jewish non-governmental organization based in the United States - as publications in different Arabic-language newspapers in Egypt, Saudi Arabia, UAE, Oman, Iran and the UK, were purposively selected for semiotic analysis. These editorial cartoons, all published during 6th-18th December 2017, invariably suggest one theme: Jewish and Israeli domination of the United States. The data were analyzed using the framework of Visual Social Semiotics. In accordance with this methodological framework, the selected visual compositions were analyzed in terms of three aspects of meaning: representational, interactive and compositional. In analyzing the selected cartoons, an interpretative approach is adopted. This approach prioritizes depth over breadth and enables insightful analyses of the chosen cartoons. The findings of the study reveal that semiotic resources are key elements of political cartoons due to the inherent political communication they convey. It is shown that adequate interpretation of the three aspects of meaning is a prerequisite for understanding the intended meaning of political cartoons. It is recommended that further research be conducted to provide more insightful analyses of political cartoons from a multimodal perspective.

Keywords: Multimodal Discourse Analysis (MDA), multimodal text, political cartoons, visual modality

Procedia PDF Downloads 226
21475 A Systematic Review of Patient-Reported Outcomes and Return to Work after Surgical vs. Non-surgical Midshaft Humerus Fracture

Authors: Jamal Alasiri, Naif Hakeem, Saoud Almaslmani

Abstract:

Background: Patients with humeral shaft fractures have two different treatment options. Surgical therapy has lower risks of non-union, mal-union, and re-intervention than non-surgical therapy. These positive clinical outcomes of the surgical approach make it a preferable treatment option despite the risks of radial nerve palsy and additional surgery-related risk. We aimed to evaluate patient outcomes and return to work after surgical vs. non-surgical management of shaft humeral fracture. Methods: We searched databases, including PubMed, Medline, and the Cochrane Register of Controlled Trials, from 2010 to January 2022 for potential randomised controlled trials (RCTs) and cohort studies comparing patient-related outcome measures and return to work between surgical and non-surgical management of humerus fracture. Results: After carefully evaluating 1352 articles, we included three RCTs (232 patients) and one cohort study (39 patients). The surgical intervention used plate/nail fixation, while the non-surgical intervention used a splint or brace to manage shaft humeral fracture. The pooled DASH effects of all three RCTs at six months (MD: -7.5 [-13.20, -1.89], P = 0.009, I² = 44%) and 12 months (MD: -1.32 [-3.82, 1.17], P = 0.29, I² = 0%) were higher in patients treated surgically than non-surgically. The pooled Constant-Murley scores at six months (MD: 7.945 [2.77, 13.10], P = 0.003, I² = 0%) and 12 months (MD: 1.78 [-1.52, 5.09], P = 0.29, I² = 0%) were higher in patients who received non-surgical than surgical therapy. However, the pooled analysis of patients returning to work in both groups remained inconclusive. Conclusion: Altogether, we found no significant evidence supporting the clinical benefits of surgical over non-surgical therapy. Thus, the non-surgical approach remains the preferred therapeutic choice for managing shaft humeral fractures due to its fewer side effects.
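
Pooled mean differences of the kind quoted above come from standard inverse-variance meta-analysis; a minimal fixed-effect version of that pooling is sketched below on made-up study data. The numbers are not those of the included trials, and the review's actual random-effects and heterogeneity handling is not reproduced.

```python
import numpy as np

def fixed_effect_pool(mean_diffs, ci_lows, ci_highs):
    """Inverse-variance fixed-effect pooling of mean differences.
    Study standard errors are recovered from the 95% CIs (width = 2 * 1.96 * SE)."""
    md = np.asarray(mean_diffs, dtype=float)
    se = (np.asarray(ci_highs, dtype=float) - np.asarray(ci_lows, dtype=float)) / (2 * 1.96)
    w = 1.0 / se**2
    pooled = np.sum(w * md) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

# Hypothetical DASH mean differences from three trials (not the review's data)
pooled_md, pooled_ci = fixed_effect_pool([-8.0, -6.5, -7.9],
                                         [-14.0, -12.0, -13.5],
                                         [-2.0, -1.0, -2.3])
print(round(pooled_md, 2), [round(v, 2) for v in pooled_ci])
```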

Keywords: shaft humeral fracture, surgical treatment, Patient-related outcomes, return to work, DASH

Procedia PDF Downloads 93
21474 Park’s Vector Approach to Detect an Inter Turn Stator Fault in a Doubly Fed Induction Machine by a Neural Network

Authors: Amel Ourici

Abstract:

An electrical machine failure that is not identified in an initial stage may become catastrophic, and the machine may suffer severe damage. Thus, undetected machine faults may cascade into machine failure, which in turn may cause production shutdowns. Such shutdowns are costly in terms of lost production time, maintenance costs, and wasted raw materials. Doubly fed induction generators are used mainly for wind energy conversion in MW power plants. This paper presents the detection of an inter-turn stator fault in a doubly fed induction machine whose stator and rotor are supplied by two pulse width modulation (PWM) inverters. The method used in this article to detect this fault is based on the Park's vector approach, using a neural network.
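
The Park's vector approach maps the three stator phase currents onto a two-dimensional (d, q) vector; in a healthy machine the vector traces a near-circular locus, and an inter-turn fault distorts it. A minimal sketch of that transform on synthetic currents is given below; the 10% phase imbalance is purely illustrative, and the paper's fault model and neural-network classification stage are not shown.

```python
import numpy as np

def park_vector(ia: np.ndarray, ib: np.ndarray, ic: np.ndarray):
    """Classic Park's vector components used in stator-fault monitoring:
    id = sqrt(2/3)*ia - (ib + ic)/sqrt(6), iq = (ib - ic)/sqrt(2)."""
    i_d = np.sqrt(2 / 3) * ia - (ib + ic) / np.sqrt(6)
    i_q = (ib - ic) / np.sqrt(2)
    return i_d, i_q

t = np.linspace(0, 0.04, 1000)                      # two 50 Hz periods
w = 2 * np.pi * 50
ia_healthy = np.cos(w * t)
ia_faulty = 1.10 * np.cos(w * t)                    # illustrative 10% imbalance in phase A
ib, ic = np.cos(w * t - 2 * np.pi / 3), np.cos(w * t + 2 * np.pi / 3)

for label, ia in (("healthy", ia_healthy), ("inter-turn fault", ia_faulty)):
    i_d, i_q = park_vector(ia, ib, ic)
    radius = np.hypot(i_d, i_q)
    print(label, round(radius.min(), 3), round(radius.max(), 3))  # fault -> non-constant radius
```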

Keywords: doubly fed induction machine, PWM inverter, inter turn stator fault, Park’s vector approach, neural network

Procedia PDF Downloads 600
21473 Biological Activity of Bilberry Pomace

Authors: Gordana S. Ćetković, Vesna T. Tumbas Šaponjac, Sonja M. Djilas, Jasna M. Čanadanović-Brunet, Sladjana M. Stajčić, Jelena J. Vulić

Abstract:

Bilberry is one of the most important dietary sources of phenolic compounds, including anthocyanins, phenolic acids, flavonol glycosides and flavan-3-ols. These phytochemicals have different biological activities and therefore may improve our health condition. Also, anthocyanins are interesting to the food industry as colourants. In the present study, bilberry pomace, a by-product of juice processing, was used as a potential source of bioactive compounds. The contents of total phenolic acids, flavonoids and anthocyanins in bilberry pomace were determined by HPLC/UV-Vis. The biological activities of bilberry pomace were evaluated by reducing power (RP) and α-glucosidase inhibitory potential (α-GIP), and expressed as the RP0.5 value (the effective concentration of bilberry pomace extract assigned at 0.5 value of absorption) and the IC50 value (the concentration of bilberry pomace extract necessary to inhibit 50% of α-glucosidase enzyme activity). Total phenolic acids content was 807.12 ± 25.16 mg/100 g pomace, flavonoids 54.36 ± 1.83 mg/100 g pomace and anthocyanins 3426.18 ± 112.09 mg/100 g pomace. The RP0.5 value of bilberry pomace was 0.38 ± 0.02 mg/ml, while the IC50 value was 1.82 ± 0.11 mg/ml. These results have revealed the potential for valorization of bilberry juice production by-products for further industrial use as a rich source of bioactive compounds and natural colourants (mainly anthocyanins).

Keywords: bilberry pomace, phenolics, antioxidant activity, reducing power, α-glucosidase enzyme activity

Procedia PDF Downloads 592
21472 Waste Minimization through Vermicompost: An Alternative Approach

Authors: Mary Fabiola

Abstract:

Vermicompost is the product or process of composting using various worms. Large-scale vermicomposting is practiced in Canada, Italy, Japan, Malaysia, the Philippines, and the United States. The vermicompost may be used for farming, landscaping, and creating compost tea, or for sale. Some of these operations produce worms for bait and/or home vermicomposting. As a processing system, the vermicomposting of organic waste is very simple. Worms ingest the waste material, break it up in their rudimentary gizzards, consume the digestible/putrefiable portion, and then excrete a stable, humus-like material that can be immediately marketed. Vermitechnology can be a promising technique that has shown its potential in certain challenging areas like the augmentation of food production, waste recycling, and the management of solid wastes. In India, on one side pollution is increasing due to the accumulation of organic wastes, while on the other side there is a shortage of organic manure, which could increase the fertility and productivity of the land and produce nutritive and safe food. So, there is no doubt that the scope for vermicomposting is enormous.

Keywords: pollution, solid wastes, vermicompost, waste recycling

Procedia PDF Downloads 425
21471 Defect-Based Urgency Index for Bridge Maintenance Ranking and Prioritization

Authors: Saleh Abu Dabous, Khaled Hamad, Rami Al-Ruzouq

Abstract:

Bridge condition assessment and rating provide essential information needed for bridge management. This paper reviews bridge inspection and condition rating practices and introduces a defect-based urgency index. The index is estimated at the element-level based on the extent and severity of the different defects typical to the bridge element. The urgency index approach has the following advantages: (1) It facilitates judgment submission, i.e. instead of rating the bridge element with a specific linguistic overall expression (which can be subjective and used differently by different people), the approach is based on assessing the defects; (2) It captures multiple defects that can be present within a deteriorated element; and (3) It reflects how critical the element is through quantifying critical defects and their severity. The approach can be further developed and validated. It is expected to be useful for practical purposes as an early-warning system for critical bridge elements.
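
The abstract does not give the index formula, so purely as an illustration of the idea (quantifying the extent and severity of multiple defects in one element-level number), one plausible weighted aggregation is sketched below. The defect types, weights, and scales are hypothetical and should not be read as the authors' model.

```python
def urgency_index(defects: list[dict], severity_scale: int = 4, extent_scale: float = 1.0) -> float:
    """Aggregate multiple defects of a bridge element into one 0-1 urgency score.
    Each defect carries a severity (1..severity_scale), an extent (0..extent_scale,
    fraction of the element affected) and a criticality weight (0..1)."""
    raw = sum(d["weight"] * (d["severity"] / severity_scale) * (d["extent"] / extent_scale)
              for d in defects)
    total_weight = sum(d["weight"] for d in defects) or 1.0
    return raw / total_weight

# Hypothetical deck-element inspection record
deck_defects = [
    {"name": "cracking",      "severity": 3, "extent": 0.40, "weight": 0.8},
    {"name": "spalling",      "severity": 4, "extent": 0.10, "weight": 1.0},
    {"name": "efflorescence", "severity": 2, "extent": 0.25, "weight": 0.4},
]
print(round(urgency_index(deck_defects), 3))    # higher values flag more urgent elements
```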

Keywords: condition rating, deterioration, inspection, maintenance

Procedia PDF Downloads 440
21470 Specialized Translation Teaching Strategies: A Corpus-Based Approach

Authors: Yingying Ding

Abstract:

This study presents a methodology for specialized translation with the objective of helping teachers improve their strategies for teaching translation. In order for students to acquire the skills to translate specialized texts, they need to become familiar with the semantic and syntactic features of source texts and target texts. The aim of our study is to use a corpus-based approach in the teaching of specialized translation between Chinese and Italian. This study proposes to construct a specialized Chinese-Italian comparable corpus that consists of 50 economic contracts from the domain of food. With the help of AntConc, we propose to compile a comparable corpus for translation teaching purposes. This paper attempts to provide insight into how teachers could benefit from a comparable corpus in the teaching of specialized translation from Italian into Chinese and, through some examples of passive sentences, how students could learn to apply different strategies for translating voice appropriately.
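
The AntConc-style concordancing described here can be approximated for classroom demonstrations with a few lines of Python: the sketch below builds a keyword-in-context (KWIC) view for a passive marker. The sentences and the search term are invented examples, not the contract corpus.

```python
import re

def kwic(texts: list[str], keyword: str, window: int = 4) -> list[str]:
    """Keyword-in-context lines: `window` tokens either side of each keyword hit."""
    lines = []
    for text in texts:
        tokens = re.findall(r"\w+|\S", text.lower())
        for i, tok in enumerate(tokens):
            if tok == keyword.lower():
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                lines.append(f"{left:>35} [{tok}] {right}")
    return lines

# Invented Italian contract sentences illustrating a passive construction with "viene"
corpus = [
    "Il pagamento viene effettuato entro trenta giorni dalla consegna.",
    "La merce viene consegnata presso la sede dell'acquirente.",
]
print("\n".join(kwic(corpus, "viene")))
```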

Keywords: contrastive studies, specialised translation, corpus-based approach, teaching

Procedia PDF Downloads 361
21469 Intensifying Approach for Separation of Bio-Butanol Using Ionic Liquid as Green Solvent: Moving Towards Sustainable Biorefinery

Authors: Kailas L. Wasewar

Abstract:

Biobutanol has been considered a potential alternative biofuel relative to the more popular biodiesel and bioethanol. End-product toxicity is one of the major problems in the commercialization of fermentation-based processes, and it can be reduced to some extent by removing biobutanol simultaneously. Several techniques have been investigated for removing butanol from fermentation broth, such as stripping, adsorption, liquid-liquid extraction, pervaporation, and membrane solvent extraction. Liquid-liquid extraction can be performed with high selectivity and can be carried out inside the fermenter. Conventional solvents have a few drawbacks, including toxicity, loss of solvent, high cost, etc. Hence, alternative solvents must be explored. Room temperature ionic liquids (RTILs), composed entirely of ions, are liquid at room temperature and have negligible vapor pressure, non-flammability, and tunable physicochemical properties for a particular application, which makes them "designer solvents". Ionic liquids (ILs) have recently gained much attention as alternatives to organic solvents in many processes. In particular, ILs have been used as alternative solvents for liquid-liquid extraction. Their negligible vapor pressure allows the extracted products to be separated from ILs by conventional low-pressure distillation, with the potential for saving energy. Morpholinium, imidazolium, ammonium, and phosphonium based ionic liquids have been employed for the separation of biobutanol. In the present chapter, basic concepts of ionic liquids and their application in separation are presented. Further, types of ionic liquids, including conventional, functionalized, polymeric, supported-membrane, and other ionic liquids, are explored. The effects of various performance parameters on the separation of biobutanol by ionic liquids are also discussed and compared for different cation- and anion-based ionic liquids. A typical methodology for the investigation has been adopted: contacting equal amounts of biobutanol and ionic liquid for a specific time, say 30 minutes, to confirm equilibrium. The biobutanol phase is then analyzed using GC to determine the concentration of biobutanol, and a material balance is used to find the concentration in the ionic liquid.
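
The equal-volume contacting experiments described lend themselves to a short mass-balance illustration: once the equilibrium butanol concentration in the aqueous phase is measured by GC, the concentration in the ionic-liquid phase follows by difference, giving a distribution coefficient and extraction efficiency. The numbers below are hypothetical, not measured values from the chapter.

```python
def extraction_metrics(c_aq_initial: float, c_aq_equilibrium: float,
                       v_aq_ml: float = 10.0, v_il_ml: float = 10.0) -> dict:
    """Mass balance for a single equal-volume liquid-liquid extraction stage.
    Concentrations in g/L; the butanol lost from the aqueous phase is assumed
    to reside entirely in the ionic-liquid phase."""
    butanol_extracted_g = (c_aq_initial - c_aq_equilibrium) * v_aq_ml / 1000.0
    c_il = butanol_extracted_g / (v_il_ml / 1000.0)
    return {
        "distribution_coefficient": c_il / c_aq_equilibrium,
        "extraction_efficiency_%": 100.0 * (c_aq_initial - c_aq_equilibrium) / c_aq_initial,
    }

# Hypothetical run: 20 g/L biobutanol broth contacted for 30 min with an equal volume of IL
print(extraction_metrics(c_aq_initial=20.0, c_aq_equilibrium=6.0))
```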

Keywords: biobutanol, separation, ionic liquids, sustainability, biorefinery, waste biomass

Procedia PDF Downloads 79
21468 Validating Texture Analysis as a Tool for Determining Bioplastic (Bio)Degradation

Authors: Sally J. Price, Greg F. Walker, Weiyi Liu, Craig R. Bunt

Abstract:

Plastics, due to their long lifespan, are becoming more of an environmental concern once their useful life has been completed. There is a vast array of different types of plastic; they can be found in almost every ecosystem on earth and are of particular concern in terrestrial environments, where they can become incorporated into the food chain. Hence, bioplastics have become of more interest to manufacturers and the public recently, as they have the ability to (bio)degrade in commercial and in-home composting situations. However, tools with which to quantify how they degrade in response to environmental variables are still being developed. One such approach is texture analysis: a TA.XT Texture Analyser (Stable Microsystems) was used to determine the force required to break or punch holes in standard ASTM D638 Type IV 3D-printed bioplastic "dogbones" of different thicknesses. Manufacturers' recommendations for calibrating the Texture Analyser are one approach to standardising results; however, an independent technique using dummy dogbones and a substitute for the bioplastic was used alongside the samples. This approach was unexpectedly more valuable than realised at the start of the trial, as irregular results were later discovered with the substitute material before valuable samples collected from the field were lost due to a possible machine malfunction. This work will show the value of having an independent approach to machine calibration for accurate sample analysis with a Texture Analyser when analysing bioplastic samples.

Keywords: bioplastic, degradation, environment, texture analyzer

Procedia PDF Downloads 196
21467 Medicompills Architecture: A Mathematically Precise Tool to Reduce the Risk of Diagnosis Errors in Precise Medicine

Authors: Adriana Haulica

Abstract:

Powered by machine learning, Precise medicine is now tailored to use genetic and molecular profiling, with the aim of optimizing the therapeutic benefits for cohorts of patients. As the majority of machine learning algorithms come from heuristics, the outputs have contextual validity. This is not very restrictive in the sense that medicine itself is not an exact science. Meanwhile, the progress made in molecular biology, bioinformatics, computational biology, and Precise medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatments, by processing as much as possible of the available information. The purpose of this paper is to present a deeper vision for the future of artificial intelligence in Precise medicine. In fact, current machine learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules. The loss of information arising from the classical methods prevents obtaining 100% evidence in the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept and tool for information processing in Precise medicine that delivers diagnosis and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine in a direct or indirect manner, but also technical databases, natural language processing algorithms, and strong class optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new and tailored for omics and clinical data. Firstly, the intrinsic biological intuition is different from the well-known "needle in a haystack" approach usually used when machine learning algorithms have to process differential genomic or molecular data to find biomarkers. Also, even if the input is seized from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool. This approach deciphers the biological meaning of input data down to the metabolic and physiologic mechanisms, based on a compiler with grammars issued from bio-algebra-inspired mathematics. It translates input data into bio-semantic units with the help of contextual information, iteratively, until bio-logical operations can be performed on the basis of the "common denominator" rule. The rigorousness of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical "proofs". The major impact of this architecture is expressed by the high accuracy of the diagnosis. Delivered as a multiple-condition diagnostic, constituted by some main diseases along with unhealthy biological states, this format is highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture is highly beneficial for the healthcare industry. The expectation is to generate a strategic trend in Precise medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures. It will also contribute to better design of clinical trials and speed them up.

Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics

Procedia PDF Downloads 63
21466 Application of Data Driven Based Models as Early Warning Tools of High Stream Flow Events and Floods

Authors: Mohammed Seyam, Faridah Othman, Ahmed El-Shafie

Abstract:

The early warning of high stream flow events (HSF) and floods is an important aspect of the management of surface water and river systems. This process can be performed using either process-based models or data-driven models such as artificial intelligence (AI) techniques. The main goal of this study is to develop an efficient AI-based model for predicting the real-time hourly stream flow (Q) and to apply it as an early warning tool for HSF and floods in the downstream area of the Selangor River basin, taken here as a paradigm of humid tropical rivers in Southeast Asia. The performance of the AI-based models has been improved through the integration of lag time (Lt) estimation in the modelling process. A total of 8753 patterns of hourly Q, water level, and rainfall (RF) records representing a one-year period (2011) were utilized in the modelling process. Six hydrological scenarios were arranged through hypothetical cases of input variables to investigate how changes in RF intensity at upstream stations can lead to the formation of floods. The initial stream flow was changed for each scenario in order to include a wide range of hydrological situations in this study. The performance evaluation of the developed AI-based model shows that a high correlation coefficient (R) between the observed and predicted Q is achieved. The AI-based model has been successfully employed in early warning through the advance detection of hydrological conditions that could lead to the formation of floods and HSF, represented by three levels of severity (i.e., alert, warning, and danger). Based on the results of the scenarios, reaching the danger level in the downstream area required high RF intensity in at least two upstream areas. According to the results of the applications, it can be concluded that AI-based models are beneficial tools for local authorities for flood control and awareness.
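
The three-level severity scheme (alert, warning, danger) reduces to simple thresholding of the predicted hourly stream flow; a compact sketch is given below. The threshold values are placeholders rather than the Selangor River figures, and the AI model producing the predicted Q is not shown.

```python
def flow_severity(predicted_q_m3s: float,
                  alert: float = 150.0, warning: float = 250.0, danger: float = 350.0) -> str:
    """Map a predicted hourly stream flow (m3/s) to an early-warning level.
    Threshold values here are placeholders, not the basin's flood-stage figures."""
    if predicted_q_m3s >= danger:
        return "danger"
    if predicted_q_m3s >= warning:
        return "warning"
    if predicted_q_m3s >= alert:
        return "alert"
    return "normal"

for q in (90.0, 180.0, 300.0, 420.0):
    print(q, flow_severity(q))
```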

Keywords: floods, stream flow, hydrological modelling, hydrology, artificial intelligence

Procedia PDF Downloads 243
21465 Invitro Study of Anti-Leishmanial Property of Nigella Sativa Methanalic Black Seed Extract

Authors: Tawqeer Ali Syed, Prakash Chandra

Abstract:

This study aims to evaluate the antileishmanial activity of Nigella sativa black seed extract. This well-known plant extract was obtained from the botanical garden of Kashmir. Materials and Methods: The methanolic extract of the plant was screened for antileishmanial activity against Leishmania major using the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) assay. Results: The methanolic extract of Nigella sativa showed potential antileishmanial activity with an inhibition value of 80.29% ± 0.65%. The IC50 after 48 hours was calculated to be 964.3 µg/ml. Conclusion: Considering these results, this medicinal plant from Kashmir could serve as a potential drug source for antileishmanial compounds.
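
The inhibition percentage reported from an MTT assay is conventionally derived from the optical densities of treated and untreated wells; a minimal sketch of that calculation, with a crude log-linear IC50 interpolation, is shown below. The OD readings and concentrations are invented, not the study's data.

```python
import numpy as np

def percent_inhibition(od_treated: np.ndarray, od_control: float) -> np.ndarray:
    """MTT assay inhibition (%) = (1 - OD_treated / OD_control) * 100."""
    return (1.0 - np.asarray(od_treated, dtype=float) / od_control) * 100.0

def ic50(concentrations_ug_ml: np.ndarray, inhibition_pct: np.ndarray) -> float:
    """Crude IC50 estimate by interpolating inhibition against log10(concentration);
    assumes inhibition increases monotonically with concentration."""
    logc = np.log10(concentrations_ug_ml)
    return float(10 ** np.interp(50.0, inhibition_pct, logc))

conc = np.array([125.0, 250.0, 500.0, 1000.0, 2000.0])      # hypothetical dilutions
od = np.array([0.92, 0.80, 0.62, 0.45, 0.28])               # hypothetical absorbances
inhibition = percent_inhibition(od, od_control=1.05)
print(inhibition.round(1), round(ic50(conc, inhibition), 1))
```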

Keywords: MTT assay, antileishmanial, cell viability, Nigella sativa

Procedia PDF Downloads 202