Search results for: precision irrigation technologies
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4882

1072 Hole Characteristics of Percussion and Single Pulse Laser-Incised Radiata Pine and the Effects of Wood Anatomy on Laser-Incision

Authors: Subhasisa Nath, David Waugh, Graham Ormondroyd, Morwenna Spear, Andy Pitman, Paul Mason

Abstract:

Wood is one of the most sustainable and environmentally favourable materials and is chemically treated in timber industries to maximise durability. To increase chemical preservative uptake and retention by the wood, incision technologies are commonly used, although current techniques are limiting. This work reports the effects of single-pulse CO2 laser-incision and frequency-tripled Nd:YAG percussion laser-incision on the characteristics of laser-incised holes in Radiata Pine. The laser-incision studies were based on changing laser wavelengths, energies and focal planes to identify an optimised combination for the laser-incision of Radiata Pine. The laser pulse duration had a dominant effect over laser power in controlling hole aspect ratio in CO2 laser-incision. A maximum depth of ~30 mm was measured with a laser power output of 170 W and a pulse duration of 80 ms. However, increased laser power led to increased carbonisation of holes. The carbonisation effect was reduced during laser-incision in the ultraviolet (UV) regime. Deposition of a foamy phase on the laser-incised hole wall was evident irrespective of laser radiation wavelength and energy. A maximum hole depth of ~20 mm was measured in percussion laser-incision in the UV regime (355 nm) with a pulse energy of 320 mJ. The radial and tangential faces had a significant effect on laser-incision efficiency for all laser wavelengths. The laser-incised hole shapes and circularities were affected by the wood anatomy (earlywood and latewood in the structure). Subsequently, the mechanism of laser-incision is proposed by analysing the internal structure of the laser-incised holes.

Keywords: CO2 laser, Nd:YAG laser, incision, drilling, wood, hole characteristics

Procedia PDF Downloads 229
1071 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

The abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, “reskilling and upskilling” employees, and establishing robust data management training programs are an essential and integral part of this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. In the present study, by embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.

Keywords: master data management, IoT, AI & ML, cloud computing, data optimization

Procedia PDF Downloads 61
1070 Electrocatalysts for Lithium-Sulfur Energy Storage Systems

Authors: Mirko Ante, Şeniz Sörgel, Andreas Bund

Abstract:

Li-S (lithium-sulfur) battery systems provide very high gravimetric energy density (2600 Wh/kg) and volumetric energy density (2800 Wh/l). Hence, Li-S batteries are one of the key technologies for both the upcoming electromobility and stationary applications. Furthermore, the Li-S battery system is potentially cheap and environmentally benign. However, its technical implementation suffers from limited cycling stability, low charge and discharge rates, and an incomplete understanding of the complex polysulfide reaction mechanism. The aim of this work is to develop an effective electrocatalyst for the polysulfide reactions so that the electrode kinetics of the sulfur half-cell are improved. Accordingly, the overvoltage will be decreased, and the efficiency of the cell will be increased. An enhanced electroactive surface additionally improves the charge and discharge rates. To reach this goal, functionalized electrocatalytic coatings are investigated to accelerate the kinetics of the polysulfide reactions. In order to determine a suitable electrocatalyst, apparent exchange current densities of a variety of materials (Ni, Co, Pt, Cr, Al, Cu, ITO, stainless steel) have been evaluated in a polysulfide-containing electrolyte by potentiodynamic measurements and a Butler-Volmer fit including diffusion limitation. The samples were examined by scanning electron microscopy (SEM) after the potentiodynamic measurements. So far, our work shows that cobalt is a promising material with good electrocatalytic properties for the polysulfide reactions and good chemical stability in the system. Furthermore, electrodeposition from a modified Watts nickel electrolyte with a sulfur source seems to provide an autocatalytic effect, but the electrocatalytic behavior decreases after several cycles of the current-potential curve.
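
To make the Butler-Volmer fit mentioned above concrete, the following is a minimal sketch, not the authors' code, of extracting an apparent exchange current density and transfer coefficient from a polarization curve; the data are synthetic placeholders and the diffusion-limitation term used in the study is omitted for brevity.

```python
# Illustrative sketch (not the authors' code): fitting an apparent exchange
# current density i0 from a polarization curve with the Butler-Volmer equation.
# All numbers below are made-up placeholders, not data from the study.
import numpy as np
from scipy.optimize import curve_fit

F, R, T = 96485.0, 8.314, 298.15  # Faraday constant, gas constant, temperature (K)

def butler_volmer(eta, i0, alpha):
    """Kinetic current density (A/cm^2) vs. overpotential eta (V)."""
    return i0 * (np.exp(alpha * F * eta / (R * T))
                 - np.exp(-(1.0 - alpha) * F * eta / (R * T)))

# Synthetic "measured" polarization data standing in for a potentiodynamic scan
eta = np.linspace(-0.05, 0.05, 41)
i_meas = butler_volmer(eta, i0=2e-4, alpha=0.5) + np.random.normal(0, 1e-5, eta.size)

(i0_fit, alpha_fit), _ = curve_fit(butler_volmer, eta, i_meas, p0=[1e-4, 0.5])
print(f"apparent i0 = {i0_fit:.2e} A/cm^2, alpha = {alpha_fit:.2f}")
```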

Keywords: electrocatalyst, energy storage, lithium sulfur battery, sulfur electrode materials

Procedia PDF Downloads 360
1069 The Applications of Toyota Production System to Reduce Wastes in Agricultural Products Packing Process: A Study of Onion Packing Plant

Authors: P. Larpsomboonchai

Abstract:

Agro-industry is one of the major industries that has strong impacts on national economic income, growth, stability, and sustainable development. Moreover, this industry also has strong influences on social, cultural and political issues. Furthermore, this industry, which produces primary and secondary products, is facing challenges from diverse factors such as demand inconsistency, intense international competition, technological advancements and new competitors. In order to maintain and improve the industry's competitiveness in both domestic and international markets, science and technology are key factors. Besides hard sciences and technologies, modern industrial engineering concepts such as Just in Time (JIT), Total Quality Management (TQM), Quick Response (QR), Supply Chain Management (SCM) and Lean can be very effective in supporting increased efficiency and effectiveness of these agricultural products on the world stage. Onion is one of Thailand's major export products and a source of national income, but it also faces challenges in many ways. This paper focuses on the onion packing process and its related activities, such as storage and shipment, at one of the major packing plants and storage facilities in Mae Wang District, Chiang Mai, Thailand, applying Toyota Production System (TPS) or Lean concepts to improve process capability throughout the entire packing and distribution process, which will be profitable for the whole onion supply chain. It will also be beneficial to other related agricultural products in Thailand and other ASEAN countries.

Keywords: packing process, Toyota Production System (TPS), lean concepts, waste reduction, lean in agro-industries activities

Procedia PDF Downloads 265
1068 Barriers and Enablers to Public Innovation in the Central Region of Colombia: A Characterization from Measurement through the Item Response Methodology and Comparative Analysis

Authors: Yessenia Parrado, Ana Barbosa, Daniela Mahe, Sebastian Toro, Jhon Garcia

Abstract:

The purpose of this work is to present the identification and characterization of the barriers and enablers to public innovation in the Central Region of Colombia, based on a mixed methodology applied in research carried out in 2020 by the Laboratory of Innovation, Creativity and New Technologies of the National University of Colombia in alliance with the National Planning Department. Based on this research, an index of barriers to regional and departmental public innovation was built, which reflects the level of difficulty the territorial entities face in overcoming the barriers present around three dimensions: organizational structure of the entity, generation of public value, and governance processes. The index was built using the item response methodology and multiple correspondence analysis, from the application of an institutional information form for public entities and a perception form for public servants. This investigation had the participation of 36 entities and 1,038 public servants from the departments of Huila, Meta, Boyacá, Cundinamarca, Tolima, and the Capital District. In this exercise, it was identified that the departmental indices range between 13 and 44 and that the regional index was 30 out of 100. From the analysis of the information, it was possible to establish that the main barriers are the lack of specialized agencies for public innovation exercises, a lack of qualified personnel and work methodologies for public innovation, inadequate information management, a lack of feedback on lessons learned between governmental and non-governmental entities, the inability of initiatives to generate binding participation mechanisms, and the lack of qualification of citizens to participate in these processes.
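
As a rough illustration of how an item-response-based index on a 0-100 scale can be constructed, the sketch below fits a simple Rasch model by joint maximum likelihood on synthetic survey data; the item set, scoring direction, and scaling are assumptions for illustration, not the study's actual procedure.

```python
# Illustrative sketch of an item-response (Rasch) style index, rescaled to 0-100.
# The survey items and responses here are synthetic placeholders, not the study data.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic sigmoid

rng = np.random.default_rng(0)
n_entities, n_items = 36, 12
X = rng.integers(0, 2, size=(n_entities, n_items))  # 1 = barrier reported on that item

def neg_log_lik(params):
    theta, b = params[:n_entities], params[n_entities:]
    p = np.clip(expit(theta[:, None] - b[None, :]), 1e-9, 1 - 1e-9)
    return -np.sum(X * np.log(p) + (1 - X) * np.log(1 - p))

res = minimize(neg_log_lik, np.zeros(n_entities + n_items), method="L-BFGS-B")
theta = res.x[:n_entities]                                   # latent barrier level per entity
index = 100 * (theta - theta.min()) / (theta.max() - theta.min())  # 0-100 scale
print(np.round(index, 1))
```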

Keywords: item response, public innovation, quantitative analysis, comparative analysis

Procedia PDF Downloads 115
1067 Composite Coatings of Piezoelectric Quartz Sensors Based on Viscous Sorbents and Casein Micelles

Authors: Shuba Anastasiia, Kuchmenko Tatiana, Umarkhanov Ruslan

Abstract:

The development of new sensitive coatings for sensors is one of the key directions in the development of sensor technologies. Recently, there has been a trend towards the creation of multicomponent coatings for sensors, which make it possible to increase the sensitivity and specificity and to improve the performance properties of sensors. When analyzing samples with a complex matrix of biological origin, the inclusion of micelles of bioactive substances (amino and nucleic acids, peptides, proteins) in the composition of the sensor coating can also increase the useful analytical information. The purpose of this work is to evaluate the analytical characteristics of composite coatings of piezoelectric quartz sensors based on medium-molecular viscous sorbents with incorporated micellar casein concentrate during the sorption of vapors of volatile organic compounds (VOCs). The sorption properties of the coatings were studied by piezoelectric quartz microbalance. Macromolecular compounds (dicyclohexyl-18-crown-6, Triton X-100, lanolin, micellar casein concentrate) were used as sorbents. Highly volatile organic compounds of various classes (alcohols, acids, aldehydes, esters) and water were selected as test substances. It has been established that composite coatings of sensors with the inclusion of micellar casein are more stable and more selective towards vapors of highly volatile compounds than towards water vapor. The method and technique of forming a composite coating using molecular viscous sorbents do not affect the kinetic features of VOC sorption. When casein micelles are used, the kinetic features of sorption depend on the matrix of the coating.
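
For context, mass uptake in piezoelectric quartz microbalance measurements is commonly related to the resonance frequency shift through the Sauerbrey equation; the sketch below applies that standard relation with assumed crystal parameters and an assumed frequency shift, and is not taken from the abstract.

```python
# Illustrative sketch: the Sauerbrey relation commonly used in piezoelectric quartz
# microbalance (QCM) work to convert a frequency shift into adsorbed mass.
# The crystal parameters and frequency shift below are assumed example values.
f0 = 10.0e6          # fundamental resonance frequency of the quartz crystal, Hz
A = 0.2              # piezoelectrically active area, cm^2
rho_q = 2.648        # density of quartz, g/cm^3
mu_q = 2.947e11      # shear modulus of AT-cut quartz, g/(cm*s^2)
delta_f = -120.0     # measured frequency shift on sorption of a VOC, Hz

# Sauerbrey: delta_f = -2 * f0^2 * delta_m / (A * sqrt(rho_q * mu_q))
delta_m = -delta_f * A * (rho_q * mu_q) ** 0.5 / (2.0 * f0 ** 2)
print(f"adsorbed mass ~ {delta_m * 1e9:.1f} ng")
```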

Keywords: piezoquartz sensor, viscous sorbents, micellar casein, coating, volatile compounds

Procedia PDF Downloads 98
1066 Telemedicine in Physician Assistant Education: A Partnership with Community Agency

Authors: Martina I. Reinhold, Theresa Bacon-Baguley

Abstract:

A core challenge of physician assistant education is preparing professionals for lifelong learning. While this conventionally has encompassed scientific advances, students must also embrace new care delivery models and technologies. Telemedicine, the provision of care via two-way audio and video, is an example of a technological advance reforming health care. During a three-semester sequence of Hospital Community Experiences, physician assistant students were assigned experiences with Answer Health on Demand, a telemedicine collaborative. Preceding the experiences, the agency lectured on the application of telemedicine. Students were then introduced to the technology and partnered with a provider. Prior to observing the patient-provider interaction, patient consent was obtained. Afterwards, students completed a reflection paper on lessons learned and the potential impact of telemedicine on their careers. Thematic analysis was completed on the students’ reflection papers (n=13). Preceding the lecture and experience, over 75% of students (10/13) were unaware of telemedicine. Several stated they were 'skeptical' about the effectiveness of 'impersonal' health care appointments. After the experience, all students remarked that telemedicine will play a large role in the future of healthcare and will provide benefits by improving access in rural areas, decreasing wait time, and saving cost. More importantly, 30% of students (4/13) commented that telemedicine is a technology they can see themselves using in their future practice. Initial results indicate that collaborative interaction between students and telemedicine providers enhanced student learning and exposed students to technological advances in the delivery of care. Further, results indicate that students perceived telemedicine more favorably as a viable delivery method after the experience.

Keywords: collaboration, physician assistant education, teaching innovative health care delivery method, telemedicine

Procedia PDF Downloads 187
1065 Human-Centric Sensor Networks for Comfort and Productivity in Offices: Integrating Environmental, Body Area Network, and Participatory Sensing

Authors: Chenlu Zhang, Wanni Zhang, Florian Schaule

Abstract:

Indoor environment in office buildings directly affects the comfort, productivity, health, and well-being of building occupants. Wireless environmental sensor networks have been deployed in many modern offices to monitor and control indoor environments. However, indoor environmental variables are not strong enough predictors of the comfort and productivity levels of every occupant, due to personal differences, both physiological and psychological. This study proposes human-centric sensor networks that integrate wireless environmental sensors, body area network sensors and participatory sensing technologies to collect data from both the environment and the occupants and to support building operations. The sensor networks have been tested in one small-size and one medium-size office room with 22 participants for five months. Indoor environmental data (e.g., air temperature and relative humidity), physiological data (e.g., skin temperature and galvanic skin response), and psychological responses (e.g., comfort and self-reported productivity levels) were obtained from each participant and his/her workplace. The data show that: (1) participants have different physiological and psychological responses under the same environmental conditions; (2) physiological variables are more effective predictors of comfort and productivity levels than environmental variables. These results indicate that human-centric sensor networks can support human-centric building control and improve comfort and productivity in offices.
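
A minimal sketch of the kind of comparison reported above, predicting comfort from environmental versus physiological variables, is given below; the variable names, synthetic data, and model choice are illustrative assumptions, not the study's pipeline.

```python
# Illustrative sketch (hypothetical column names, synthetic data): comparing how well
# environmental vs. physiological variables predict reported comfort.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 500
air_temp = rng.normal(24, 2, n)           # environmental sensors
humidity = rng.normal(45, 8, n)
skin_temp = rng.normal(33, 1, n)          # body area network sensors
gsr = rng.normal(5, 1.5, n)               # galvanic skin response
# synthetic comfort votes driven more strongly by the physiological signals
comfort = 0.2 * air_temp - 0.01 * humidity + 1.5 * skin_temp - 0.8 * gsr + rng.normal(0, 1, n)

env = np.column_stack([air_temp, humidity])
phys = np.column_stack([skin_temp, gsr])

for name, X in [("environmental", env), ("physiological", phys)]:
    r2 = cross_val_score(RandomForestRegressor(random_state=0), X, comfort, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {r2.mean():.2f}")
```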

Keywords: body area network, comfort and productivity, human-centric sensors, internet of things, participatory sensing

Procedia PDF Downloads 132
1064 Numerical Investigation of the Integration of a Micro-Combustor with a Free Piston Stirling Engine in an Energy Recovery System

Authors: Ayodeji Sowale, Athanasios Kolios, Beatriz Fidalgo, Tosin Somorin, Aikaterini Anastasopoulou, Alison Parker, Leon Williams, Ewan McAdam, Sean Tyrrel

Abstract:

Recently, energy recovery systems have been thriving and attracting attention in the power generation sector, due to the demand for cleaner forms of energy that are friendly and safe for the environment. This has created an avenue for cogeneration, where Combined Heat and Power (CHP) technologies have been recognised for their feasibility and use in homes and small-scale businesses. The efficiency of combustors and the advantages of free piston Stirling engines over other conventional engines, in terms of output power and efficiency, have been observed and considered. This study presents the numerical analysis of a micro-combustor with a free piston Stirling engine in an integrated model of a Nano Membrane Toilet (NMT) unit. The NMT unit will use the micro-combustor to produce waste heat of high energy content from the combustion of human waste, and the heat generated will power the free piston Stirling engine, which will be connected to a linear alternator for electricity production. The thermodynamic influence of the combustor on the free piston Stirling engine was observed, based on the heat transfer from the flue gas to the working gas of the free piston Stirling engine. The results showed that with an input of 25 MJ/kg of faecal matter and a flue gas temperature of 773 K from the micro-combustor, the free piston Stirling engine generates a daily output power of 428 W, at a thermal efficiency of 10.7% and an engine speed of 1800 rpm. An experimental investigation into the integration of the micro-combustor and free piston Stirling engine with the NMT unit is currently underway.
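
As a back-of-envelope check on the figures quoted above, the sketch below relates the 428 W output and 10.7% thermal efficiency to the implied heat input and an idealised daily fuel throughput; the assumption that all combustion heat reaches the engine is mine, not the paper's.

```python
# Back-of-envelope sketch (assumptions mine, not from the paper): relating the reported
# 428 W output and 10.7 % thermal efficiency to the implied heat input, and to a daily
# faecal-matter throughput assuming all combustion heat (25 MJ/kg) reaches the engine.
P_out = 428.0          # electrical/mechanical output power of the engine, W
eta_th = 0.107         # reported thermal efficiency
LHV = 25e6             # heating value of faecal matter, J/kg (as stated in the abstract)

Q_in = P_out / eta_th                  # heat input rate to the working gas, W
E_day = Q_in * 24 * 3600               # heat demand per day, J
m_day = E_day / LHV                    # faecal matter needed per day, kg (idealised)
print(f"Q_in ~ {Q_in/1e3:.1f} kW, daily fuel ~ {m_day:.1f} kg (ignoring combustor losses)")
```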

Keywords: free piston Stirling engine, micro-combustor, nano membrane toilet, thermodynamics

Procedia PDF Downloads 248
1063 Coupled Hydro-Geomechanical Modeling of Oil Reservoir Considering Non-Newtonian Fluid through a Fracture

Authors: Juan Huang, Hugo Ninanya

Abstract:

Oil has been used for many years as a source of energy and as a raw material for products such as asphalt or rubber. This is the reason why new technologies have been implemented over time. However, research still needs to continue because of the new challenges engineers face every day, such as unconventional reservoirs. Various numerical methodologies have been applied in petroleum engineering as tools to optimize the production of reservoirs before drilling a wellbore, although not all of them are equally effective for studying fracture propagation. Analytical methods, such as those based on linear elastic fracture mechanics, fail to give a reasonable prediction when simulating fracture propagation in ductile materials, whereas numerical methods based on the cohesive zone method (CZM) make it possible to represent the elastoplastic behavior of a reservoir based on a constitutive model; therefore, predictions in terms of displacements and pressure will be more reliable. In this work, a hydro-geomechanical coupled model of horizontal wells in fractured rock was developed using ABAQUS; both the extended finite element method and cohesive elements were used to represent predefined fractures in a 2-D model. A power law representing the rheological behavior of the fluid (shear-thinning, power index < 1) through fractures, together with a leak-off rate permeating into the matrix, was considered. Results are shown in terms of the aperture and length of the fracture, the pressure within the fracture, and the fluid loss. A higher infiltration rate into the matrix was observed as the power index decreases. A sensitivity analysis is finally performed to identify the most influential factor for fluid loss.
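
A minimal sketch of the power-law (shear-thinning) rheology referred to above is shown below; the consistency index and shear-rate range are assumed example values.

```python
# Minimal sketch of the power-law (Ostwald-de Waele) model used for a shear-thinning
# fracturing fluid: apparent viscosity drops as shear rate rises when n < 1.
# The consistency index K and the shear-rate range are assumed example values.
import numpy as np

K = 0.5                              # consistency index, Pa*s^n (assumed)
shear_rate = np.logspace(0, 3, 4)    # 1 to 1000 1/s

for n in (1.0, 0.8, 0.5):            # n < 1: shear-thinning, as in the study
    mu_app = K * shear_rate ** (n - 1.0)   # mu_app = K * gamma_dot^(n-1)
    print(f"n = {n}: apparent viscosity (Pa*s) =", np.round(mu_app, 4))
```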

Keywords: fracture, hydro-geomechanical model, non-Newtonian fluid, numerical analysis, sensitivity analysis

Procedia PDF Downloads 195
1062 Going Global by Going Local: How Website Localization and Translation Can Break the Internet Language Barrier and Contribute to Globalization

Authors: Hela Fathallah

Abstract:

With 6,500 languages spoken all over the world but 80 percent of online content available in only 10 languages (English, Chinese, Spanish, Japanese, Arabic, Portuguese, German, French, Russian, and Korean), language represents a barrier to the universal access to knowledge, information and services that the internet aims to provide. Translation and its related fields of localization, interpreting, globalization, and internationalization remove that barrier for billions of people worldwide, unlocking new markets for technology companies, mobile device makers, service providers and language vendors as well. This paper gathers different surveys conducted in different regions of the world that demonstrate a growing demand for the consumption of web content with distinctive values and in languages other than the aforementioned ones. It also adds new insights into the contribution of translation to language preservation. The idea that English is the language of the internet and that, in a globalized world, everyone should learn English to cope with new technologies is no longer true. This idea has reached its limits. It collides with cultural diversity and differences around the world and generates an accelerated rate of language extinction. Studies show that the internet exacerbates this rate, and web giants such as Facebook or Google are, today, facing the impact of such a misconception of globalization. For internet and dot-com companies, localization is the solution; they are spending a significant amount of time understanding what people want and figuring out how to provide it. They are committed to making their content accessible, if not in all the languages spoken today, at least in most of them, and to adapting it to most cultures. Technology has broken down the barriers of time and space, and it will break down the language barrier as well, through a process of translation and localization and through a new definition of globalization that takes these two processes into consideration.

Keywords: globalization, internet, localization, translation

Procedia PDF Downloads 355
1061 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality

Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan

Abstract:

Currently, the content entertainment industry is dominated by mobile devices. As trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimizations. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload the work from mobile devices to dedicated rendering servers that are far more powerful. But this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol. 1) In-flight compression: The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round trip time is directly proportional to the amount of data transmitted. It can therefore be reduced by compressing the frames before sending. Using standard compression algorithms like JPEG results in only a minor size reduction. Since the images to be compressed are consecutive camera frames, there won't be many changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but the WebGL implementation limits the precision of floating point numbers to 16 bits on most devices. This can introduce noise in the image due to rounding errors, which will add up eventually. This can be solved using an improved inter-frame compression algorithm. The algorithm detects changes between frames and reuses unchanged pixels from the previous frame. This eliminates the need for floating point subtraction, thereby cutting down on noise. The change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference. The kernel weights for this comparison can be fine-tuned to match the type of image to be compressed. 2) Dynamic load distribution: Conventional cloud computing architectures work by offloading as much work as possible to the servers, but this approach can cause a hit on bandwidth and server costs. The most optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computing on the device, depending on the power of the device and network conditions, and it is responsible for dynamically partitioning the tasks. Special flags are used to communicate the workload fraction between the client and the server and are updated at a constant interval of time (or frames). The whole protocol is designed so that it can be client-agnostic. Flags are available to the client for resetting the frame, indicating latency, switching mode, etc. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to effectively spread the load and thereby scale horizontally. This is achieved by isolating client connections into different processes.
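
The inter-frame change detection described in part 1) can be sketched as follows; this is an illustrative NumPy/SciPy version with an assumed kernel and threshold, not the authors' WebGL implementation.

```python
# Illustrative sketch (not the authors' implementation) of the inter-frame idea described
# above: flag a pixel as "changed" only if the kernel-weighted average difference around it
# exceeds a threshold, and reuse the previous frame's pixels elsewhere.
import numpy as np
from scipy.ndimage import convolve

kernel = np.array([[1, 2, 1],
                   [2, 4, 2],
                   [1, 2, 1]], dtype=float)
kernel /= kernel.sum()              # weights can be tuned to the image content

def delta_frame(prev, curr, threshold=8.0):
    """Return the changed-pixel mask and a reconstructed frame reusing unchanged pixels."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    weighted = convolve(diff, kernel, mode="nearest")   # weighted average difference
    changed = weighted > threshold
    recon = np.where(changed, curr, prev)               # unchanged pixels come from prev
    return changed, recon

rng = np.random.default_rng(0)
prev = rng.integers(0, 256, (120, 160)).astype(np.uint8)
curr = prev.copy()
curr[40:60, 50:80] = 255                                # simulate a moving object
mask, recon = delta_frame(prev, curr)
print("changed pixels:", int(mask.sum()), "of", mask.size)
```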

Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application

Procedia PDF Downloads 67
1060 Evaluation of Digital Marketing Strategies by Behavioral Economics

Authors: Sajjad Esmaeili Aghdam

Abstract:

Economics typically conceptualizes individual behavior as the consequence of external states, for example, budgets and prices (or respective beliefs), and of choices. As the main goal, we focus on the influence of a range of behavioral economics factors on digital marketing strategies, the evaluation of those strategies, and their transformation into highly promising marketing strategies. The different forms of behavioral prospects lead to the following two main results. First, the stability of economic dynamics in a currency union depends crucially on the level of economic integration: more economic integration leads to more stable economic dynamics. Electronic word-of-mouth (eWOM) is "all informal communications directed at consumers through Internet-based technology related to the usage or characteristics of particular goods and services or their sellers." eWOM can take many forms, the most significant one being online reviews. For this paper, 72 articles were gathered from research search engines such as Google Scholar, Web of Science, and PubMed, focusing on the title and the aim of each article. Recent research in strategic management and marketing proposes that markets should not be viewed as a given, deterministic setting, exogenous to the firm. Instead, firms are increasingly viewed as dynamic creators of market opportunities. The use of new technologies touches all spheres of the modern lifestyle; social and economic life becomes unmanageable without fast, relevant, high-quality and fitting information. Psychology and economics (together known as behavioral economics) are two prominent disciplines underlying many theories in marketing. The wider marketing literature documents consumers' non-rational behavior, even though behavioral biases might not always be consistently named or formally labeled.

Keywords: behavioral economics, digital marketing, marketing strategy, high impact strategies

Procedia PDF Downloads 172
1059 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector

Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau

Abstract:

A performance measurement framework (PMF) is an essential tool in any organization to assess the performance of its processes. It guides businesses to stay on track with their objectives and to benchmark themselves against the market. With the growing trend of digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the conducted research, no such system has been developed in academia or in industry. In this context, this paper covers a variety of methodologies on performance measurement, overviews the major AI and Big Data applications in the banking sector, and covers an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, drawn from market leaders, on the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure of intelligent digital solutions. The aforementioned classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application, arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. The proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications to adopt. Furthermore, it will help them meet their business objectives through understanding the potential impact of such solutions on the entire organization.
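
A minimal sketch of the kind of indexed metrics library described above is given below; the classification keys and metric entries are invented for illustration.

```python
# A minimal sketch (structure and entries are invented for illustration) of an indexed
# metrics library: applications are classified, and each class maps to candidate
# performance metrics that can be searched quickly, with a reverse index by metric.
from collections import defaultdict

METRICS_LIBRARY = {
    ("retail banking", "chatbot"): ["containment rate", "average handling time", "CSAT"],
    ("risk", "credit scoring model"): ["AUC", "Gini coefficient", "population stability index"],
    ("operations", "document OCR"): ["field-level accuracy", "straight-through processing rate"],
}

# secondary index: metric -> applications that use it, for reverse lookup
BY_METRIC = defaultdict(list)
for app, metrics in METRICS_LIBRARY.items():
    for m in metrics:
        BY_METRIC[m].append(app)

def metrics_for(department, solution):
    """Retrieve candidate metrics for a classified AI/Big Data solution."""
    return METRICS_LIBRARY.get((department, solution), [])

print(metrics_for("risk", "credit scoring model"))
print(BY_METRIC["AUC"])
```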

Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement

Procedia PDF Downloads 190
1058 The Reality of the Digital Inequality and Its Negative Impact on Virtual Learning during the COVID-19 Pandemic: The South African Perspective

Authors: Jacob Medupe

Abstract:

Life as we know it has changed since the global outbreak of Coronavirus Disease 2019 (COVID-19), and business as usual will not continue. The human impact of the COVID-19 crisis is already immeasurable. Moreover, COVID-19 has already negatively impacted economies and livelihoods and disrupted food systems around the world. The disruptive nature of the coronavirus has affected every sphere of life, including culture and teaching and learning. Right now, the majority of education research is based around classroom management techniques that are no longer necessary with digital delivery. Instead, there is a great need for new data about how to make the best use of the one-on-one attention that is now becoming possible (Diamandis & Kotler, 2014). The COVID-19 pandemic has created an environment in which South African learners are forced to adhere to social distancing in order to minimise the spread of the coronavirus. This arrangement forces students to utilise online classroom technologies to continue with their lessons. The historical reality is that the country has not made many strides towards closing the digital divide, and this is particularly the case in deep rural areas. It will prove to be a tall order for most of the learners affected by the coronavirus to have seamless access to online learning facilities. The paper looks deeply into this reality and into how the coronavirus has confronted us with the fact that South Africa remains a deeply unequal society in every sphere of life. The study also explores the education system's state of readiness for the online classroom environment.

Keywords: virtual learning, virtual classroom, COVID-19, coronavirus, internet connectivity, blended learning, online learning, distance education, e-learning, self-regulated learning, pedagogy, digital literacy

Procedia PDF Downloads 116
1057 Bacterial Exposure and Microbial Activity in Dental Clinics during Cleaning Procedures

Authors: Atin Adhikari, Sushma Kurella, Pratik Banerjee, Nabanita Mukherjee, Yamini M. Chandana Gollapudi, Bushra Shah

Abstract:

Different sharp instruments, drilling machines, and high-speed rotary instruments are routinely used in dental clinics during dental cleaning. These cleaning procedures therefore release many oral microorganisms, including bacteria, into clinic air and may pose significant occupational bioaerosol exposure risks for dentists, dental hygienists, patients, and dental clinic employees. The two major goals of this study were to quantify volumetric airborne concentrations of bacteria and to assess overall microbial activity in this type of occupational environment. The study was conducted in several dental clinics of southern Georgia, and 15 dental cleaning procedures were targeted for sampling of airborne bacteria and testing of overall microbial activity in dust settled on clinic floors. For air sampling, a Biostage viable cascade impactor was utilized, which comprises an inlet cone, a precision-drilled 400-hole impactor stage, and a base that holds an agar plate (tryptic soy agar). A high-flow Quick-Take-30 pump connected to this impactor pulls airborne microorganisms at a 28.3 L/min flow rate through the holes (jets), where they are collected on the agar surface for approximately five minutes. After sampling, agar plates containing the samples were placed in an ice chest with blue ice, and the plates were incubated at 30±2°C for 24 to 72 h. Colonies were counted and converted to airborne concentrations (CFU/m3), followed by positive-hole corrections. The most abundant bacterial colonies (selected by visual screening) were identified by PCR amplicon sequencing of 16S rRNA genes. For understanding overall microbial activity on clinic floors and estimating the general cleanliness of clinic surfaces during or after dental cleaning procedures, ATP levels were determined in swabbed dust samples collected from 10 cm2 floor surfaces. The concentration of ATP may indicate both the cell viability and the metabolic status of settled microorganisms in this situation. An ATP measuring kit was used, which utilized the standard luciferin-luciferase bioluminescence reaction and a luminometer, which quantified ATP levels as relative light units (RLU). Three air and dust samples were collected during each cleaning procedure: at the beginning, during cleaning, and immediately after the procedure was completed (n = 45). Concentrations at the beginning, during, and after dental cleaning procedures were 671±525, 917±1203, and 899±823 CFU/m3, respectively, for airborne bacteria, and 91±101, 243±129, and 139±77 RLU/sample, respectively, for ATP levels. The concentrations of bacteria were significantly higher than in typical indoor residential environments. Although an increasing trend for airborne bacteria was observed during cleaning, the data collected at the three different time points were not significantly different (ANOVA: p = 0.38), probably due to the high standard deviations of the data. The ATP levels, however, demonstrated a significant difference (ANOVA: p < 0.05), indicating a significant change in microbial activity on floor surfaces during dental cleaning. The most common bacterial genera identified were, in order of frequency of occurrence: Neisseria sp., Streptococcus sp., Chryseobacterium sp., Paenisporosarcina sp., and Vibrio sp. The study concluded that bacterial exposure in dental clinics could be a notable occupational biohazard and that appropriate respiratory protection for employees is urgently needed.
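
The positive-hole correction and CFU/m3 conversion mentioned above can be sketched as follows, using the 400-hole stage, 28.3 L/min flow rate and five-minute sampling time stated in the abstract; the colony count is an example value.

```python
# Sketch of the standard positive-hole correction for a 400-hole impactor and the
# conversion of colony counts to CFU/m^3; the colony count is an example value.
N_HOLES = 400
FLOW_L_PER_MIN = 28.3     # sampler flow rate stated in the abstract
SAMPLE_MIN = 5.0          # approximate sampling time stated in the abstract

def positive_hole_correction(raw_count, n_holes=N_HOLES):
    """Expected number of viable particles given the observed number of positive holes."""
    return n_holes * sum(1.0 / (n_holes - x) for x in range(raw_count))

raw = 90                                   # colonies counted on the agar plate (example)
corrected = positive_hole_correction(raw)
air_volume_m3 = FLOW_L_PER_MIN * SAMPLE_MIN / 1000.0
concentration = corrected / air_volume_m3
print(f"corrected count: {corrected:.0f}, concentration: {concentration:.0f} CFU/m^3")
```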

Keywords: bioaerosols, hospital hygiene, indoor air quality, occupational biohazards

Procedia PDF Downloads 304
1056 Non-Revenue Water Management in Palestine

Authors: Samah Jawad Jabari

Abstract:

Water is the most important and most valuable resource, not only for human life but for all living things on the planet. Water supply utilities should fulfill the water requirement both quantitatively and qualitatively. Drinking water systems are exposed to both natural hazards (hurricanes and floods) and manmade hazards (risks) that are common in Palestine. Non-Revenue Water (NRW) is a manmade risk that remains a major concern in Palestine, as NRW levels are estimated to be high. In this research, the Hebron city water distribution network was taken as a case study to estimate and audit the NRW levels. The research also investigated the state of the existing water distribution system in the study area by examining the water losses, and it gathered further information on NRW prevention and management practices. Data and information were collected from the Palestinian Water Authority (PWA) and the Hebron Municipality (HM) archive. In addition, a questionnaire was designed and administered by the researcher in order to collect the data necessary for the water audit. The questionnaire also assessed the views of stakeholders (staff) in the PWA and HM on the current status of NRW in the Hebron water distribution system. The key result of this research shows that NRW in Hebron city was high, in excess of 30%. The main factors contributing to NRW were inaccuracies in billing volumes, unauthorized consumption, and the estimation of consumption through faulty meters. A policy for NRW reduction is available in Palestine; however, the number of qualified staff available to carry out leak detection activities is low, and there is a lack of appropriate technologies to reduce water losses and undertake sufficient system maintenance, all of which need to be improved to enhance the performance of the network and decrease the level of NRW losses.
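
For illustration, a top-down non-revenue water calculation of the kind used in such audits can be sketched as follows; all volumes are invented placeholders, not figures from the Hebron audit.

```python
# Minimal sketch of a top-down water balance; all volumes are illustrative
# placeholders, not data from the Hebron audit.
system_input = 10_000_000          # m^3/year supplied to the network (example)
billed_metered = 6_200_000         # m^3/year billed through customer meters (example)
billed_unmetered = 300_000         # m^3/year billed by estimate (example)

revenue_water = billed_metered + billed_unmetered
nrw = system_input - revenue_water             # non-revenue water volume
nrw_share = 100.0 * nrw / system_input
print(f"NRW = {nrw:,} m^3/year ({nrw_share:.1f}% of system input)")
```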

Keywords: non-revenue water, water auditing, leak detection, water meters

Procedia PDF Downloads 284
1055 On the Influence of Sleep Habits for Predicting Preterm Births: A Machine Learning Approach

Authors: C. Fernandez-Plaza, I. Abad, E. Diaz, I. Diaz

Abstract:

Births occurring before the 37th week of gestation are considered preterm births. A threat of preterm labour is defined as the onset of regular uterine contractions, dilation and cervical effacement between 23 and 36 gestation weeks. To the authors' best knowledge, the factors that determine the onset of birth are not yet completely defined. In particular, the influence of sleep habits on preterm births is weakly studied. The aim of this study is to develop a model to predict the factors affecting premature delivery, based on potential risk factors including those derived from sleep habits and light exposure at night (introduced as 12 variables obtained by a telephone survey using two questionnaires previously used by other authors). Thus, three groups of variables were included in the study (maternal, fetal and sleep habits). The study was approved by the Research Ethics Committee of the Principality of Asturias (Spain). An observational, retrospective and descriptive study was performed with 481 births between January 1, 2015 and May 10, 2016 in the University Central Hospital of Asturias (Spain). A statistical analysis using SPSS was carried out to compare qualitative and quantitative variables between preterm and term deliveries: the chi-squared test was applied for qualitative variables and the t-test for quantitative variables. Statistically significant differences (p < 0.05) between preterm and term births were found for primiparity, multiparity, kind of conception, place of residence, premature rupture of membranes and interruptions during the night. In addition to the statistical analysis, machine learning methods were tested to look for a prediction model. In particular, tree-based models were applied, as their trade-off between performance and interpretability is especially suitable for this study. C5.0, recursive partitioning, random forest and tree bag models were analysed using the caret R package. Cross-validation with 10 folds and parameter tuning to optimize the methods were applied. In addition, different noise reduction methods were applied to the initial data using the NoiseFiltersR package. The best performance was obtained by the C5.0 method, with accuracy 0.91, sensitivity 0.93, specificity 0.89 and precision 0.91. Some well-known preterm birth factors were identified: cervix dilation, maternal BMI, premature rupture of membranes and nuchal translucency analysis in the first trimester. The model also identifies other new factors related to sleep habits, such as light through the window, bedtime on working days, usage of electronic devices before sleeping from Monday to Friday, and changes in sleeping habits reflected in the number of hours, the depth of sleep or the lighting of the room. "IF dilation <= 2.95 AND usage of electronic devices before sleeping from Monday to Friday = YES AND change of sleeping habits = YES, THEN preterm" is one of the predictive rules obtained by C5.0. In this work, a model for predicting preterm births is developed, based on machine learning together with noise reduction techniques; the method maximizing the performance is the one selected. This model shows the influence of variables related to sleep habits on preterm prediction.
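
A rough Python/scikit-learn analogue of the workflow described above (the study itself used R with caret, C5.0 and NoiseFiltersR) is sketched below on synthetic data; it shows 10-fold cross-validation and the computation of accuracy, sensitivity, specificity and precision.

```python
# Rough Python/scikit-learn analogue (not the authors' R/caret code) of the workflow
# described above: a tree-based classifier evaluated with 10-fold cross-validation and
# summarised as accuracy, sensitivity, specificity and precision. Data are synthetic.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=481, n_features=12, weights=[0.85, 0.15], random_state=0)

clf = DecisionTreeClassifier(max_depth=4, random_state=0)
y_pred = cross_val_predict(clf, X, y, cv=10)

tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)          # recall on the preterm (positive) class
specificity = tn / (tn + fp)
precision = tp / (tp + fp)
print(f"acc={accuracy:.2f} sens={sensitivity:.2f} spec={specificity:.2f} prec={precision:.2f}")
```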

Keywords: machine learning, noise reduction, preterm birth, sleep habit

Procedia PDF Downloads 135
1054 Pharmacological Activities and Potential Uses of Cyperus Rotundus: A Review

Authors: Arslan Masood Pirzada, Muhammad Naeem, Hafiz Haider Ali, Muhammad Latif, Aown Sammar Raza, Asad Hussain Bukhari, Muhammad Saqib, Muhammad Ijaz

Abstract:

Cyperus rotundus (Cyperaceae), a medicinal herb, has traditionally been used as a home remedy for the treatment of various clinical conditions such as diarrhea, diabetes, fever, inflammation, malaria, and stomach and bowel disorders. At present, it is also one of the most widespread, troublesome, and economically damaging agronomic weeds, growing wild in various tropical and sub-tropical regions of the world. The tubers and rhizomes of Cyperus rotundus possess a high concentration of active ingredients in the form of essential oils, phenolic acids, ascorbic acid and flavonoids, which are responsible for its remedial properties. The exploitation of any medicinal plant application depends on crucial and comprehensive information about the therapeutic potential of the plant. Researchers have evaluated and characterized the significance of Cyperus rotundus as an anti-androgenic, anti-bacterial, anti-cancerous, anti-convulsant, anti-diabetic, anti-diarrheal, anti-genotoxic, anti-inflammatory, anti-lipidemic, anti-malarial, anti-mutagenic, anti-obesity, anti-oxidant, anti-uropathogenic, hepato-, cardio- and neuroprotective, and nootropic agent. This paper comprises a broad review summarizing the current state of knowledge about the chemical constituents, potential economic uses and therapeutic aspects of Cyperus rotundus, which will aid in the development of bioethanol and modern herbal medicine through the latest technologies and promote the use of this plant in the treatment of many clinical disorders.

Keywords: purple nutsedge, chemical composition, economic uses, therapeutic values, future directions

Procedia PDF Downloads 504
1053 Challenges and Pedagogical Strategies in Teaching Chemical Bonding: Perspectives from Moroccan Educators

Authors: Sara Atibi, Azzeddine Atibi, Salim Ahmed, Khadija El Kababi

Abstract:

The concept of chemical bonding is fundamental in chemistry education, ubiquitous in school curricula, and essential to numerous topics in the field. Mastery of this concept enables students to predict and explain the physical and chemical properties of substances. However, chemical bonding is often regarded as one of the most complex concepts for secondary and higher education students to comprehend, due to the underlying complex theory and the use of abstract models. Teachers also encounter significant challenges in conveying this concept effectively. This study aims to identify the difficulties and alternative conceptions faced by Moroccan secondary school students in learning about chemical bonding, as well as the pedagogical strategies employed by teachers to overcome these obstacles. A survey was conducted involving 150 Moroccan secondary school physical science teachers, using a structured questionnaire comprising closed, open-ended, and multiple-choice questions. The results reveal frequent student misconceptions, such as the octet rule, molecular geometry, and molecular polarity. Contributing factors to these misconceptions include the abstract nature of the concepts, the use of models, and teachers' difficulties in explaining certain aspects of chemical bonding. The study proposes improvements for teaching chemical bonding, such as integrating information and communication technologies (ICT), diversifying pedagogical tools, and considering students' pre-existing conceptions. These recommendations aim to assist teachers, curriculum developers, and textbook authors in making chemistry more accessible and in addressing students' misconceptions.

Keywords: chemical bonding, alternative conceptions, chemistry education, pedagogical strategies

Procedia PDF Downloads 12
1052 Deep Reinforcement Learning-Based Computation Offloading for 5G Vehicle-Aware Multi-Access Edge Computing Network

Authors: Ziying Wu, Danfeng Yan

Abstract:

Multi-Access Edge Computing (MEC) is one of the key technologies of the future 5G network. By deploying edge computing centers at the edge of the wireless access network, computation tasks can be offloaded to edge servers rather than to the remote cloud server, to meet the requirements of 5G low-latency and high-reliability application scenarios. Meanwhile, with the development of IoV (Internet of Vehicles) technology, various delay-sensitive and compute-intensive in-vehicle applications continue to appear. Compared with traditional internet services, these computation tasks have higher processing priority and stricter delay requirements. In this paper, we design a 5G-based Vehicle-Aware Multi-Access Edge Computing Network (VAMECN) and propose a joint optimization problem of minimizing the total system cost. In view of this problem, a deep reinforcement learning-based joint computation offloading and task migration optimization (JCOTM) algorithm is proposed, considering the influence of multiple factors such as concurrent computation tasks, the distribution of system computing resources, and network communication bandwidth. The mixed-integer nonlinear programming problem is described as a Markov Decision Process. Experiments show that our proposed algorithm can effectively reduce task processing delay and equipment energy consumption, optimize the computation offloading and resource allocation schemes, and improve system resource utilization, compared with other computation offloading policies.
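
As a highly simplified stand-in for the JCOTM deep Q-network, the sketch below uses tabular Q-learning on a toy offloading problem (local execution versus two edge servers) with an invented cost model; it illustrates the decision structure only, not the paper's algorithm.

```python
# Toy stand-in (tabular Q-learning, not the paper's deep Q-network) for the offloading
# decision described above: given a discretised system state, choose local execution or
# one of two edge servers to minimise a weighted delay + energy cost. All costs are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 8, 3     # states: discretised load/bandwidth; actions: local, edge1, edge2
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1

def step(state, action):
    """Synthetic environment: returns (cost-based reward, next state)."""
    base_cost = np.array([1.0, 0.6, 0.7])[action]          # local is slow, edges are cheaper
    congestion = 0.1 * state                                # busier states cost more
    reward = -(base_cost + congestion + rng.normal(0, 0.05))
    return reward, rng.integers(n_states)                   # next state arrives at random

state = rng.integers(n_states)
for _ in range(20_000):
    action = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[state]))
    reward, next_state = step(state, action)
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print("greedy offloading action per state:", np.argmax(Q, axis=1))
```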

Keywords: multi-access edge computing, computation offloading, 5th generation, vehicle-aware, deep reinforcement learning, deep Q-network

Procedia PDF Downloads 99
1051 Academic Staff Perspective of Adoption of Augmented Reality in Teaching Practice to Support Students Learning Remotely in a Crisis Time in Higher Education

Authors: Ebtisam Alqahtani

Abstract:

The purpose of this study is to investigate academic staff perspectives on using Augmented Reality (AR) in teaching practice to support students learning remotely during the COVID pandemic. The study adopted the DTPB theoretical model to guide the identification of key potential factors that could motivate academic staff to use or not use AR in teaching practices. A mixed-methods design was adopted for a better understanding of the study problem. A survey was completed by 851 academic staff, and this was followed by interviews with 20 academic staff. Statistical analyses were used to assess the survey data, and thematic analysis was used to assess the interview data. The study findings indicate that 75% of academic staff were aware of AR as a pedagogical tool and agreed on its potential benefits in teaching and learning practices. However, only 36% of academic staff use it in teaching and learning practice, and most of them agree with most of the potential barriers to adopting AR in educational environments. In addition, the study results indicate that 91% of them are planning to use it in the future. The most important factors motivating them to use it in the future are the COVID pandemic factor, the hedonic motivation factor, and the academic staff attitude factor. The perceptions of academic staff differed according to the universities they worked at, the faculties they worked in, and their gender. This study offers further empirical support for the DTPB model, as well as recommendations, based on its findings, to help higher education institutions implement technology in their educational environments. Studying the necessity of using AR technologies at the time of COVID-19 is unprecedented; therefore, the contribution is both theoretical and practical.

Keywords: higher education, academic staff, AR technology as a pedagogical tool, teaching and learning practice, benefits of AR, barriers to adopting AR, motivating factors to adopt AR

Procedia PDF Downloads 117
1050 Statistical Analysis and Optimization of a Process for CO2 Capture

Authors: Muftah H. El-Naas, Ameera F. Mohammad, Mabruk I. Suleiman, Mohamed Al Musharfy, Ali H. Al-Marzouqi

Abstract:

CO2 capture and storage technologies play a significant role in contributing to the control of climate change through the reduction of carbon dioxide emissions into the atmosphere. The present study evaluates and optimizes CO2 capture through a process in which carbon dioxide is passed into pH-adjusted high-salinity water and reacted with sodium chloride to form a precipitate of sodium bicarbonate. This process is based on a modified Solvay process with higher CO2 capture efficiency, higher sodium removal, and a higher pH level without the use of ammonia. The process was tested in a bubble column semi-batch reactor and was optimized using response surface methodology (RSM). CO2 capture efficiency and sodium removal were optimized in terms of the major operating parameters, based on four levels and four variables in a Central Composite Design (CCD). The operating parameters were gas flow rate (0.5–1.5 L/min), reactor temperature (10–50 °C), buffer concentration (0.2–2.6%) and water salinity (25–197 g NaCl/L). The experimental data were fitted to a second-order polynomial using multiple regression and analyzed using analysis of variance (ANOVA). The optimum values of the selected variables were obtained using a response optimizer. The optimum conditions were tested experimentally using desalination reject brine with salinity ranging from 65,000 to 75,000 mg/L. The CO2 capture efficiency in 180 min was 99%, and the maximum sodium removal was 35%. The experimental and predicted values were within the 95% confidence interval, which demonstrates that the developed model can successfully predict capture efficiency and sodium removal for the modified Solvay method.
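
The second-order polynomial fit used in the RSM/CCD analysis can be sketched as follows; the design points and responses are synthetic stand-ins for the experimental data.

```python
# Sketch of fitting a second-order (quadratic) response-surface model by least squares,
# as done in RSM/CCD studies. The design points and responses here are synthetic stand-ins.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# coded factors: gas flow rate, temperature, buffer concentration, salinity
X = rng.uniform(-1, 1, size=(30, 4))
# synthetic capture efficiency with curvature and an interaction term
y = 90 + 3*X[:, 0] - 2*X[:, 1]**2 + 1.5*X[:, 2]*X[:, 3] + rng.normal(0, 0.5, 30)

quad = PolynomialFeatures(degree=2, include_bias=False)   # squares + interactions
model = LinearRegression().fit(quad.fit_transform(X), y)

x_new = np.array([[0.2, -0.5, 0.8, 0.1]])
print("predicted capture efficiency:", model.predict(quad.transform(x_new)).round(1))
```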

Keywords: CO2 capture, water desalination, Response Surface Methodology, bubble column reactor

Procedia PDF Downloads 280
1049 Design and Implementation of a Software Platform Based on Artificial Intelligence for Product Recommendation

Authors: Giuseppina Settanni, Antonio Panarese, Raffaele Vaira, Maurizio Galiano

Abstract:

Nowadays, artificial intelligence is used successfully in academia and industry for its ability to learn from large amounts of data. In particular, in recent years the use of machine learning algorithms in the field of e-commerce has spread worldwide. In this research study, a prototype software platform was designed and implemented in order to suggest to users the products most suitable for their needs. The platform includes a chatbot and a recommender system based on artificial intelligence algorithms that provide suggestions and decision support to the customer. Recommendation systems perform the important function of automatically filtering and personalizing information, thus making it possible to manage the information overload to which the user is exposed on a daily basis. Recently, international research has experimented with the use of machine learning technologies with the aim of increasing the potential of traditional recommendation systems. Specifically, support vector machine algorithms have been implemented, combined with natural language processing techniques that allow the user to interact with the system, express their requests and receive suggestions. The interested user can access the web platform on the internet using a computer, tablet or mobile phone, register, provide the necessary information and view the products that the system deems most appropriate for them. The platform also integrates a dashboard that allows the various functions the platform is equipped with to be used in an intuitive and simple way. The artificial intelligence algorithms have been implemented and trained on historical data collected from user browsing. Finally, the testing phase allowed the implemented model to be validated; it will be further tested by letting customers use it.
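
A minimal sketch of the kind of pipeline described above, a support vector machine combined with simple NLP so that a free-text request maps to a product category, is shown below; the toy requests and categories are invented.

```python
# Minimal sketch (toy data, invented product categories) of combining a support vector
# machine with simple NLP so that a free-text request can be mapped to a product category.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

requests = [
    "I need a lightweight laptop for travel",
    "looking for running shoes with good cushioning",
    "a phone with a long battery life",
    "trail shoes for wet weather",
    "ultrabook with 16GB of RAM",
    "smartphone with a good camera",
]
categories = ["laptop", "shoes", "phone", "shoes", "laptop", "phone"]

model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(requests, categories)

print(model.predict(["lightweight ultrabook for travel"]))  # likely 'laptop' on this toy set
```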

Keywords: machine learning, recommender system, software platform, support vector machine

Procedia PDF Downloads 127
1048 A Review on Benzo(a)pyrene Emission Factors from Biomass Combustion

Authors: Franziska Klauser, Manuel Schwabl, Alexander Weissinger, Christoph Schmidl, Walter Haslinger, Anne Kasper-Giebl

Abstract:

Benzo(a)pyrene (BaP) is the most widely investigated representative of polycyclic aromatic hydrocarbons (PAH) as well as one of the most toxic compounds in this group. Since 2013, a limit value for the BaP concentration in ambient air has applied in the European Union, set at a yearly average value of 1 ng m-3. Several reports show that in some regions, even where industry and traffic have only a minor impact, this threshold is regularly exceeded. This is taken as proof that biomass combustion for heating purposes contributes significantly to BaP pollution. Several investigations have already been carried out on the BaP emission behavior of biomass combustion furnaces, mostly focusing on a certain aspect such as the influence of wood type, operation mode or technology type. However, an overarching view of the emission patterns of BaP from biomass combustion, aggregating the values determined in recent studies, has not been presented so far. Combining the determined values allows a better understanding of the BaP emission behavior of biomass combustion. In this work, the review conclusions are drawn from the combination of outcomes of different publications. In two examples, it was shown that technical progress leads to 10- to 100-fold lower BaP emissions from modern furnaces compared to old technologies of an equivalent type. It was also indicated that operation with pellets or wood chips exhibits clearly lower BaP emission factors than operation with log wood, although the BaP emission level of automatic furnaces is strongly impacted by the mode of operation. This work delivers an overview of BaP emission factors from different biomass combustion appliances, from different operation modes and from the combustion of different fuel and wood types. The main impact factors are depicted, and suggestions for low-BaP-emission biomass combustion are derived. As one result, the fields of investigation concerning BaP emissions from biomass combustion that seem most important to clarify are suggested.

Keywords: benzo(a)pyrene, biomass, combustion, emission, pollution

Procedia PDF Downloads 349
1047 C-eXpress: A Web-Based Analysis Platform for Comparative Functional Genomics and Proteomics in Human Cancer Cell Line, NCI-60 as an Example

Authors: Chi-Ching Lee, Po-Jung Huang, Kuo-Yang Huang, Petrus Tang

Abstract:

Background: Recent advances in high-throughput research technologies, such as next-generation sequencing and multi-dimensional liquid chromatography, make it possible to dissect the complete transcriptome and proteome in a single run for the first time. However, it is almost impossible for many laboratories to handle and analyse these "big" data without the support of a bioinformatics team. We aimed to provide a web-based analysis platform for users with only limited bio-computing knowledge to study functional genomics and proteomics. Method: We use NCI-60 as an example dataset to demonstrate the power of the web-based analysis platform and data delivery system: C-eXpress takes a simple text file that contains standard NCBI gene or protein IDs and expression levels (RPKM or fold change) as the input file to generate a distribution map of gene/protein expression levels in a heatmap diagram organized by color gradients. The diagram is hyperlinked to a dynamic HTML table that allows users to filter the dataset based on various gene features. A dynamic summary chart is generated automatically after each filtering step. Results: We implemented an integrated database that contains pre-defined annotations such as gene/protein properties (ID, name, length, MW, pI); pathways based on KEGG and GO biological process; subcellular localization based on GO cellular component; and functional classification based on GO molecular function, kinase, peptidase and transporter. Multiple ways of sorting columns and rows are also provided for comparative analysis and visualization of multiple samples.
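
A minimal sketch of generating the kind of colour-gradient expression heatmap described above is given below; the input file name and layout are assumed for illustration.

```python
# Sketch (hypothetical input file name and layout) of generating a colour-gradient
# expression heatmap from a tab-separated table of gene IDs and RPKM values.
import pandas as pd
import matplotlib.pyplot as plt

# assumed format: first column = NCBI gene ID, remaining columns = expression per cell line
df = pd.read_csv("nci60_expression.tsv", sep="\t", index_col=0)

fig, ax = plt.subplots(figsize=(8, 10))
im = ax.imshow(df.values, aspect="auto", cmap="viridis")   # colour gradient of expression
ax.set_xticks(range(df.shape[1]))
ax.set_xticklabels(df.columns, rotation=90, fontsize=6)
ax.set_ylabel("gene ID")
fig.colorbar(im, ax=ax, label="expression (RPKM)")
fig.tight_layout()
fig.savefig("expression_heatmap.png", dpi=200)
```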

Keywords: cancer, visualization, database, functional annotation

Procedia PDF Downloads 608
1046 Effective Water Purification by Impregnated Carbon Nanotubes

Authors: Raviteja Chintala

Abstract:

Water shortages in many areas of the world have greatly increased the demand for efficient methods of producing drinking water, so the purification of water by cost-effective and efficient methods is a challenging field of research. In this regard, reverse osmosis membrane desalination of both seawater and inland brackish water is currently being deployed in various locations around the world. In the present work, an attempt is made to integrate these existing technologies with a novel method, wherein carbon nanotubes are prepared at the lab scale and replace the activated carbon tubes used traditionally. This has been shown to enhance the efficiency of the water filter, effectively neutralising most of the organic impurities; furthermore, it ensures a reduction in TDS. Carbon nanotubes have a wide range of applications, such as composite reinforcements, field emitters, sensors, energy storage and energy conversion devices, and catalyst support phases, because of their unusual mechanical, electrical, thermal and structural properties. In particular, the large specific surface area, as well as the high chemical and thermal stability, makes carbon nanotubes an attractive adsorbent for eliminating harmful species in wastewater treatment. In this work, the candle soot method has been used for the preparation of carbon nanotubes, which were mixed with activated charcoal in different compositions. The effect of the composition change is monitored using a TDS meter. As the composition of nanocarbon increases, the TDS of the water gradually decreases. In order to extend the lifetime of the carbon filter, the nanotubes are provided with a larger surface area.

Keywords: TDS (Total Dissolved Solids), carbon nanotubes, water, candle soot

Procedia PDF Downloads 330
1045 The Development of Student Core Competencies through the STEM Education Opportunities in Classroom

Authors: Z. Dedovets, M. Rodionov

Abstract:

The goal of the modern education system is to prepare students to adapt to ever-changing life situations. They must be able to acquire the required knowledge independently; apply such knowledge in practice to solve various problems using modern technologies; think critically and creatively; use information competently; be communicative and work in a team; and develop their own moral values, intellect and cultural awareness. As a result, the status of education increases significantly, and new requirements for its quality have been formed. In recent years, the competency-based approach in education has attracted significant interest. This approach strengthens the applied and practical character of school education and leads to the formation of the key student competencies that define success in later life. In this article, the authors focus on a range of key competencies, educational, informational and communicative, and on the possibility of developing such competencies via STEM education. This research shows the change in students' attitudes towards scientific disciplines such as mathematics, general science, technology and engineering as a result of STEM education. A two-stage analysis of questionnaires completed by students of forms II to IV in the Republic of Trinidad and Tobago allowed the authors to categorize students into two levels representing their attitudes towards the various disciplines. The significance of the differences between the selected levels was confirmed with Pearson's chi-squared test. In summary, the analysis of the obtained data makes it possible to conclude that STEM education has great potential for the development of students' core competencies and encourages a positive student attitude towards the above-mentioned scientific disciplines.
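For readers unfamiliar with the statistical test used here, the following is a minimal sketch of Pearson's chi-squared test of independence on a hypothetical 2x2 contingency table (attitude level versus survey stage); the counts are purely illustrative and are not taken from the study.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: rows are attitude levels (lower / higher),
# columns are survey stages (before / after the STEM activities).
observed = [[34, 18],
            [26, 42]]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p_value:.4f}")
```

A p-value below the chosen significance level would indicate that the difference in attitude distribution between the two stages is unlikely to be due to chance, which is the kind of conclusion the authors draw from their questionnaire data.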

Keywords: STEM, science, technology, engineering, mathematics, students’ competency, Pearson's chi-squared test

Procedia PDF Downloads 379
1044 Understanding Cyber Kill Chains: Optimal Allocation of Monitoring Resources Using Cooperative Game Theory

Authors: Roy. H. A. Lindelauf

Abstract:

Cyberattacks are complex processes consisting of multiple interwoven tasks conducted by a set of agents. Interdictions and defenses against such attacks often rely on cyber kill chain (CKC) models. A CKC is a framework that tries to capture the actions taken by a cyber attacker. There exists a growing body of literature on CKCs. Most of this work either a) describes the CKC with respect to one or more specific cyberattacks or b) discusses the tools and technologies used by the attacker at each stage of the CKC. Defenders, facing scarce resources, have to decide where to allocate their resources given the CKC and partial knowledge of the tools and techniques attackers use. In this presentation, CKCs are analyzed through the lens of covert projects, i.e., interrelated tasks that have to be conducted by agents (human and/or computer) with the aim of going undetected. Various aspects of covert project models have been studied extensively in the operations research and game theory domains; think, for instance, of resource-limited interdiction actions that maximally delay the completion time of a weapons project. This presentation investigates both cooperative and non-cooperative game-theoretic covert project models and elucidates their relation to CKC modelling. To view a CKC as a covert project, each step in the CKC is broken down into tasks, and each player is capable of executing a subset of these tasks; task inter-dependencies are represented by a schedule. Using multi-glove cooperative games, it is shown how a defender can optimize the allocation of his scarce resources (what, where and how to monitor) against an attacker scheduling a CKC. This study presents and compares several cooperative game-theoretic solution concepts as metrics for assigning resources to the monitoring of agents.
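To give a concrete flavour of the cooperative solution concepts mentioned above, the following is a minimal sketch of a simple glove game with an exact Shapley value computation; the player names and characteristic function are illustrative assumptions, not the multi-glove CKC model used in the study.

```python
from itertools import permutations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values by averaging marginal contributions
    over all orderings (only feasible for small player sets)."""
    shapley = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = set()
        for p in order:
            before = value(coalition)
            coalition = coalition | {p}
            shapley[p] += value(coalition) - before
    n_orderings = factorial(len(players))
    return {p: v / n_orderings for p, v in shapley.items()}

# Hypothetical glove game: L1 and L2 hold left gloves, R1 a right glove;
# a coalition is worth the number of complete glove pairs it can form.
left, right = {"L1", "L2"}, {"R1"}
value = lambda s: min(len(s & left), len(s & right))

print(shapley_values(["L1", "L2", "R1"], value))
# R1's scarcity gives it the largest share (2/3); L1 and L2 get 1/6 each.
```

In the monitoring context, such values can serve as a metric for how much of the defender's scarce monitoring budget each agent (task executor) warrants, which is the role the solution concepts play in the presentation.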

Keywords: cyber defense, cyber kill chain, game theory, information warfare techniques

Procedia PDF Downloads 131
1043 A QoS Aware Cluster Based Routing Algorithm for Wireless Mesh Network Using LZW Lossless Compression

Authors: J. S. Saini, P. P. K. Sandhu

Abstract:

The multi-hop nature of wireless mesh networks and the rapid growth of throughput demands result in multi-channel and multi-radio structures in mesh networks, but co-channel interference reduces the total throughput, specifically in multi-hop networks. Quality of Service (QoS) refers to a broad collection of networking technologies and techniques that guarantee the ability of a network to provide the desired services with predictable results. QoS can be directed at a network interface, at the performance of a specific server or router, or at specific applications. Due to interference among various transmissions, QoS routing in multi-hop wireless networks is a formidable task, particularly in multi-channel wireless networks, where two transmissions using the same channel may interfere with each other. This paper considers the Destination-Sequenced Distance Vector (DSDV) routing protocol to locate a secure and optimised path. The proposed technique also utilizes Lempel-Ziv-Welch (LZW) based lossless data compression and intra-cluster data aggregation to enhance communication between the source and the destination. Clustering makes it possible to aggregate multiple packets and locate a single route through the clusters, improving intra-cluster data aggregation. LZW-based lossless data compression reduces the data packet size and hence consumes less energy, thus increasing the network QoS. The MATLAB tool has been used to evaluate the effectiveness of the proposed technique. The comparative analysis has shown that the proposed technique outperforms the existing techniques.
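As an illustration of the compression step, the following is a minimal Python sketch of the classic LZW encoder applied to a packet payload; it is a generic textbook implementation under assumed inputs, not the authors' MATLAB code, and packet framing and decompression are omitted.

```python
def lzw_compress(data: bytes) -> list[int]:
    """Classic LZW: emit dictionary codes for the longest known prefixes,
    growing the dictionary as new byte sequences are seen."""
    dictionary = {bytes([i]): i for i in range(256)}  # single-byte seeds
    next_code = 256
    w = b""
    codes = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            codes.append(dictionary[w])
            dictionary[wc] = next_code   # remember the new sequence
            next_code += 1
            w = bytes([byte])
    if w:
        codes.append(dictionary[w])
    return codes

packet = b"TOBEORNOTTOBEORTOBEORNOT"   # hypothetical aggregated payload
print(lzw_compress(packet))            # fewer codes than raw bytes
```

Because repeated sequences across aggregated intra-cluster packets collapse into single codes, fewer bits are transmitted per hop, which is the mechanism by which the paper links compression to lower energy consumption and higher QoS.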

Keywords: WMNs, QoS, flooding, collision avoidance, LZW, congestion control

Procedia PDF Downloads 332