Search results for: time delay neural network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22273


18163 Dynamics of Light Induced Current in 1D Coupled Quantum Dots

Authors: Tokuei Sako

Abstract:

Laser-induced current in a quasi-one-dimensional nanostructure has been studied with a model of a few electrons confined in a 1D electrostatic potential, coupled to electrodes at both ends and subjected to a pulsed laser field. The time propagation of the one- and two-electron wave packets has been calculated by directly integrating the time-dependent Schrödinger equation with a symplectic integrator on a uniform Fourier grid. The temporal behavior of the resulting light-induced current in the studied systems is discussed with respect to the lifetime of the quasi-bound states formed when a static bias voltage is applied.

Keywords: pulsed laser field, nanowire, electron wave packet, quantum dots, time-dependent Schrödinger equation
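The propagation scheme described above can be illustrated with a minimal sketch (not the authors' code): a symplectic "staggered-time" finite-difference integrator in the spirit of Visscher's explicit real/imaginary scheme, applied to a single electron in an assumed harmonic 1D confining potential. Grid size, time step, and the potential are all illustrative assumptions.

```python
import math

# Minimal illustrative sketch (not the authors' code): symplectic
# "staggered-time" propagation of a one-electron wave packet.  Updating
# the real and imaginary parts of psi alternately keeps the integrator
# symplectic and the norm bounded.  Harmonic confinement, grid, and time
# step are assumptions for illustration (atomic units, hbar = m = 1).
N, dx, dt = 200, 0.1, 0.001
x = [(i - N // 2) * dx for i in range(N)]
V = [0.5 * xi * xi for xi in x]                 # stand-in 1D confinement

R = [math.exp(-(xi - 1.0) ** 2) for xi in x]    # displaced Gaussian packet
I = [0.0] * N

def lap(f, i):
    # three-point Laplacian on the uniform grid
    return (f[i - 1] - 2.0 * f[i] + f[i + 1]) / dx ** 2

for _ in range(1000):                           # propagate to t = 1
    # dR/dt = +H I  and  dI/dt = -H R
    for i in range(1, N - 1):
        R[i] += dt * (-0.5 * lap(I, i) + V[i] * I[i])
    for i in range(1, N - 1):
        I[i] -= dt * (-0.5 * lap(R, i) + V[i] * R[i])

norm = sum(r * r + im * im for r, im in zip(R, I)) * dx
print(norm)   # stays near the initial value, sqrt(pi/2) ~ 1.2533
```

The point of the sketch is norm preservation: a plain (non-symplectic) forward-Euler update of the same equations is unconditionally unstable and blows the norm up on this grid.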

Procedia PDF Downloads 357
18162 Research on the Strategy of Old City Reconstruction under Market Orientation: Taking Mutoulong Community in Shenzhen as an Example

Authors: Ziwei Huang

Abstract:

In order to promote inventory (stock) development in Shenzhen, the market-oriented real estate development mode has come to dominate the city's urban renewal activities. Based on role-relationship theory and urban regime theory, this research takes the Mutoulong community as its case study. An in-depth case analysis found that, under the absence and dislocation of the government's role, land property rights disputes and the lack of communication platforms are the main reasons for holdout ("nail") households, market failures, and the long delay in the progress of old city reconstruction. From this analysis of the causes of the reconstruction problems, the upper-level planning, and the interest coordination mechanism, the following optimization strategies for old city reconstruction are proposed: establishing an interest coordination platform, government risk assessment before intervening in the preliminary preparation of the land, adaptive construction of laws and regulations, and a re-examination of the interest relationship between government and market.

Keywords: Shenzhen city, Mutoulong community, urban regeneration, urban regime theory, role relationship theory

Procedia PDF Downloads 96
18161 Earthquake Relocations and Constraints on the Lateral Velocity Variations along the Gulf of Suez, Using the Modified Joint Hypocenter Method Determination

Authors: Abu Bakr Ahmed Shater

Abstract:

Hypocenters of 250 earthquakes recorded by more than 5 stations of the Egyptian seismic network around the Gulf of Suez were relocated, and the station corrections for the P-wave were estimated, using the modified joint hypocenter determination method. Five stations, TR1, SHR, GRB, ZAF and ZET, have negative P-wave travel time corrections (-0.235, -0.366, -0.288, -0.366 and -0.058 s, respectively), suggesting that the underground model beneath them is characterized by a high-velocity structure. The other stations, TR2, RDS, SUZ, HRG and ZNM, have positive corrections (0.024, 0.187, 0.314, 0.645 and 0.145 s, respectively), suggesting a low-velocity structure. The hypocentral locations determined by the modified joint hypocenter method are more precise than those determined by routine processing programs, because the method solves simultaneously for the earthquake locations and the station corrections. The station corrections reflect not only the crustal conditions in the vicinity of the stations, but also the difference between the actual and modeled seismic velocities along each earthquake-station ray path. The corrections obtained correlate with the major surface geological features of the study area. As a result of the relocation, low-velocity areas appear on the northeastern and southwestern sides of the Gulf of Suez, while the southeastern and northwestern parts are high-velocity areas.

Keywords: gulf of Suez, seismicity, relocation of hypocenter, joint hypocenter determination
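The station-correction idea behind the sign pattern discussed above can be sketched as follows (the residual values are invented, not the study's data): each station's P-wave correction is essentially the mean travel-time residual (observed minus predicted) over the jointly relocated events, with negative values flagging fast, high-velocity paths and positive values slow, low-velocity paths.

```python
# Hypothetical sketch of joint station corrections: the correction for a
# station is the mean P-wave travel-time residual over all events.  A
# negative correction flags a fast (high-velocity) path to the station;
# a positive one flags a slow (low-velocity) path.  Values are invented.
residuals_s = {
    "TR1": [-0.21, -0.26, -0.24],   # seconds, per event (illustrative)
    "SUZ": [0.30, 0.33, 0.31],
}
corrections = {sta: sum(r) / len(r) for sta, r in residuals_s.items()}
high_velocity = [s for s, c in corrections.items() if c < 0]
print(corrections, high_velocity)
```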

Procedia PDF Downloads 358
18160 Effect of Water Absorption on the Fatigue Behavior of Glass/Polyester Composite

Authors: Djamel Djeghader, Bachir Redjel

Abstract:

Glass-fiber composite materials can be used to repair damaged elements subjected to repeated stresses in various environments. A cyclic bending characterization of a glass/polyester composite was carried out, taking the period of immersion in water into account. The tests describe the behavior of the material and identify its fatigue characteristics using Wohler curves for different immersion times: 0, 90, 180 and 270 days in water. These curves, which show dispersion in the measured lifetimes, were modeled by straight lines whose intercepts are very similar and comparable to the static strength. The material deteriorates in fatigue at a constant rate, and this rate increases with immersion time in water. The endurance limit appears to be independent of the immersion time in water.

Keywords: fatigue, composite, glass, polyester, immersion, wohler
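The Wohler-curve modeling described above amounts to fitting a straight line in stress versus log10(cycles). The sketch below uses invented (N, sigma) pairs, so the fitted intercept (comparable to a static strength) and slope (deterioration rate per decade of cycles) are purely illustrative.

```python
import math

# Sketch of a Wohler (S-N) line fit: lifetimes are modeled as
# sigma = a - b * log10(N), so the intercept a is comparable to the
# static strength and b is the constant deterioration rate per decade
# of cycles.  The (N, sigma) data pairs are invented.
data = [(1e3, 80.0), (1e4, 65.0), (1e5, 50.0), (1e6, 35.0)]  # cycles, MPa

xs = [math.log10(n) for n, _ in data]
ys = [s for _, s in data]
m = len(data)
xbar, ybar = sum(xs) / m, sum(ys) / m
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar        # estimate of the static strength
print(intercept, slope)
```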

Procedia PDF Downloads 314
18159 Transient Heat Conduction in Nonuniform Hollow Cylinders with Time Dependent Boundary Condition at One Surface

Authors: Sen Yung Lee, Chih Cheng Huang, Te Wen Tu

Abstract:

A solution methodology without integral transformation is proposed to develop analytical solutions for transient heat conduction in nonuniform hollow cylinders with a time-dependent boundary condition at the outer surface. It is shown that if the thermal conductivity and the specific heat of the medium are arbitrary polynomial functions, closed-form solutions of the system can be developed. The influence of the physical properties on the temperature distribution of the system is studied. A numerical example is given to illustrate the efficiency and the accuracy of the solution methodology.

Keywords: analytical solution, nonuniform hollow cylinder, time-dependent boundary condition, transient heat conduction
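As a rough numerical cross-check of the kind of problem solved analytically above (this is a finite-difference sketch, not the paper's solution methodology), one can march the transient radial conduction equation with a polynomial conductivity k(r) and a time-dependent outer-surface temperature. Geometry, properties, and the boundary ramp are all assumptions.

```python
# Finite-difference sketch (not the paper's analytical method): explicit
# marching of transient radial conduction in a hollow cylinder whose
# conductivity k(r) is a polynomial, with a time-dependent temperature
# ramp imposed at the outer surface and a fixed temperature at the inner
# one.  Geometry, properties, and the ramp are illustrative assumptions.
r_in, r_out, M = 0.5, 1.0, 21
dr = (r_out - r_in) / (M - 1)
r = [r_in + i * dr for i in range(M)]
k = [1.0 + 0.5 * ri for ri in r]        # polynomial conductivity k(r)
rho_c = 1.0                             # density * specific heat
dt = 0.2 * dr * dr / max(k)             # well inside explicit stability
T = [0.0] * M

t = 0.0
while t < 1.0:
    T_new = T[:]
    for i in range(1, M - 1):
        # conservative form of (1/r) d/dr( r k(r) dT/dr ), half-point fluxes
        kp = 0.5 * (k[i] + k[i + 1]) * 0.5 * (r[i] + r[i + 1])
        km = 0.5 * (k[i] + k[i - 1]) * 0.5 * (r[i] + r[i - 1])
        div = (kp * (T[i + 1] - T[i]) - km * (T[i] - T[i - 1])) / dr ** 2
        T_new[i] = T[i] + dt * div / (rho_c * r[i])
    T_new[0] = 0.0                      # fixed inner-surface temperature
    T_new[-1] = min(1.0, t)             # time-dependent outer boundary
    T, t = T_new, t + dt

print(T[M // 2])                        # mid-wall temperature at t ~ 1
```

The explicit scheme obeys a maximum principle here, so the computed temperatures stay bracketed by the boundary values, a useful sanity check against any closed-form solution.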

Procedia PDF Downloads 505
18158 Outcome of Bowel Management Program in Patient with Spinal Cord Injury

Authors: Roongtiwa Chobchuen, Angkana Srikhan, Pattra Wattanapan

Abstract:

Background: Neurogenic bowel is a common condition after spinal cord injury. Most spinal cord injured patients have motor weakness and mobility impairment, which lead to constipation; moreover, the neural pathway involved in bowel function is interrupted. Therefore, a bowel management program should be implemented in nursing care as early as possible after the onset of the disease to prevent morbidity and mortality. Objective: To study the outcome of a bowel management program for patients with spinal cord injury admitted for a rehabilitation program. Study design: Descriptive study. Setting: Rehabilitation ward in Srinagarind Hospital. Population: patients with subacute to chronic spinal cord injury admitted to the rehabilitation ward, Srinagarind Hospital, aged over 18 years. Instrument: The neurogenic bowel dysfunction score (NBDS) was used to determine the severity of neurogenic bowel. Procedure and statistical analysis: All participants were asked to complete demographic data: age, gender, duration of disease, and diagnosis. Individual bowel function was assessed with the NBDS at admission. Patients and caregivers were trained by nurses in the bowel management program, which consisted of diet modification, abdominal massage, digital stimulation, and stool evacuation, including medication and physical activity. The outcome of the bowel management program was assessed by NBDS at discharge. The chi-square test was used to detect the difference in severity of neurogenic bowel between admission and discharge. Results: Sixteen spinal cord injured patients were enrolled in the study (age 45 ± 17 years; 69% male). Half of them (50%) had tetraplegia. On admission, 12.5%, 12.5%, 43.75% and 31.25% were categorized as very minor (NBDS 0-6), minor (NBDS 7-9), moderate (NBDS 10-13) and severe (NBDS 14+), respectively.
The severity of neurogenic bowel had decreased significantly at discharge (56.25%, 18.75%, 18.75% and 6.25% for the very minor, minor, moderate and severe groups, respectively; p < 0.001) compared with NBDS at admission. Conclusions: Implementation of an effective bowel program decreases the severity of neurogenic bowel in patients with spinal cord injury.

Keywords: neurogenic bowel, NBDS, spinal cord injury, bowel program
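The NBDS severity bands quoted above can be captured in a small helper (the admission scores below are invented, not the study's data):

```python
# Hypothetical helper mirroring the NBDS severity bands in the abstract:
# very minor (0-6), minor (7-9), moderate (10-13), severe (14+).
def nbds_category(score):
    if score <= 6:
        return "very minor"
    if score <= 9:
        return "minor"
    if score <= 13:
        return "moderate"
    return "severe"

admission_scores = [5, 8, 11, 12, 15, 14, 10, 7]   # invented scores
counts = {}
for s in admission_scores:
    band = nbds_category(s)
    counts[band] = counts.get(band, 0) + 1
print(counts)
```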

Procedia PDF Downloads 243
18157 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality

Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan

Abstract:

Currently, the content entertainment industry is dominated by mobile devices. As the trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimization. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload the work from mobile devices to dedicated rendering servers that are far more powerful, but this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol. 1) In-flight compression: The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round-trip time is directly proportional to the amount of data transmitted, so it can be reduced by compressing the frames before sending. Standard compression algorithms like JPEG achieve only a minor size reduction here. Since the images to be compressed are consecutive camera frames, there will not be many changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but WebGL implementations limit the precision of floating-point numbers to 16 bits on most devices. This can introduce noise into the image due to rounding errors, which add up over time. The problem is solved with an improved inter-frame compression algorithm: it detects changes between frames and reuses unchanged pixels from the previous frame, which eliminates the need for floating-point subtraction and thereby cuts down on noise. Change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference.
The kernel weights for this comparison can be fine-tuned to match the type of image to be compressed. 2) Dynamic load distribution: Conventional cloud computing architectures offload as much work as possible to the servers, but this approach is costly in bandwidth and server resources. The optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computation on the device, depending on the power of the device and on network conditions, and it is responsible for dynamically partitioning the tasks. Special flags communicate the workload fraction between the client and the server and are updated at a constant interval of time (or frames). The whole protocol is designed to be client-agnostic: flags are available to the client for resetting the frame, indicating latency, switching modes, etc. The server can react to client-side changes on the fly and adapt by switching to different pipelines. The server is also designed to spread the load effectively and thereby scale horizontally; this is achieved by isolating client connections into different processes.

Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application
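The weighted-average change detection described in part 1 can be sketched in plain Python (illustrative only; the paper's version runs in WebGL, and the kernel weights and threshold here are assumptions):

```python
# Sketch of kernel-weighted change detection between consecutive frames:
# a pixel is re-sent only when the weighted average difference against
# the previous frame exceeds a threshold; unchanged pixels are reused.
# The kernel weights and threshold are assumed values for illustration.
def weighted_diff(prev, curr, x, y, kernel):
    k = len(kernel) // 2
    total, wsum = 0.0, 0.0
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            xi, yi = x + dx, y + dy
            if 0 <= yi < len(curr) and 0 <= xi < len(curr[0]):
                w = kernel[dy + k][dx + k]
                total += w * abs(curr[yi][xi] - prev[yi][xi])
                wsum += w
    return total / wsum

def changed_pixels(prev, curr, kernel, threshold):
    # emit (x, y, value) only for pixels whose neighborhood changed enough
    out = []
    for y in range(len(curr)):
        for x in range(len(curr[0])):
            if weighted_diff(prev, curr, x, y, kernel) > threshold:
                out.append((x, y, curr[y][x]))
    return out

kernel = [[1, 2, 1], [2, 4, 2], [1, 2, 1]]    # assumed smoothing weights
prev = [[10] * 4 for _ in range(4)]
curr = [row[:] for row in prev]
curr[1][1] = 200                              # one pixel changed
updates = changed_pixels(prev, curr, kernel, threshold=40)
print(updates)
```

With these weights only the genuinely changed pixel clears the threshold, so the frame delta to transmit is a single (x, y, value) triple instead of the full image.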

Procedia PDF Downloads 74
18156 Cybersecurity Strategies for Protecting Oil and Gas Industrial Control Systems

Authors: Gaurav Kumar Sinha

Abstract:

The oil and gas industry is a critical component of the global economy, relying heavily on industrial control systems (ICS) to manage and monitor operations. However, these systems are increasingly becoming targets for cyber-attacks, posing significant risks to operational continuity, safety, and environmental integrity. This paper explores comprehensive cybersecurity strategies for protecting oil and gas industrial control systems. It delves into the unique vulnerabilities of ICS in this sector, including outdated legacy systems, integration with IT networks, and the increased connectivity brought by the Industrial Internet of Things (IIoT). We propose a multi-layered defense approach that includes the implementation of robust network security protocols, regular system updates and patch management, advanced threat detection and response mechanisms, and stringent access control measures. We illustrate the effectiveness of these strategies in mitigating cyber risks and ensuring the resilient and secure operation of oil and gas industrial control systems. The findings underscore the necessity for a proactive and adaptive cybersecurity framework to safeguard critical infrastructure in the face of evolving cyber threats.

Keywords: cybersecurity, industrial control systems, oil and gas, cyber-attacks, network security, IoT, threat detection, system updates, patch management, access control, cybersecurity awareness, critical infrastructure, resilience, cyber threats, legacy systems, IT integration, multi-layered defense, operational continuity, safety, environmental integrity

Procedia PDF Downloads 44
18155 Comparing the Effect of Exercise Time (Morning and Evening) on Troponin T in Males with Cardiovascular Disease

Authors: Amin Mehrabi, Mohsen Salesi, Pourya Pasavand

Abstract:

Context and objective: The purpose of this research is to study the effect of exercise time (morning/evening) on plasma troponin T levels in males with cardiovascular disease. Method: 15 cardiovascular patients were selected as subjects. At 7 a.m., pretest blood samples were taken from the subjects, and they performed the exercise protocol in the presence of a physician. Blood measurements were repeated immediately after exercise and 3 hours later. A week later, the subjects repeated the same steps at 7 p.m. SPSS v.20 software was used to analyze the data. Findings: This study indicated that circadian rhythm has no effect on the response of myocardial tissue to exercise, and that cardiovascular patients may be allowed to exercise at any time of day.

Keywords: cardiovascular disease, time of exercise, troponin T (cTnT), myocarditis

Procedia PDF Downloads 508
18154 Energy-Aware Scheduling in Real-Time Systems: An Analysis of Fair Share Scheduling and Priority-Driven Preemptive Scheduling

Authors: Su Xiaohan, Jin Chicheng, Liu Yijing, Burra Venkata Durga Kumar

Abstract:

Energy-aware scheduling in real-time systems aims to minimize energy consumption, but issues related to resource reservation and timing constraints remain challenging. This study analyzes two scheduling algorithms, Fair-Share Scheduling (FSS) and Priority-Driven Preemptive Scheduling (PDPS), with respect to these issues and to energy-aware scheduling in real-time systems. The analysis of both algorithms shows that Fair-Share Scheduling ensures fair allocation of resources but degrades under an imbalanced system load, while Priority-Driven Preemptive Scheduling prioritizes tasks by criticality to meet timing constraints through preemption but relies heavily on task prioritization and may not be energy efficient. Improvements to both algorithms with energy-aware features are therefore proposed. Future work should focus on developing hybrid scheduling techniques that minimize energy consumption through intelligent task prioritization and resource allocation while meeting timing constraints.

Keywords: energy-aware scheduling, fair-share scheduling, priority-driven preemptive scheduling, real-time systems, optimization, resource reservation, timing constraints
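The preemption behavior of PDPS discussed above can be illustrated with a toy tick-based simulation (task set, priorities, and work units are invented):

```python
import heapq

# Toy sketch of priority-driven preemptive scheduling: at every tick the
# ready task with the highest priority (lowest number) runs, preempting
# lower-priority work the moment it is released.  Tasks are invented.
tasks = [
    # (release_time, priority, name, work_units)
    (0, 2, "logger", 3),
    (1, 1, "sensor", 2),   # higher priority, arrives later -> preempts
]

ready, timeline, t = [], [], 0
pending = sorted(tasks)
while pending or ready:
    while pending and pending[0][0] <= t:
        rel, prio, name, work = pending.pop(0)
        heapq.heappush(ready, (prio, name, work))
    if ready:
        prio, name, work = heapq.heappop(ready)
        timeline.append(name)            # run for one time unit
        if work > 1:
            heapq.heappush(ready, (prio, name, work - 1))
    t += 1

print(timeline)
```

The trace shows the characteristic cost and benefit of PDPS: "sensor" meets its deadline by preempting "logger" at t = 1, while the lower-priority task is pushed later, exactly the criticality-over-fairness trade-off contrasted with fair-share scheduling in the abstract.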

Procedia PDF Downloads 119
18153 New Gas Geothermometers for the Prediction of Subsurface Geothermal Temperatures: An Optimized Application of Artificial Neural Networks and Geochemometric Analysis

Authors: Edgar Santoyo, Daniel Perez-Zarate, Agustin Acevedo, Lorena Diaz-Gonzalez, Mirna Guevara

Abstract:

Four new gas geothermometers have been derived from a multivariate geochemometric analysis of a geothermal fluid chemistry database; two of them use the natural logarithm of the CO₂ and H₂S concentrations (mmol/mol), respectively, and the other two use the natural logarithm of the H₂S/H₂ and CO₂/H₂ ratios. As a strict compilation criterion, the database was created from the gas-phase composition of fluids and bottomhole temperatures (BHTM) measured in producing wells. The calibration of the geothermometers was based on the geochemical relationship existing between the gas-phase composition of well discharges and the equilibrium temperatures measured at bottomhole conditions. Multivariate statistical analysis together with artificial neural networks (ANN) was successfully applied to correlate the gas-phase compositions and the BHTM. The predicted or simulated bottomhole temperatures (BHTANN), defined as output neurons or simulation targets, were statistically compared with the measured temperatures (BHTM). The coefficients of the new geothermometers were obtained from an optimized self-adjusting training algorithm applied to approximately 2,080 ANN architectures with 15,000 simulation iterations each. The self-adjusting training algorithm used the well-known Levenberg-Marquardt model to calculate: (i) the number of neurons in the hidden layer; (ii) the training factor and the training patterns of the ANN; (iii) the linear correlation coefficient, R; (iv) the synaptic weighting coefficients; and (v) the statistical parameter Root Mean Squared Error (RMSE), used to evaluate the prediction performance between the BHTM and the simulated BHTANN. The prediction performance of the new gas geothermometers, together with the predictions of sixteen well-known, previously developed gas geothermometers, was statistically evaluated using an external database to avoid a bias problem.
Statistical evaluation was performed through the analysis of the lowest RMSE values computed among the predictions of all the gas geothermometers. The new gas geothermometers developed in this work have been successfully used for predicting subsurface temperatures in high-temperature geothermal systems of Mexico (e.g., Los Azufres, Mich., Los Humeros, Pue., and Cerro Prieto, B.C.) as well as in a blind geothermal system (known as Acoculco, Puebla). The last results of the gas geothermometers (inferred from gas-phase compositions of soil-gas bubble emissions) compare well with the temperature measured in two wells of the blind geothermal system of Acoculco, Puebla (México). Details of this new development are outlined in the present research work. Acknowledgements: The authors acknowledge the funding received from CeMIE-Geo P09 project (SENER-CONACyT).

Keywords: artificial intelligence, gas geochemistry, geochemometrics, geothermal energy
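The RMSE-based ranking used above to compare geothermometers can be sketched as follows (the temperature values are invented, and the geothermometer names are placeholders, not the paper's models):

```python
import math

# Sketch of the evaluation criterion: competing geothermometers are
# ranked by root-mean-squared error between measured bottomhole
# temperatures (BHT_M) and their predictions.  All values are invented.
bht_measured = [250.0, 280.0, 310.0, 330.0]        # degrees C
predictions = {
    "new_ANN_CO2": [248.0, 283.0, 305.0, 334.0],   # placeholder names
    "classic_H2S": [240.0, 295.0, 290.0, 350.0],
}

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

scores = {name: rmse(p, bht_measured) for name, p in predictions.items()}
best = min(scores, key=scores.get)
print(best, scores[best])
```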

Procedia PDF Downloads 352
18152 Learning to Translate by Learning to Communicate to an Entailment Classifier

Authors: Szymon Rutkowski, Tomasz Korbak

Abstract:

We present a reinforcement-learning-based method of training neural machine translation models without parallel corpora. The standard encoder-decoder approach to machine translation suffers from two problems we aim to address. First, it needs parallel corpora, which are scarce, especially for low-resource languages. Second, its learning procedure lacks psychological plausibility: learning a foreign language is about learning to communicate useful information, not merely learning to transduce from one language's 'encoding' to another. We instead pose the problem of learning to translate as learning a policy in a communication game between two agents: the translator and the classifier. The classifier is trained beforehand on a natural language inference task (determining the entailment relation between a premise and a hypothesis) in the target language. The translator produces a sequence of actions that correspond to generating translations of both the hypothesis and the premise, which are then passed to the classifier. The translator is rewarded for the classifier's performance on determining entailment between the sentences translated into the classifier's native language. The translator's performance thus reflects its ability to communicate useful information to the classifier. In effect, we train a machine translation model without the need for parallel corpora altogether. While similar reinforcement learning formulations for zero-shot translation have been proposed before, there are a number of improvements we introduce. While prior research aimed at grounding the translation task in the physical world by evaluating agents on an image captioning task, we found that using a linguistic task is more sample-efficient. Natural language inference (also known as recognizing textual entailment) captures semantic properties of sentence pairs that are poorly correlated with semantic similarity, thus enforcing a basic understanding of the role played by compositionality.
It has been shown that models trained to recognize textual entailment produce high-quality general-purpose sentence embeddings transferrable to other tasks. We use the Stanford Natural Language Inference (SNLI) dataset as well as analogous datasets for French (XNLI) and Polish (CDSCorpus). Textual entailment corpora can be obtained relatively easily for any language, which makes our approach more extensible to low-resource languages than traditional approaches based on parallel corpora. We evaluated a number of reinforcement learning algorithms (including policy gradients and actor-critic) to solve the problem of the translator's policy optimization and found that our attempts yield some promising improvements over previous approaches to reinforcement-learning-based zero-shot machine translation.

Keywords: agent-based language learning, low-resource translation, natural language inference, neural machine translation, reinforcement learning
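The reward loop described above can be reduced to a toy REINFORCE sketch (heavily simplified assumptions: the "translator" is a three-armed softmax policy over fixed candidate translations, and the "entailment classifier" is a stub that accepts only one of them; a real translator generates token sequences):

```python
import math
import random

# Toy REINFORCE sketch of the translator/classifier game: the translator
# samples a candidate translation and is rewarded when the stand-in
# classifier succeeds.  The policy, classifier, and learning rate are
# all invented for illustration.
random.seed(0)
logits = [0.0, 0.0, 0.0]
lr = 0.5

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def classifier_reward(action):
    # stub classifier: entailment is judged correctly only for action 2
    return 1.0 if action == 2 else 0.0

for _ in range(500):
    probs = softmax(logits)
    a = random.choices(range(3), weights=probs)[0]
    r = classifier_reward(a)
    # REINFORCE update: grad of log pi(a) is onehot(a) - probs
    for i in range(3):
        logits[i] += lr * r * ((1.0 if i == a else 0.0) - probs[i])

print(softmax(logits)[2])   # probability mass on the rewarded translation
```

The policy concentrates on the action the classifier rewards, which is the whole training signal: no parallel text is consulted, only downstream task success.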

Procedia PDF Downloads 128
18151 Fault Tolerant Control System Using a Multiple Time Scale SMC Technique and a Geometric Approach

Authors: Ghodbane Azeddine, Saad Maarouf, Boland Jean-Francois, Thibeault Claude

Abstract:

This paper proposes a new design of an active fault-tolerant flight control system against abrupt actuator faults. The overall system combines a multiple-time-scale sliding mode controller for fault compensation with a geometric approach for fault detection and diagnosis. The proposed control system is able to accommodate several kinds of partial and total actuator failures by using available healthy redundant actuators. The overall system first estimates the correct fault information using the geometric approach; then, based on that estimate, a new reconfigurable control law based on the multiple-time-scale sliding mode technique compensates on-line for the effect of such faults. This approach takes advantage of the significant difference between the time scales of aircraft states with slow dynamics and those with fast dynamics. The closed-loop stability of the overall system is proved using the Lyapunov technique. A case study of the nonlinear model of the F16 fighter, subject to total loss of rudder control, confirms the effectiveness of the proposed approach.

Keywords: actuator faults, fault detection and diagnosis, fault tolerant flight control, sliding mode control, multiple time scale approximation, geometric approach for fault reconstruction, lyapunov stability
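A single-time-scale, scalar illustration of the sliding mode idea used in the reconfigurable law above (the paper's multiple-time-scale aircraft controller is far richer; the gains, surface, and the disturbance standing in for an actuator fault are all assumed):

```python
import math

# Scalar sliding mode sketch: drive s = dx + lam*x to zero with a
# switching term whose gain k dominates an unknown bounded disturbance d
# (here a stand-in for an actuator fault).  All parameters are invented.
lam, k, dt = 2.0, 3.0, 0.001
x, dx = 1.0, 0.0
for step in range(10000):                     # 10 s of simulated time
    t = step * dt
    s = dx + lam * x                          # sliding surface
    d = 0.8 * math.sin(5.0 * t)               # bounded fault, |d| < k
    u = -lam * dx - k * (1.0 if s > 0 else -1.0)   # equivalent + switching
    ddx = u + d                               # double-integrator plant
    x, dx = x + dt * dx, dx + dt * ddx

print(abs(x))   # state is driven near zero despite the unmodeled fault
```

On the surface s = 0 the closed loop reduces to dx = -lam*x, so once s is reached in finite time (guaranteed because k exceeds the disturbance bound), the state decays regardless of the fault, which is the robustness property the paper exploits for fault compensation.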

Procedia PDF Downloads 370
18150 Evaluation of Automated Analyzers of Polycyclic Aromatic Hydrocarbons and Black Carbon in a Coke Oven Plant by Comparison with Analytical Methods

Authors: L. Angiuli, L. Trizio, R. Giua, A. Digilio, M. Tutino, P. Dambruoso, F. Mazzone, C. M. Placentino

Abstract:

In the winter of 2014, a series of measurements was performed to evaluate the behavior of real-time PAH and black carbon analyzers in a coke oven plant located in Taranto, a city in Southern Italy. Data were collected both inside and outside the plant, at air quality monitoring sites, with simultaneous measurements of PM2.5 and PM1. Particle-bound PAHs were measured by two methods: (1) aerosol photoionization using an Ecochem PAS 2000 analyzer, and (2) PM2.5 and PM1 quartz filter collection followed by gas chromatography/mass spectrometry (GC/MS) analysis. Black carbon was determined both in real time by a Magee Aethalometer AE22 analyzer and by a semi-continuous Sunset Lab EC/OC instrument. Detected PM2.5 and PM1 levels were higher inside the plant than outside, while real-time PAH values were higher outside than inside. As regards PAHs, inside the plant the Ecochem PAS 2000 gave concentrations not significantly different from those determined on the filters during low-pollution days, but at increasing concentrations the automated instrument underestimated PAH levels. At the external site, Ecochem PAS 2000 real-time concentrations were steadily higher than those on the filters. Likewise, real-time black carbon values were constantly lower than the EC concentrations obtained by the Sunset EC/OC at the inner site, while outside the plant the real-time values were comparable to the Sunset EC values. The results showed that, in a coke plant, real-time analyzers of PAHs and black carbon in the factory configuration provide only qualitative information, without accuracy, leading to underestimation of the concentrations. A site-specific calibration is needed for these instruments before their installation at highly polluted sites.

Keywords: black carbon, coke oven plant, PAH, PAS, aethalometer

Procedia PDF Downloads 344
18149 Long-Term Structural Behavior of Resilient Materials for Reduction of Floor Impact Sound

Authors: Jung-Yoon Lee, Jongmun Kim, Hyo-Jun Chang, Jung-Min Kim

Abstract:

The tendency to live in apartment houses is increasing in densely populated countries. However, some residents of apartment houses are bothered by noise coming from the units above. To reduce noise pollution, communities increasingly impose bylaws covering limits on floor impact sound, minimum floor thickness, and floor soundproofing solutions. This research focused on the long-term deflection of resilient materials in the floor sound insulation systems of apartment houses. The experimental program consisted of testing nine floor sound insulation specimens subjected to sustained load for 45 days. Two main parameters were considered in the experimental investigation: three types of resilient materials and the magnitude of the load. The test results indicated that the structural behavior of the floor sound insulation systems under long-term load is quite different from that under short-term load. The loading period increased the deflection of the floor sound insulation systems, and the rate of increase of the long-term deflection was smaller for the systems with ethylene vinyl acetate than for the systems with low density ethylene polystyrene.

Keywords: resilient materials, floor sound insulation systems, long-time deflection, sustained load, noise pollution

Procedia PDF Downloads 268
18148 Automatic Identification and Classification of Contaminated Biodegradable Plastics using Machine Learning Algorithms and Hyperspectral Imaging Technology

Authors: Nutcha Taneepanichskul, Helen C. Hailes, Mark Miodownik

Abstract:

Plastic waste has emerged as a critical global environmental challenge, primarily driven by the prevalent use of conventional plastics, derived from petrochemical refining and manufacturing, in modern packaging. While these plastics serve vital functions, their persistence in the environment after disposal poses significant threats to ecosystems. Addressing this issue requires several approaches, one of which is the development of biodegradable plastics designed to degrade under controlled conditions, such as industrial composting facilities. It is imperative to note that compostable plastics are engineered for degradation within specific environments and are not suited to uncontrolled settings, including natural landscapes and aquatic ecosystems. The full benefits of compostable packaging are realized only when it is subjected to industrial composting, preventing environmental contamination and waste stream pollution. Effective sorting technologies are therefore essential to raise composting rates for these materials and to diminish the risk of contaminating recycling streams. In this study, we leverage hyperspectral imaging technology (HSI) coupled with machine learning algorithms to accurately identify various types of plastics, encompassing conventional variants such as polyethylene terephthalate (PET), polypropylene (PP), low-density polyethylene (LDPE), and high-density polyethylene (HDPE), and biodegradable alternatives such as polybutylene adipate terephthalate (PBAT), polylactic acid (PLA), and polyhydroxyalkanoates (PHA). The dataset is partitioned into three subsets: a training dataset comprising uncontaminated conventional and biodegradable plastics, a validation dataset encompassing contaminated plastics of both types, and a testing dataset featuring real-world packaging items in both pristine and contaminated states.
Five distinct machine learning algorithms, namely Partial Least Squares Discriminant Analysis (PLS-DA), Support Vector Machine (SVM), Convolutional Neural Network (CNN), Logistic Regression, and Decision Tree, were developed and evaluated for their classification performance. The Logistic Regression and CNN models exhibited the most promising outcomes, achieving a perfect accuracy rate of 100% on the training and validation datasets; the testing dataset yielded an accuracy exceeding 80%. The successful implementation of this sorting technology within recycling and composting facilities holds the potential to significantly raise recycling and composting rates. As a result, the envisioned circular economy for plastics can be established, thereby offering a viable solution to mitigate plastic pollution.

Keywords: biodegradable plastics, sorting technology, hyperspectral imaging technology, machine learning algorithms
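As a stand-in for the classification pipeline above (invented three-band "spectra", not real hyperspectral data, and a simple nearest-centroid rule instead of the five evaluated algorithms), the train/classify/score structure looks like this:

```python
import math

# Illustrative sorting sketch: a nearest-centroid classifier separates
# conventional from biodegradable samples using invented 3-band
# "spectra".  The real study used PLS-DA, SVM, CNN, logistic regression,
# and decision trees on full hyperspectral data.
train = {
    "conventional": [[0.9, 0.2, 0.1], [0.8, 0.3, 0.2]],   # e.g. PET, PP
    "biodegradable": [[0.2, 0.8, 0.7], [0.3, 0.9, 0.6]],  # e.g. PLA, PBAT
}
centroids = {
    label: [sum(col) / len(col) for col in zip(*samples)]
    for label, samples in train.items()
}

def classify(spectrum):
    # assign the label of the nearest class centroid
    return min(centroids, key=lambda lab: math.dist(spectrum, centroids[lab]))

test_set = [([0.85, 0.25, 0.15], "conventional"),
            ([0.25, 0.85, 0.65], "biodegradable")]
accuracy = sum(classify(s) == y for s, y in test_set) / len(test_set)
print(accuracy)
```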

Procedia PDF Downloads 79
18147 Causal Estimation for the Left-Truncation Adjusted Time-Varying Covariates under the Semiparametric Transformation Models of a Survival Time

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

In biomedical research and randomized clinical trials, the outcomes of greatest interest are time-to-event, or survival, data. Robust models in this context allow comparison of randomly assigned experimental groups in a way that supports causal interpretation. Causal estimation compares the pragmatic effect of treatments conditional on the given covariates, rather than assessing a simple association between response and predictors. Hence, a causal-effect-based semiparametric transformation model was proposed to estimate the treatment effect in the presence of possibly time-varying covariates. Owing to its high flexibility and robustness, the semiparametric transformation model applied in this paper has received considerable attention for causal effect estimation when modeling left-truncated and right-censored survival data. Despite its wide application and popularity for estimating unknown parameters, maximum likelihood estimation is complex and burdensome for the unknown parameters and the unspecified transformation function in the presence of possibly time-varying covariates. To ease this complexity, we proposed modified estimating equations. Following the estimation procedures, the consistency and asymptotic properties of the estimators were derived, and their finite-sample performance was illustrated via simulation studies and the Stanford heart transplant data. In summary, truncation bias was adjusted by estimating the density function of the truncation variable, which was also included in the model as a covariate to relax the assumption that failure time and truncation time are independent. Moreover, an expectation-maximization (EM) algorithm was described for iterative estimation of the unknown parameters and the unspecified transformation function.
In addition, the causal effect was derived as the ratio of the cumulative hazard functions of the active and passive treatment arms, after adjusting for the bias introduced by the truncation variable.
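The final step above, taking the causal effect as a ratio of cumulative hazards, can be illustrated with a toy calculation. The sketch below uses the nonparametric Nelson-Aalen estimator on hypothetical right-censored data; the paper's actual estimator is semiparametric and adjusts for truncation, so this shows only the ratio idea, not the authors' method.

```python
# Minimal sketch: causal effect as the ratio of cumulative hazards,
# illustrated with nonparametric Nelson-Aalen estimates on toy data.
# The data and the use of Nelson-Aalen here are illustrative assumptions.

def nelson_aalen(times, events):
    """Cumulative hazard H(t) at each event time.

    times  : observed times (event or censoring)
    events : 1 if event occurred, 0 if right-censored
    Returns a list of (t, H(t)) pairs at event times.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk = len(times)
    H, out = 0.0, []
    for i in order:
        if events[i] == 1:
            H += 1.0 / n_at_risk      # d_i / n_i with one event per row
            out.append((times[i], H))
        n_at_risk -= 1                # censored or failed: leaves risk set
    return out

# Toy data: active (treated) vs passive (control) arms
treated = nelson_aalen([2, 4, 5, 7, 9], [1, 1, 0, 1, 1])
control = nelson_aalen([1, 2, 3, 5, 6], [1, 1, 1, 1, 0])

# Causal effect at end of follow-up: ratio of cumulative hazards
effect = treated[-1][1] / control[-1][1]
```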

Keywords: causal estimation, EM algorithm, semiparametric transformation models, time-to-event outcomes, time-varying covariate

Procedia PDF Downloads 125
18146 Preparation of Tempeh Spores Powder

Authors: Jaruwan Chutrtong, Tanakwan Bussabun

Abstract:

This study compared the production of powdered tempeh inoculum by freeze-drying with drying at 50°C and sun-drying, with the aim of developing an efficient inoculum for tempeh production. Rhizopus oligosporus was incubated on PDA slants at 30°C for 3-5 days until spores and mycelium formed. A spore suspension was prepared with sterilized water and the initial spore count determined. The suspension was added to rice flour or soy flour mixed with water (ratio 10:7) that had been steamed and sterilized at 121°C for 15 min. After incubation at room temperature for 4 days, the spores were counted again. The fully colonized, sporulating dough was then dried at 50°C, sun-dried, or lyophilized, ground to powder, packed in plastic bags, and stored at 5°C. To assess the quality of inoculum prepared by the different methods, tempeh was fermented every 4 weeks over the 24 weeks of the experiment. The results showed that rice flour is not suitable as a raw material for spore powder production: the fungus grew poorly, produced fewer spores, and required more time than on soy flour. Among the drying methods, lyophilization took the least time, but the resulting samples were very hard, very dark, and harder to grind than those from the other methods. Drying at 50°C took longer than lyophilization but allowed a fixed drying time; the dried samples were hard, brown solids that were easier to grind. Sun-drying took the longest, and its duration could not be fixed. When the spore powder was used to ferment tempeh immediately, the product had characteristics similar to tempeh made with freshly prepared spores, and its quality was normal. With spore powder stored at low temperature for 4, 8, or 12 weeks, the tempeh remained normal and production time was close to that with fresh spores. After 16 and 20 weeks of storage, the tempeh was still normal, but growth and sporulation took longer than usual (about 6 hours more).
After 24 weeks of storage, fungal growth was poor, and the tempeh was inferior to normal in color, smell, and texture.

Keywords: freeze drying, preparation, spores powder, tempeh

Procedia PDF Downloads 202
18145 Network Analysis of Genes Involved in the Biosynthesis of Medicinally Important Naphthodianthrone Derivatives of Hypericum perforatum

Authors: Nafiseh Noormohammadi, Ahmad Sobhani Najafabadi

Abstract:

Hypericins (hypericin and pseudohypericin) are natural naphthodianthrone derivatives produced by Hypericum perforatum (St. John’s Wort) with many medicinal properties, including antitumor, antineoplastic, antiviral, and antidepressant activities. Production and accumulation of hypericin in the plant are influenced by both genetic and environmental conditions. Despite the existence of various high-throughput data on the plant, the genetic dimensions of hypericin biosynthesis are not yet completely understood. In this research, 21 high-quality RNA-seq datasets from different parts of the plant were integrated with metabolic data to reconstruct a co-expression network. Results showed that a cluster of 30 transcripts was correlated with total hypericin. The identified transcripts fell into three main functional groups: hypericin biosynthesis genes, transporters and detoxification genes, and transcription factors (TFs). In the biosynthetic group, different isoforms of polyketide synthases (PKSs) and phenolic oxidative coupling proteins (POCPs) were identified. Phylogenetic analysis of protein sequences, integrated with gene expression analysis, suggested that some of the POCPs are particularly important in the hypericin biosynthetic pathway. In the TF group, six TFs were correlated with total hypericin, and qPCR analysis confirmed that three of them were highly correlated. The genes identified in this research are a rich resource for further studies on the molecular breeding of H. perforatum to obtain varieties with high hypericin production.
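The co-expression screen described above boils down to correlating each transcript's expression profile with total hypericin across samples. A minimal sketch with hypothetical expression profiles and an illustrative cutoff of r > 0.9 (the study's actual pipeline and threshold are not stated in the abstract):

```python
# Hedged sketch of a correlation-based co-expression screen.
# Gene names, profiles, and the 0.9 cutoff are hypothetical.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

hypericin = [0.5, 1.1, 2.0, 3.2, 4.1]       # metabolite level per sample
expression = {
    "PKS2":  [0.4, 1.0, 2.1, 3.0, 4.3],     # tracks hypericin
    "POCP1": [0.6, 1.3, 1.9, 3.4, 3.9],     # tracks hypericin
    "ACT1":  [2.0, 2.1, 1.9, 2.0, 2.1],     # housekeeping, flat
}

# Keep transcripts whose profile correlates strongly with hypericin
correlated = {g for g, prof in expression.items()
              if pearson(prof, hypericin) > 0.9}
```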

Keywords: hypericin, St. John’s Wort, data mining, transcription factors, secondary metabolites

Procedia PDF Downloads 93
18144 Multistage Data Envelopment Analysis Model for Malmquist Productivity Index Using Grey's System Theory to Evaluate Performance of Electric Power Supply Chain in Iran

Authors: Mesbaholdin Salami, Farzad Movahedi Sobhani, Mohammad Sadegh Ghazizadeh

Abstract:

Evaluation of organizational performance is among the most important measures that help organizations and entities continuously improve their efficiency. Organizations can use existing data and the results of comparisons among the units under investigation to estimate their performance. The Malmquist Productivity Index (MPI) is an important index in the evaluation of overall productivity, as it considers technological development and technical efficiency change at the same time. This article proposes a model based on a multistage MPI that accommodates limited data using Grey's system theory. The model can evaluate the performance of units in a multistage process using limited and uncertain data. It was applied by the electricity market manager to Iran's electric power supply chain (EPSC), which involves uncertain data, to evaluate the performance of its actors. Results from solving the model showed that Grey's system theory improved the accuracy of estimates of the future performance of the units under investigation. The model can be used in any case study where the MPI is applied and the data are limited or uncertain.
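For reference, the standard two-period Malmquist decomposition that underlies the model can be computed directly once the four distance-function (DEA efficiency) scores are available. The sketch below uses hypothetical scores; the paper's multistage, Grey-theoretic extension is not reproduced here.

```python
# Standard Malmquist Productivity Index from two-period DEA scores.
# The four input scores below are hypothetical.
import math

def malmquist(d11, d12, d21, d22):
    """MPI and its decomposition from four distance-function scores.

    d11 = D^t(x_t, y_t)        d12 = D^t(x_{t+1}, y_{t+1})
    d21 = D^{t+1}(x_t, y_t)    d22 = D^{t+1}(x_{t+1}, y_{t+1})
    Returns (MPI, efficiency_change, technical_change).
    """
    ec = d22 / d11                              # catching-up effect
    tc = math.sqrt((d12 / d22) * (d11 / d21))   # frontier shift
    return ec * tc, ec, tc

# Hypothetical DEA efficiency scores for one unit in two periods
mpi, ec, tc = malmquist(d11=0.80, d12=0.95, d21=0.70, d22=0.90)
# mpi > 1 indicates productivity growth between the two periods
```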

Keywords: Malmquist Index, Grey's Theory, CCR Model, network data envelopment analysis, Iran electricity power chain

Procedia PDF Downloads 165
18143 Navigating Neural Pathways to Success with Students on the Autism Spectrum

Authors: Panda Krouse

Abstract:

This work is a marriage of the science of Applied Behavior Analysis and an educator’s look at neuroscience. The focus is on integrating what we know about the anatomy of the brain in autism with evidence-based practices in education. It is a bold attempt to present links between neurological research and the application of evidence-based practices in education; in researching this work, we found no prior articles making these connections. Areas of structural difference in the brain are aligned with evidence-based strategies, and a brief literature review identifies how these areas affect overt behavior, which is what educators can see and measure. Justifying and validating educational practices from a second scientific field is significant for the continued improvement of interventions for students on the autism spectrum.

Keywords: autism, evidence based practices, neurological differences, education intervention

Procedia PDF Downloads 67
18142 Unconventional Hydrocarbon Management Strategy

Authors: Edi Artono, Budi Tamtomo, Gema Wahyudi Purnama

Abstract:

World energy demand, including domestic demand, is rising sharply. This is unavoidable, since a country's energy demand grows with its population, economic growth, and expanding industrial sector. Domestic conventional oil and gas reserves are depleting naturally, and production from existing reservoirs declines over time, while new discoveries have not been significant enough to replace them. Many researchers are therefore investigating new alternative energy sources to answer this challenge, and unconventional hydrocarbons offer one solution to the problem of fossil energy needs. Four aspects must be considered as a management framework for an unconventional hydrocarbon business to operate properly: 1. legal, 2. environmental, 3. technical, and 4. economic. As in the conventional oil and gas business, the economic aspect is the key determinant of whether a project can be implemented. Regulatory support is needed to help the unconventional hydrocarbon business grow and deliver benefits to the government.

Keywords: alternative energy, unconventional hydrocarbon, regulation support, management strategy

Procedia PDF Downloads 350
18141 Poisoning in Morocco: Evolution and Risk Factors

Authors: El Khaddam Safaa, Soulaymani Abdelmajid, Mokhtari Abdelghani, Ouammi Lahcen, Rachida Soulaymani-Beincheikh

Abstract:

Poisonings are a public health problem worldwide and in Morocco, and their exact dimensions remain poorly recorded, given the lack of exhaustive statistical data. The objective of this retrospective study was to analyze a series of poisoning cases reported in the Tadla-Azilal region and collected by the Moroccan Poison Control and Pharmacovigilance Center, in order to establish an epidemiological profile of poisonings, determine the risk factors influencing the vital prognosis of the poisoned, and follow trends in incidence, lethality, and mortality. During the study period, we collected and analyzed 9,303 cases of poisoning by various toxic products, excluding scorpion stings. These poisonings led to 99 deaths. The epidemiological profile showed that victims were of all ages, with a mean of 24.62±16.61 years; the sex ratio (female/male) was 1.36 in favor of women, a highly significant difference (χ2=210.5; p<0.001). Most poisoned patients were of urban origin (60.5%) (χ2=210.5; p<0.001). Carbon monoxide was the most frequently implicated agent (24.15% of cases), followed by pesticides and agricultural products (21.44%) and food (19.95%). Analysis of the risk factors showed that adults aged 20 to 74 years had a higher risk of death (RR=1.57; 95% CI=1.03-2.38) than the other age groups, and men were at greater risk of death than women (RR=1.59; 95% CI=1.07-2.38). Patients of rural origin had nearly five times the risk (RR=4.71; 95% CI=2.54-8.74). Poisoning by mineral products carried the highest risk of death (RR=23.19; 95% CI=2.39-224.1), and poisoning by pesticides carried a roughly ninefold risk (RR=9.31; 95% CI=6.10-14.18).
The incidence was 3.3 cases per 10,000 inhabitants, and the mortality was 0.004 cases per 1,000 inhabitants (i.e., 4 cases per 1,000,000 inhabitants). The annual lethality rate was 10.6%. Over the study years, the reporting rate, as measured by incidence, increased significantly. We also noted an improvement in case management, which led to a decrease in lethality and mortality in recent years. The fight against poisoning is long-term work requiring effort at many levels, and it is necessary to address the delay our country has accumulated on the legal, institutional, and technical fronts. The ideal solution is to develop and implement a national strategy.
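The risk ratios and 95% confidence intervals quoted above follow the standard log-RR construction for a 2x2 table. A sketch with hypothetical counts, since the study's raw tables are not given in the abstract:

```python
# Relative risk and 95% CI from a 2x2 table (standard log-RR method).
# The counts below are hypothetical, chosen only for illustration.
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR and its 95% confidence interval.

    a: exposed, died      b: exposed, survived
    c: unexposed, died    d: unexposed, survived
    """
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: deaths among rural vs urban poisoned patients
rr, lo, hi = relative_risk(a=60, b=3600, c=39, d=5604)
# lo > 1 means the excess risk is statistically significant at 5%
```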

Keywords: epidemiology, poisoning, risk factors, indicators of health, Tadla-Azilal

Procedia PDF Downloads 365
18140 FACTS Based Stabilization for Smart Grid Applications

Authors: Adel. M. Sharaf, Foad H. Gandoman

Abstract:

Nowadays, photovoltaic (PV) farms and large PV-smart grid interface schemes are emerging and widely used in renewable distributed generation. However, PV hybrid DC-AC schemes using interfacing power electronic converters usually have a negative impact on power quality and on the stabilization of modern electrical networks under load excursions and network fault conditions in the smart grid. Consequently, robust FACTS-based interface schemes are required to ensure efficient energy utilization, stabilize bus voltages, and limit switching and fault inrush currents. FACTS devices are also used in smart grid battery interface and storage schemes with PV-battery storage hybrid systems, an elegant way to combine renewable energy utilization with backup battery storage for utility energy and demand-side management, providing the needed energy and power capacity under heavy load conditions. The paper presents a robust PV-Li-ion battery storage interface scheme for low-voltage distribution/utilization, using FACTS stabilization enhancement and dynamic maximum PV power tracking controllers. Digital simulation and validation of the proposed scheme are carried out in the MATLAB/Simulink environment for a low-voltage distribution/utilization system feeding hybrid linear, motorized-inrush, and nonlinear loads from a DC-AC interface six-pulse VSC inverter fed from the PV farm with a backup Li-ion storage battery.
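The abstract mentions dynamic maximum PV power tracking but does not name the algorithm; one common choice is perturb-and-observe (P&O). A minimal sketch on a toy power-voltage curve, as an assumption rather than the authors' controller:

```python
# Hedged sketch of a perturb-and-observe (P&O) maximum power point
# tracker. The quadratic P(V) curve is a toy stand-in for a real PV
# array model; the paper's actual controller may differ.

def pv_power(v):
    """Toy PV power curve (watts) with its maximum at v = 30 V."""
    return max(0.0, 120.0 - 0.3 * (v - 30.0) ** 2)

def track_mpp(v0=20.0, step=0.5, iters=100):
    """Perturb the operating voltage; reverse direction when power drops."""
    v, p_prev, direction = v0, pv_power(v0), 1.0
    for _ in range(iters):
        v += direction * step        # perturb
        p = pv_power(v)              # observe
        if p < p_prev:               # power fell: step the other way
            direction = -direction
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = track_mpp()
# v_mpp oscillates near 30 V; p_mpp stays near the 120 W maximum
```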

Keywords: AC FACTS, smart grid, stabilization, PV-battery storage, Switched Filter-Compensation (SFC)

Procedia PDF Downloads 412
18139 The Role of Tax Management Components in Creating Value or Increasing Risk of Tehran Stock Exchange Firms

Authors: Fereshteh Darash

Abstract:

Reflective tax management corresponds to agency theory, since it determines managers' motivation for tax management actions and their short-term and long-term consequences. The selection of a tax strategy therefore shapes the firm's future tax and financial position. The aim of the present research is to evaluate the effect of tax management components on the risk-taking of firms listed on the Tehran Stock Exchange, using regression analysis. Results show that the effective tax rate, tax risk, and tax planning have no significant effect on the firm's future risk. The results suggest that stakeholders assess the effective tax rate and delays in tax payment in line with their own benefits: they accept the higher risk cost in exchange for reduced tax payments and the benefits of higher liquidity in the current period. Hence, the effective tax rate and tax risk have no significant effect on the future risk of the firm. Moreover, tax planning yields no information about the predictability of future profits, and as a result it has no significant effect on the firm's future risk, since stakeholders prioritize the specific goals of financial reporting, make investment decisions regardless of the firm's data analysis, and are less inclined to purchase stocks in a rational manner.

Keywords: tax management, tax effective rate, tax risk, tax planning, firm risk

Procedia PDF Downloads 136
18138 Improving Fingerprinting-Based Localization System Using Generative AI

Authors: Getaneh Berie Tarekegn, Li-Chia Tai

Abstract:

With the rapid advancement of artificial intelligence, low-power built-in sensors on Internet of Things devices, and communication technologies, location-aware services have become increasingly popular and have permeated every aspect of people’s lives. Global navigation satellite systems (GNSSs) are the default method of providing continuous positioning services for ground and aerial vehicles, as well as consumer devices (smartphones, watches, notepads, etc.). However, the environment degrades satellite positioning, particularly indoors, in dense urban and suburban areas enclosed by skyscrapers, or where deep shadows obscure satellite signals. This is because (1) indoor environments are complicated by the many surrounding objects; (2) reflection within a building depends strongly on the surrounding environment, including the positions of objects and human activity; and (3) satellite signals cannot reach indoor environments, as GNSS signals lack the power to penetrate building walls. GPS is also highly power-hungry, which poses a severe challenge for battery-powered IoT devices. These challenges limit IoT applications. Consequently, precise, seamless, and ubiquitous positioning, navigation, and timing (PNT) systems are crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities, including traffic monitoring, emergency alarms, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization.
We also employ a reliable signal fingerprint feature extraction method based on t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the site-survey workload required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to the numerical results, the proposed scheme improves positioning performance and significantly reduces radio map construction costs compared to traditional methods.
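As background, fingerprinting localization ultimately matches a measured signal vector against the radio map that methods like the S-DCGAN construct. A minimal weighted k-nearest-neighbor matching step on a toy radio map; the paper's deep-learning pipeline is not reproduced here, and all values below are hypothetical.

```python
# Hedged sketch of the fingerprint-matching step in RSSI localization:
# weighted k-nearest neighbors in signal space over a toy radio map.
import math

# Toy radio map: location (x, y) -> RSSI fingerprint (dBm) from 3 APs
radio_map = {
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-70, -40, -80],
    (0.0, 5.0): [-70, -80, -40],
    (5.0, 5.0): [-80, -70, -45],
}

def wknn_locate(rss, k=2):
    """Weighted k-NN position estimate from a measured RSSI vector."""
    dists = sorted(
        (math.dist(rss, fp), loc) for loc, fp in radio_map.items()
    )[:k]
    weights = [1.0 / (d + 1e-9) for d, _ in dists]  # closer = heavier
    wsum = sum(weights)
    x = sum(w * loc[0] for w, (_, loc) in zip(weights, dists)) / wsum
    y = sum(w * loc[1] for w, (_, loc) in zip(weights, dists)) / wsum
    return x, y

# A measurement taken near (0, 0) should resolve close to that corner
est = wknn_locate([-42, -71, -79])
```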

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 42
18137 Spatial Cognition and 3-Dimensional Vertical Urban Design Guidelines

Authors: Hee Sun (Sunny) Choi, Gerhard Bruyns, Wang Zhang, Sky Cheng, Saijal Sharma

Abstract:

The main focus of this paper is to propose a comprehensive framework for the cognitive measurement and modelling of the built environment, which will involve exploring and measuring neural mechanisms. The aim is to create a consistent and rigorous foundation for further studies in this field, and to facilitate collaboration with cognitive neuroscientists by establishing a shared conceptual basis. The goal of this research is to develop a scientific, measurable, human-centric approach to urban design, producing a set of urban design guidelines that incorporate cognitive measurement and modelling. In doing so, the broader intention is to design urban spaces that prioritize human needs and well-being, making them more liveable.

Keywords: vertical urbanism, human centric design, spatial cognition and psychology, vertical urban design guidelines

Procedia PDF Downloads 83
18136 Assessment of Susceptibility of the Poultry Red Mite, Dermanyssus gallinae (Acari: Dermanyssidae) to Some Plant Preparations with Focus on Exposure Time

Authors: Shahrokh Ranjbar-Bahadori, Nima Farhadifar, Leila Mohammadyar

Abstract:

Plant preparations from thyme and garlic have been shown to be effective acaricides against the poultry red mite, Dermanyssus gallinae. In a layer house with a history of D. gallinae infestation, mites were first detected in the monitoring traps and counted. Rows of the layer house were then sprayed twice with thyme essential oil at a concentration of 0.21 mg/cm2 or garlic juice at 0.07 mg/cm2, while a similar row served as an untreated control. Cardboard red-mite traps, always removed after 24 h, were used to assess mite density on days 1 and 7 after treatment; the collected mites were counted, and the efficacy against all mite stages (larvae, nymphs, and adults) was calculated. Results showed that on days 1 and 7 after administration of the garlic extract, the efficacy was 92.05% and 74.62%, respectively, while the efficacy on days 1 and 7 with thyme essential oil was 89.4% and 95.37%. It is concluded that garlic juice is more effective against D. gallinae in the short term, whereas thyme essential oil has a longer-lasting effect than the garlic preparation.
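The abstract does not state which efficacy formula was used; for trials with pre- and post-treatment counts and an untreated control, a standard choice is Henderson-Tilton. A sketch with hypothetical trap counts:

```python
# Henderson-Tilton percent efficacy, corrected for the untreated control.
# This formula is an assumption; the abstract does not name its method.
# The trap counts below are hypothetical.

def henderson_tilton(t_before, t_after, c_before, c_after):
    """Percent efficacy of a treatment relative to an untreated control.

    t_* : mite counts in the treated row before/after treatment
    c_* : mite counts in the control row before/after treatment
    """
    return 100.0 * (1.0 - (t_after * c_before) / (t_before * c_after))

# Hypothetical counts: treatment knocks mites down while the control grows
eff = henderson_tilton(t_before=500, t_after=45, c_before=480, c_after=510)
```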

Keywords: Dermanyssus gallinae, essential oil, garlic, thyme, efficacy

Procedia PDF Downloads 435
18135 A Hybrid Traffic Model for Smoothing Traffic Near Merges

Authors: Shiri Elisheva Decktor, Sharon Hornstein

Abstract:

Highway merges and unmarked junctions are key components of any urban road network that can act as bottlenecks and create traffic disruption. Inefficient highway merges may trigger traffic instabilities such as stop-and-go waves, pose safety hazards, and lead to longer journey times. These phenomena occur spontaneously if the average vehicle density exceeds a certain critical value. This study focuses on modeling such traffic with a microscopic traffic flow model, assuming a hybrid traffic mix of human-driven and controlled vehicles. The controlled vehicles obey different driving policies when approaching the merge or in the vicinity of other vehicles. We developed a co-simulation model in SUMO (Simulation of Urban Mobility) in which the human-driven cars are modeled with the Intelligent Driver Model (IDM) and the controlled cars use a dedicated controller. The scenario chosen for this study is a closed track with one merge and one exit, which could later be implemented on a scaled infrastructure in our lab setup; this will enable us to benchmark the simulation results of this study against comparable results under similar conditions in the lab. The metrics chosen for evaluating the algorithm's effect on overall traffic conditions include average speed, wait time near the merge, and throughput after the merge, measured under low, medium, and heavy travel demand.
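The human-driven cars follow the IDM car-following law, whose acceleration term is straightforward to state. The sketch below uses typical textbook parameter values, not the calibration used in the study's SUMO setup:

```python
# Intelligent Driver Model (IDM) acceleration law (Treiber et al.).
# Parameter values are common defaults, not the paper's calibration.
import math

def idm_acceleration(v, gap, dv,
                     v0=33.3, T=1.5, a_max=1.0, b=1.5, s0=2.0, delta=4):
    """IDM acceleration for one follower.

    v    : own speed (m/s)
    gap  : bumper-to-bumper distance to the leader (m)
    dv   : approach rate, v - v_leader (m/s)
    """
    # Desired dynamic gap: jam distance + time headway + braking term
    s_star = s0 + max(0.0, v * T + v * dv / (2 * math.sqrt(a_max * b)))
    return a_max * (1 - (v / v0) ** delta - (s_star / gap) ** 2)

# Free road (huge gap): accelerates toward the desired speed v0
a_free = idm_acceleration(v=20.0, gap=1e6, dv=0.0)
# Closing fast on a slow leader: strong braking
a_brake = idm_acceleration(v=20.0, gap=10.0, dv=10.0)
```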

Keywords: highway merges, traffic modeling, SUMO, driving policy

Procedia PDF Downloads 106
18134 Campaigns of Youth Empowerment and Unemployment In Development Discourses: In the Case of Ethiopia

Authors: Fentie, Belay, Mulat

Abstract:

Amid a sharp downturn in the global economy, nations face many economic, social, and political challenges, including widespread food and livelihood insecurity. Moreover, as a result of conflict, natural disasters, and failures of leadership, youths are disempowered and unemployed, especially in developing countries. To handle these challenges well, it is important to investigate and deliberate on youth unemployment, empowerment, and possible approaches to managing them, since youths have the potential to carry and fight such battles. The method adopted is a qualitative analysis of secondary data sources on youth empowerment, unemployment, and development as an inclusive framework. Youth unemployment is a major development headache for most African countries. In Ethiopia, weak youth empowerment has driven unemployment up over time, and access to quality education and organizational linkages remain important constraints. Although access to quality education is a key constraint for Ethiopian youths, the country's youths have also been deceptively mobilized and harassed in vicious political struggles as they seek social and economic change. Thousands of youths have been rendered inactive, criminalized, or killed, leaving them hopeless and angry and further exposing them to addiction, prostitution, violence, and irregular migration. This challenge is not confined to African countries; it is a global burden and has been taken up as a global agenda. As a resolution, building a healthy education system can create independent youths who achieve success and accelerate development.
Developing countries should cultivate empowerment tools through long- and short-term education, implement policy in action, narrow wide-ranging gaps of religion, ethnicity, and region, and treat their large youth populations as an opportunity to be empowered. Further, involving youths in decision-making, giving them political weight, and building networks of organizations that ease access to job opportunities are important steps to keep youths in work, raising both their incomes and the country's food security.

Keywords: development, Ethiopia, management, unemployment, youth empowerment

Procedia PDF Downloads 59