Search results for: continuous data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26753

22703 High Temperature Tolerance of Chironomus Sulfurosus and Its Molecular Mechanisms

Authors: Tettey Afi Pamela, Sotaro Fujii, Hidetoshi Saito, Kawaii Koichiro

Abstract:

Introduction: Organisms employ adaptive mechanisms when faced with any stressor or risk of being wiped out. This has made it possible for them to survive in harsh environmental conditions such as increasing temperature, low pH, and anoxia. Some of the mechanisms they utilize include the expression of heat shock proteins, synthesis of cryoprotectants, and anhydrobiosis. Heat shock proteins (HSPs) have been widely studied to determine their involvement in stress tolerance among various organisms, and chironomid species are no exception. We examined the survival and expression of genes encoding five (5) heat shock proteins (HSP70, HSP67, HSP60, HSP27, and HSP23) from Chironomus sulfurosus larvae reared from the 1st instar at 25°C, 30°C, 35°C, and 40°C. Results: The highest survival rate was recorded at 30°C, followed by 25°C, then 35°C. Only a small percentage of C. sulfurosus survived at 40°C (14.5%). With regard to HSP expression, some HSPs responded to the increase in temperature. The relative expression levels were lowest at 30°C for HSP70, HSP60, HSP27, and HSP23. At 25°C and 40°C, HSP70, HSP67, HSP60, HSP27, and HSP23 had the highest expression, whereas at 35°C all had the lowest expression. Discussion: The expression of heat shock proteins varies from one species to another. We designated the HSP70, HSP67, HSP60, HSP27, and HSP23 genes based on transcriptome analysis of C. sulfurosus. Our study can be regarded as a long heat-shock study, as C. sulfurosus was reared from the first instar to the fourth instar, and this might have led to a continuous induction of HSPs at 25°C. The 40°C treatment had the lowest survival but the highest HSP expression, as C. sulfurosus larvae had to rely on HSPs for survival. These results and future high-throughput studies at both the transcriptome and proteome level will improve the information needed to predict the future geographic distribution of these species within the context of global warming.

Keywords: chironomid, heat shock proteins, high temperature, heat shock protein expression

Procedia PDF Downloads 95
22702 Investigating the Relationship between Growth, Beta and Liquidity

Authors: Zahra Amirhosseini, Mahtab Nameni

Abstract:

The aim of this study was to investigate the relationship between growth, beta, and a company's cash holdings. Cash holdings serve as the dependent variable, with growth opportunity and beta as the independent variables. The study is based on an analysis of panel data. The population comprises the companies listed on the Tehran Stock Exchange; financial data of 215 companies covering the period 2010 to 2015 were selected as the sample through systematic sampling. The results for the first hypothesis showed a significant relationship between growth opportunities and cash holdings. According to the analysis of the second hypothesis, there is an inverse relationship between company risk and cash holdings.

Keywords: growth, beta, liquidity, company

Procedia PDF Downloads 395
22701 Social Media Mining with R: Twitter Analyses

Authors: Diana Codat

Abstract:

Tweet analysis is a branch of text mining. Each document is a written text, so the usual text mining techniques can be applied, in particular by switching to the bag-of-words representation. However, tweets have peculiarities. Some may enrich the analysis: their length is calibrated (at least as far as public messages are concerned), special characters make it possible to identify authors (@) and themes (#), and the tweet and retweet mechanisms make it possible to follow the diffusion of information. Conversely, other characteristics may disrupt the analyses. Because space is limited, authors often use abbreviations and emoticons to express feelings, and they do not pay much attention to spelling. All this creates noise that can complicate the task. Tweets nevertheless carry a lot of potentially interesting information, and their exploitation is one of the main axes of the analysis of social networks. We show how to access Twitter-related messages. We first study the properties of the tweets and then move on to the exploitation of the content of the messages. We work in R with the 'twitteR' package. The study of tweets is a strong focus of social network analysis because Twitter has become an important vector of communication. This example shows that it is easy to initiate an analysis from data extracted directly online. The data preparation phase is of great importance.
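
The abstract works in R with the 'twitteR' package; as a language-neutral illustration of the preprocessing it describes (bag-of-words plus extraction of @ authors and # themes), the following Python sketch uses only the standard library. The sample tweets are invented for illustration.

```python
import re
from collections import Counter

def preprocess_tweet(text):
    """Split a raw tweet into mentions (@), hashtags (#) and plain tokens."""
    mentions = re.findall(r"@\w+", text)
    hashtags = re.findall(r"#\w+", text)
    # strip URLs, mentions and hashtags before building the bag of words
    cleaned = re.sub(r"https?://\S+|@\w+|#\w+", " ", text.lower())
    tokens = re.findall(r"[a-z']+", cleaned)
    return mentions, hashtags, Counter(tokens)

tweets = [
    "Loving the new #DataMining course by @prof_smith :)",
    "RT @prof_smith: slides for the #DataMining lecture are online",
]

bag = Counter()
hashtag_counts = Counter()
for t in tweets:
    _, tags, words = preprocess_tweet(t)
    bag.update(words)
    hashtag_counts.update(tags)

print(hashtag_counts.most_common(3))  # themes (#)
print(bag.most_common(5))             # bag-of-words representation
```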

Keywords: data mining, language R, social networks, Twitter

Procedia PDF Downloads 184
22700 Study on Seismic Assessment of Earthquake-Damaged Reinforced Concrete Buildings

Authors: Fu-Pei Hsiao, Fung-Chung Tu, Chien-Kuo Chiu

Abstract:

In this work, to develop a method for the detailed assessment of the post-earthquake seismic performance of RC buildings in Taiwan, experimental data for several column specimens with various failure modes (flexural failure, flexural-shear failure, and shear failure) are used to derive reduction factors of seismic capacity for specified damage states. According to the damage states of RC columns and their corresponding seismic reduction factors suggested by the experimental data, this work applies the detailed seismic performance assessment method to identify the seismic capacity of earthquake-damaged RC buildings. Additionally, a post-earthquake emergent assessment procedure is proposed that can provide the data needed for decisions about earthquake-damaged buildings in a region with high seismic hazard. Finally, three actual earthquake-damaged school buildings in Taiwan are used as a case study to demonstrate the application of the proposed assessment method.
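
As a purely illustrative sketch of how damage-state reduction factors can feed a residual seismic capacity ratio, the snippet below aggregates per-column capacities; the factor values and the simple capacity-weighted aggregation rule are assumptions for illustration and are not taken from the study.

```python
# Illustrative sketch (not the authors' implementation): residual seismic
# capacity of a storey estimated from per-column damage states, assuming
# each damage state maps to a seismic capacity reduction factor.
REDUCTION_FACTOR = {      # hypothetical values for illustration only
    "none": 1.0,
    "slight": 0.95,
    "light": 0.75,
    "moderate": 0.5,
    "severe": 0.1,
    "collapse": 0.0,
}

def residual_capacity_ratio(columns):
    """columns: list of (damage_state, original_lateral_capacity) tuples."""
    original = sum(cap for _, cap in columns)
    residual = sum(REDUCTION_FACTOR[state] * cap for state, cap in columns)
    return residual / original

storey = [("none", 120.0), ("light", 120.0), ("moderate", 100.0), ("severe", 100.0)]
print(f"residual seismic capacity ratio R = {residual_capacity_ratio(storey):.2f}")
```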

Keywords: seismic assessment, seismic reduction factor, residual seismic ratio, post-earthquake, reinforced concrete, building

Procedia PDF Downloads 400
22699 Deep Learning for Recommender System: Principles, Methods and Evaluation

Authors: Basiliyos Tilahun Betru, Charles Awono Onana, Bernabe Batchakui

Abstract:

Recommender systems have become increasingly popular in recent years and are utilized in numerous areas. Nowadays, many web services provide a great deal of information to users, and recommender systems have been developed as a critical element of these web applications to predict user preferences and provide meaningful recommendations. Given the strength of deep learning in modeling different types of data and the dynamic nature of user preferences, a deep model can better capture user demand and further improve recommendation quality. In this paper, deep neural network models for recommender systems are evaluated. Most deep neural network models for recommender systems focus on the classical collaborative filtering user-item setting. Deep learning models have demonstrated that high-level features of complex data can be learned instead of relying on metadata, which can significantly improve recommendation accuracy. Even though deep learning has had a great impact in various areas, its application to recommender systems has not been fully exploited, and many improvements remain possible in both collaborative and content-based approaches, especially when different contextual factors are considered.
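
A minimal sketch of the user-item latent-factor idea that underlies the collaborative filtering setting discussed above; it uses plain matrix factorization trained by SGD on synthetic ratings (a deep model would replace the dot product with a neural network over the embeddings). All numbers are synthetic.

```python
import numpy as np

# Each user and item gets a latent vector; the predicted rating is their dot
# product, trained by stochastic gradient descent.
rng = np.random.default_rng(0)
n_users, n_items, k = 50, 40, 8
ratings = [(rng.integers(n_users), rng.integers(n_items), rng.integers(1, 6))
           for _ in range(500)]                        # synthetic (u, i, r) triples

P = 0.1 * rng.standard_normal((n_users, k))            # user embeddings
Q = 0.1 * rng.standard_normal((n_items, k))            # item embeddings
lr, reg = 0.01, 0.05

for epoch in range(20):
    for u, i, r in ratings:
        err = r - P[u] @ Q[i]
        P[u] += lr * (err * Q[i] - reg * P[u])
        Q[i] += lr * (err * P[u] - reg * Q[i])

u, i, r = ratings[0]
print(f"true={r}, predicted={P[u] @ Q[i]:.2f}")
```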

Keywords: big data, decision making, deep learning, recommender system

Procedia PDF Downloads 479
22698 Performance Analysis of Scalable Secure Multicasting in Social Networking

Authors: R. Venkatesan, A. Sabari

Abstract:

Developments in social networking on the internet call for a scalable, authenticated, and secure group communication model such as multicasting. Multicasting is an inter-network service that offers efficient delivery of data from a source to multiple destinations. Even though multicast has been very successful at providing an efficient, best-effort data delivery service for large groups, extending other features to multicast in a scalable way has proven to be a complex process. At the same time, the requirement for securing electronic information has become increasingly apparent. As multicast applications are deployed for mainstream purposes, the need to secure multicast communications becomes significant.

Keywords: multicasting, scalability, security, social network

Procedia PDF Downloads 292
22697 Vulnerability Risk Assessment of Non-Engineered Houses Based on Damage Data of the 2009 Padang Earthquake in Padang City, Indonesia

Authors: Rusnardi Rahmat Putra, Junji Kiyono, Aiko Furukawa

Abstract:

Several powerful earthquakes have struck Padang during recent years, one of the largest of which was an M 7.6 event that occurred on September 30, 2009 and caused more than 1000 casualties. Following the event, we conducted a 12-site microtremor array investigation to obtain a representative determination of the soil conditions of subsurface structures in Padang. From the dispersion curves of the array observations, the central business district of Padang corresponds to relatively soft soil conditions with Vs30 less than 400 m/s. Because only one accelerometer record existed, we simulated the 2009 Padang earthquake to obtain the peak ground acceleration for all sites in Padang city. By considering the damage data of the 2009 Padang earthquake, we produced seismic vulnerability estimates of non-engineered houses for rock, medium, and soft soil conditions. We estimated the loss ratio for several earthquake return periods based on the ground response, the seismic hazard of Padang, and the damage to non-engineered houses recorded after the 2009 Padang earthquake.

Keywords: profile, Padang earthquake, microtremor array, seismic vulnerability

Procedia PDF Downloads 410
22696 Screening of Rice Genotypes in Methane and Carbon Dioxide Emissions Under Different Water Regimes

Authors: Mthiyane Pretty, Mitsui Toshiake, Nagano Hirohiko, Aycan Murat

Abstract:

Among the most significant greenhouse gases released from rice fields are methane and carbon dioxide. The primary focus of this research was to quantify CH₄ and CO₂ emissions from four different rice cultivars under two water regimes, with soil moisture and temperature recorded. We hypothesized that paddy field soils may directly affect soil enzymatic activities and physicochemical properties in the rhizosphere and thereby indirectly affect the activity, abundance, diversity, and community composition of methanogens, ultimately affecting CH₄ flux. The experiment was laid out in a randomized block design with two treatments and three replications for each genotype. The two treatments used paddy field soil and artificial soil. At 35 days after planting (DAP), continuous flooding irrigation and alternate wetting and drying (AWD) were applied during the vegetative stage. The highest recorded soil and environmental parameters were soil moisture at 76%, soil temperature at 28.3℃, bulk EC at 0.99 dS/m, and pore water EC at 1.25 dS/m, measured with a HydraGO portable soil sensor system. Gas samples were collected weekly at 09:00 am and 12:00 pm to obtain the mean GHG flux. Gas chromatography (GC, Shimadzu GC-2010, Japan) was used for the analysis of CH₄ and CO₂. The treatments with paddy field soil had a 1.3℃ higher temperature than those with artificial soil. CH₄ emission patterns were observed in all rice genotypes, although emissions were lower in the AWD treatments with artificial soil, showing that AWD creates oxic conditions in the rice soil. CO₂ was also quantified, but only in minute quantities, as the rice plants were using CO₂ for photosynthesis. The highest tillering number was 7 and the lowest was 3 among the cultivars grown. The rice variety recommended for breeding is Norin 24, which showed a high number of tillers with lower CH₄ emissions.

Keywords: greenhouse gases, methane, morphological characterization, alternating wetting and drying

Procedia PDF Downloads 80
22695 Hybrid Data-Driven Drilling Rate of Penetration Optimization Scheme Guided by Geological Formation and Historical Data

Authors: Ammar Alali, Mahmoud Abughaban, William Contreras Otalvora

Abstract:

Optimizing the drilling process for cost and efficiency requires the optimization of the rate of penetration (ROP). ROP is the measurement of the speed at which the wellbore is created, in units of feet per hour, and it is the primary indicator of drilling efficiency. Maximization of the ROP can indicate fast and cost-efficient drilling operations; however, high ROPs may induce unintended events, which may lead to nonproductive time (NPT) and higher net costs. The proposed ROP optimization solution is a hybrid, data-driven system that aims to improve the drilling process, maximize the ROP, and minimize NPT. The system consists of two phases: (1) utilizing existing geological and drilling data to train the model beforehand, and (2) real-time adjustment of the controllable dynamic drilling parameters [weight on bit (WOB), rotary speed (RPM), and pump flow rate (GPM)] that directly influence the ROP. During the first phase of the system, geological and historical drilling data are aggregated. Then, the top-rated wells, in terms of high-instance ROP, are identified. Those wells are filtered based on NPT incidents, and a cross-plot is generated for the controllable dynamic drilling parameters per ROP value. Subsequently, the parameter values (WOB, GPM, RPM) are calculated as a conditioned mean based on physical distance, following the Inverse Distance Weighting (IDW) interpolation methodology. The first phase is concluded by producing a model of drilling best practices from the offset wells, prioritizing the optimum ROP value; this phase is performed before drilling commences. Starting with the model produced in phase one, the second phase runs an automated drill-off test, delivering live adjustments in real time. Those adjustments are made by directing the driller to deviate two of the controllable parameters (WOB and RPM) by a small percentage (0-5%), following the Constrained Random Search (CRS) methodology. These minor incremental variations reveal new drilling conditions not previously explored through offset wells. The data are then consolidated into a heat map as a function of ROP; better ROP performance is identified through the heat map and incorporated into the model. The validation process involved the selection of a planned well in an onshore oil field with hundreds of offset wells. The first-phase model was built by utilizing the data points from the top-performing historical wells (20 wells). The model allows drillers to enhance decision-making by leveraging existing data and blending it with live data in real time. An empirical relationship between the controllable dynamic parameters and ROP was derived using Artificial Neural Networks (ANN). The adjustments improved ROP efficiency by over 20%, translating to at least a 10% saving in drilling costs. The novelty of the proposed system lies in its ability to integrate historical data, calibrate based on geological formations, and run real-time global optimization through CRS. Those factors position the system to work for any newly drilled well in a developing field.
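
A minimal sketch of the Inverse Distance Weighting step described above: the recommended value of a controllable parameter at the planned well is the distance-weighted mean of the values used by nearby offset wells. The well coordinates and WOB values are invented for illustration.

```python
import numpy as np

def idw(query, points, values, power=2.0):
    """Inverse Distance Weighting: conditioned mean of `values`, weighted by
    the inverse of the physical distance between `query` and `points`."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d == 0):                       # query coincides with a sample
        return values[np.argmin(d)]
    w = 1.0 / d**power
    return np.sum(w * values) / np.sum(w)

# offset-well locations (x, y in arbitrary field coordinates) and the WOB
# value each well used at its best ROP interval (hypothetical numbers)
wells = np.array([[0.0, 0.0], [1.5, 0.2], [0.3, 2.0], [2.2, 1.8]])
wob   = np.array([22.0, 25.0, 19.0, 27.0])   # klbf

planned_well = np.array([1.0, 1.0])
print(f"IDW-recommended WOB: {idw(planned_well, wells, wob):.1f} klbf")
```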

Keywords: drilling optimization, geological formations, machine learning, rate of penetration

Procedia PDF Downloads 131
22694 Evaluation of a Method for the Virtual Design of a Software-based Approach for Electronic Fuse Protection in Automotive Applications

Authors: Dominic Huschke, Rudolf Keil

Abstract:

New driving functionalities like highly automated driving have a major impact on the electrics/electronics architecture of future vehicles and inevitably lead to higher safety requirements. Partly due to these increased requirements, the vehicle industry is increasingly looking at semiconductor switches as an alternative to conventional melting fuses. The protective functionality of semiconductor switches can be implemented in hardware as well as in software. A current approach discussed in science and industry is the implementation of a model of the protected low voltage power cable on a microcontroller to calculate its temperature. Here, the information regarding the current is provided by the continuous current measurement of the semiconductor switch. The signal to open the semiconductor switch is issued by the microcontroller when a previously defined limit for the temperature of the low voltage power cable is exceeded. A setup for testing the described principle of electronic fuse protection of a low voltage power cable is built and subsequently validated successfully with experiments. Here, the evaluation criterion is the deviation of the measured temperature of the low voltage power cable from the specified limit temperature when the semiconductor switch is opened. The analysis is carried out with an assumed ambient temperature as well as with a measured ambient temperature. The experimentally performed investigations are then simulated in a virtual environment, with an explicit focus on the behavior of the microcontroller with an implemented model of a low voltage power cable in a real-time environment. Finally, the generated results are compared with those of the experiments. Based on this, the completely virtual design of the described approach is assumed to be valid.
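
A sketch of the kind of cable model such a software fuse can run on a microcontroller: a first-order lumped thermal model integrated explicitly, with the switch opened when the calculated temperature exceeds the limit. The cable and thermal parameters below are assumptions for illustration, not values from the paper.

```python
# Illustrative first-order lumped thermal model of a low voltage power cable:
# dT/dt = (I^2 * R - (T - T_amb) / R_th) / C_th
# All parameter values below are assumptions for illustration only.
R       = 0.010    # cable resistance [ohm]
R_TH    = 2.0      # thermal resistance to ambient [K/W]
C_TH    = 60.0     # thermal capacitance [J/K]
T_LIMIT = 105.0    # cable limit temperature [deg C]
T_AMB   = 40.0     # ambient temperature [deg C]
DT      = 0.01     # integration step [s]

def simulate(current_a, t_end=300.0):
    """Integrate the cable temperature and return the trip time (or None)."""
    temp, t = T_AMB, 0.0
    while t < t_end:
        p_loss = current_a**2 * R                    # Joule heating
        temp += DT * (p_loss - (temp - T_AMB) / R_TH) / C_TH
        t += DT
        if temp >= T_LIMIT:
            return t                                 # open the switch here
    return None

print(f"trip time at 60 A: {simulate(60.0):.1f} s")
print(f"trip time at 20 A: {simulate(20.0)}")        # never trips -> None
```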

Keywords: automotive wire harness, electronic fuse protection, low voltage power cable, semiconductor-based fuses, software-based validation

Procedia PDF Downloads 105
22693 Disagreement in Spousal Report of Current Contraceptive Use in India and Its Determinants

Authors: Dipti Govil, Nidhi Khosla

Abstract:

Couple-level reports of contraception are important, as wives and husbands may give different reports about contraceptive use. Using matched couple data (N=62,910) from India's NFHS-IV (2015-16), this paper examines concordance in spousal reports of current contraceptive use and its differentials. Reporting of contraceptive use was higher among wives (59%) than husbands (25%). Concordance was low: 16.5% of couples reported the use of the same method, while 21% reported the use of any method. Husbands showed substantial denial of the use of female sterilization. Reconstruction of contraceptive use among men increased concordance by 10%. Multivariate analysis shows that concordance was low in urban and Southern India, among younger women, and among women with a lower wealth index. Men's control over household decision-making and negative attitudes towards contraception were associated with lower concordance. The findings highlight the importance of using couple-level data to estimate contraceptive prevalence, the role of education programs in fostering positive attitudes towards contraception and gender equality, and the need to involve men in family planning efforts. The results also raise the issue of data quality, as the questions were asked differently of men and women, which might have contributed to the wide discordance.

Keywords: concordance, contraceptive use, couple, female sterilisation, India

Procedia PDF Downloads 129
22692 Cloud Computing in Jordanian Libraries: An Overview

Authors: Mohammad A. Al-Madi, Nagham A. Al-Madi, Fanan A. Al-Madi

Abstract:

The adoption of cloud computing technology in libraries has been increasing: users can store their data in a virtual space and retrieve it from anywhere over the network. By using cloud computing technology, industries and individuals save money, time, and space. Moreover, data and information about libraries can be placed in the cloud. This paper discusses the meaning of cloud computing along with its types. Further, focus is given to the application of cloud computing in modern libraries. Additionally, the advantages of cloud computing and the areas in which it can currently be applied are discussed. Finally, the present situation of Jordanian libraries is considered and discussed in further detail.

Keywords: cloud computing, community cloud, hybrid cloud, private cloud, public cloud

Procedia PDF Downloads 221
22691 A New Approach to Achieve the Regime Equations in Sand-Bed Rivers

Authors: Farhad Imanshoar

Abstract:

The regime or equilibrium geometry of alluvial rivers remains a topic of fundamental scientific and engineering interest. There are several approaches to analyze the problem, namely empirical formulas, semi-theoretical methods, and rational (extreme) procedures. However, none of them is widely accepted at present, due to a lack of knowledge of some physical processes associated with channel formation and the simplifying hypotheses imposed to reduce the large number of variables involved. The study presented in this paper shows a new approach to estimating the stable width and depth of sand-bed rivers by using the developed stream power equation (DSPE). First, a new procedure was developed based on theoretical analysis, considering the DSPE and the ultimate sediment concentration. Then, experimental data for the regime condition in sand-bed rivers (flow depth, flow width, and sediment feed rate for several cases) were gathered. Finally, the results of this research (regime equations) are compared with the field data and other regime equations. Good agreement was observed between the field data and the values resulting from the developed regime equations.

Keywords: regime equations, developed stream power equation, sand-bed rivers, semi-theoretical methods

Procedia PDF Downloads 268
22690 RFID Logistic Management with Cold Chain Monitoring: Cold Store Case Study

Authors: Mira Trebar

Abstract:

Logistics processes for perishable food in the supply chain include distribution activities and real-time temperature monitoring to fulfil cold chain requirements. The paper presents the use of RFID (Radio Frequency Identification) technology as an identification tool for receiving and shipping activities in the cold store. At the same time, the use of RFID data loggers with temperature sensors is presented to observe and store temperatures so that the processes can be analyzed and historical data are available for traceability purposes and efficient recall management.

Keywords: logistics, warehouse, RFID device, cold chain

Procedia PDF Downloads 631
22689 Assessing the Social Comfort of the Russian Population with Big Data

Authors: Marina Shakleina, Konstantin Shaklein, Stanislav Yakiro

Abstract:

The digitalization of modern human life over the last decade has facilitated the acquisition, storage, and processing of data, which are used to detect changes in consumer preferences and to improve the internal efficiency of the production process. This emerging trend has attracted academic interest in the use of big data in research. The study focuses on modeling the social comfort of the Russian population for the period 2010-2021 using big data. Big data provides enormous opportunities for understanding human interactions at the scale of society, with rich spatial and temporal dynamics. One of the most popular big data sources is Google Trends. The methodology for assessing social comfort using big data involves several steps: 1. 574 words were selected based on the Harvard IV-4 Dictionary, adjusted to fit the reality of everyday Russian life. The set of keywords was further cleansed by excluding queries consisting of verbs and words with several lexical meanings. 2. Search queries were processed to ensure comparability of results: transformation of the data to a 10-point scale, elimination of popularity peaks, detrending, and deseasonalizing. The proposed methodology for keyword search and Google Trends processing was implemented as a script in the Python programming language. 3. Block and summary integral indicators of social comfort were constructed using the first modified principal component, whose loadings provide the weighting coefficients of the block components. According to the study, social comfort is described by 12 blocks: ‘health’, ‘education’, ‘social support’, ‘financial situation’, ‘employment’, ‘housing’, ‘ethical norms’, ‘security’, ‘political stability’, ‘leisure’, ‘environment’, ‘infrastructure’. According to the model, the summary integral indicator increased by 54% and reached 4.631 points; the average annual growth rate was 3.6%, which is 2.7 p.p. higher than the rate of economic growth. The value of the indicator describing social comfort in Russia is determined 26% by ‘social support’, 24% by ‘education’, 12% by ‘infrastructure’, 10% by ‘leisure’, and the remaining 28% by the other blocks. Among the 25% most popular searches, 85% are negative in nature and are mainly related to the blocks ‘security’, ‘political stability’, and ‘health’, for example, ‘crime rate’ and ‘vulnerability’. Among the 25% most unpopular queries, 99% were positive and mostly related to the blocks ‘ethical norms’, ‘education’, and ‘employment’, for example, ‘social package’ and ‘recycling’. In conclusion, the introduction of the latent category ‘social comfort’ into the scientific vocabulary deepens the theory of the quality of life of the population in terms of studying an individual's involvement in society and expands the subjective aspect of the measurement of various indicators. The integral assessment of social comfort demonstrates the overall picture of the development of the phenomenon over time and space and quantitatively evaluates ongoing socio-economic policy. The application of big data in the assessment of latent categories gives stable results, which opens up possibilities for their practical implementation.
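
The Python sketch below illustrates steps 2-3 on synthetic monthly series: rescaling to a 10-point scale, detrending, deseasonalizing, and deriving block weights from the first principal component. The study's own script, keyword list, and data are not reproduced here; all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
months = 144                                   # 2010-2021, monthly
queries = rng.random((6, months)) * 100        # popularity of 6 search queries

def preprocess(series):
    t = np.arange(series.size)
    series = series / series.max() * 10        # rescale to a 10-point scale
    trend = np.polyval(np.polyfit(t, series, 1), t)
    detrended = series - trend                 # detrending
    seasonal = np.array([detrended[m::12].mean() for m in range(12)])
    return detrended - seasonal[t % 12]        # deseasonalizing

X = np.vstack([preprocess(q) for q in queries])

# first principal component of the query block -> weighting coefficients
cov = np.cov(X)
eigval, eigvec = np.linalg.eigh(cov)
weights = np.abs(eigvec[:, -1])                # loadings of the 1st component
weights /= weights.sum()
block_indicator = weights @ X                  # block integral indicator
print("weights:", np.round(weights, 2))
```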

Keywords: big data, Google trends, integral indicator, social comfort

Procedia PDF Downloads 201
22688 Scientific Expedition to Understand the Crucial Issues of Rapid Lake Expansion and Moraine Dam Instability Phenomena to Justify the Lake Lowering Effort of Imja Lake, Khumbu Region of Sagarmatha, Nepal

Authors: R. C. Tiwari, N. P. Bhandary, D. B. Thapa Chhetri, R. Yatabe

Abstract:

This research examines the various issues of lake expansion and the stability of the moraine dam of Imja Lake. Imja Lake, considered the world's highest-altitude lake (5,010 m above mean sea level) and located in the Khumbu, Sagarmatha region of Nepal (27.9° N, 86.9° E), has been reported as one of the fastest-growing glacier lakes in the Nepal Himalaya. The research explores the common phenomenon of lake expansion and the stability issues of moraine dams to justify the necessity of any future lake lowering efforts in other glacier lakes of the Nepal Himalaya. For this, we have explored the root causes of rapid lake expansion along with the crucial factors responsible for the stability of the moraine mass. This research helps to understand the structure of the moraine dam and how ice, water, and moraine interactions affect its strength. The nature of the permafrost layer and its effects on moraine dam stability are also studied. The detailed geotechnical properties of the Imja moraine mass give a clear picture of the strength of the moraine material and its interactions. A stability analysis of the moraine dam under the strong ground motion of the Mw 7.8 2015 Barpak-Gorkha earthquake and its major Mw 7.3 aftershock (Kodari, Sindhupalchowk-Dolakha border, Nepal) has also been carried out to assess the necessity of lake lowering efforts. A lake lowering effort was recently carried out by the Nepal Army, which constructed an open channel and lowered the lake level by 3 m. It is believed that the entire region is now safe due to this continuous draining of lake water. However, this option does not seem adequate to offer significant risk reduction to downstream communities, given that roughly 75 million cubic meters of water remain impounded at an average depth of 148.9 m.

Keywords: finite element method, glacier, moraine, stability

Procedia PDF Downloads 213
22687 Remote Sensing of Aerated Flows at Large Dams: Proof of Concept

Authors: Ahmed El Naggar, Homyan Saleh

Abstract:

Dams are crucial for flood control, water supply, and the generation of hydroelectric power. Every dam has a water conveyance system, such as a spillway, providing the safe discharge of catastrophic floods when necessary. Spillway design has historically been investigated in laboratory research owing to the absence of suitable full-scale flow monitoring equipment and to safety concerns. Prototype measurements of aerated flows are urgently needed to quantify projected scale effects and provide missing validation data for design guidelines and numerical simulations. In this work, an image-based investigation of free-surface flows on a tiered spillway was undertaken at laboratory scale (fixed camera installation) and at prototype scale (drone footage). The drone footage was generated using data from citizen science. The analyses permitted the remote measurement of the free-surface aeration inception point, air-water surface velocities and their fluctuations, and the residual energy at the chute's downstream end. The prototype observations offered full-scale proof of concept, while the laboratory results were efficiently confirmed against invasive phase-detection probe data. This paper stresses the efficacy of image-based analyses at prototype spillways. It highlights how citizen science data may enable academics to better understand real-world air-water flow dynamics and offers a framework for assembling a long-missing collection of prototype data.
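
As an indicative sketch of the optical-flow step behind the surface velocimetry (optical flow is listed in the keywords), the snippet below applies OpenCV's dense Farneback algorithm to consecutive frames of a drone video and converts pixel displacements to velocities. The file name, ground sampling distance, and frame rate are placeholder assumptions, not values from the paper.

```python
import cv2
import numpy as np

GSD_M_PER_PX = 0.02          # metres per pixel (assumed)
FPS = 30.0                   # video frame rate (assumed)

cap = cv2.VideoCapture("spillway.mp4")   # hypothetical drone video file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

speeds = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # dense Farneback optical flow between consecutive frames (pixels/frame)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.sqrt(flow[..., 0]**2 + flow[..., 1]**2)
    speeds.append(magnitude.mean() * GSD_M_PER_PX * FPS)   # m/s
    prev_gray = gray
cap.release()

print(f"mean free-surface velocity estimate: {np.mean(speeds):.2f} m/s")
```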

Keywords: remote sensing, aerated flows, large dams, proof of concept, dam spillways, air-water flows, prototype operation, remote sensing, inception point, optical flow, turbulence, residual energy

Procedia PDF Downloads 92
22686 Commentary on Successful and Emerging Bullying Control Programs: A Comparison between Eighteen Bullying Interventions Applied Worldwide

Authors: Sohni Siddiqui, Anja Schultze-Krumbholz

Abstract:

Our lives now revolve increasingly around online tasks, as the internet has become a necessity. One disturbing concern associated with high internet usage is the multiplication of cyber-associated risky behaviors such as cyber aggression and/or cyberbullying. Cyberbullying is an emerging issue that needs immediate attention from many stakeholders, such as parents, doctors, school administrators, policymakers, and researchers, especially during the COVID-19 pandemic, when online learning has been adopted as an instructional strategy and there is a continuous rise in cyberbullying cases. The aim of this article is to review existing successful and emerging interventions designed to control bullying and cyberbullying by engaging individuals through teachers’ professional development and adopting a whole-school approach. The study identified the strengths and limitations of the programs and suggested improvements to existing interventions. Preparing interventions with a strong theoretical framework, integrating applications of emerging theories into interventions, promoting proactive and reactive strategies in combination, beginning with baseline needs-assessment surveys, reducing digital time and the digital divide between parents and children, promoting the concepts of lead trainer, peer trainer, and hot spots, focusing on physical activities, and using landmarks are some of the recommendations proposed by the authors. In addition to face-to-face intervention, the researchers recommend updating and improving previous intervention programs with games and apps. Especially in times of pandemic crisis, when face-to-face interactions are limited and cyberbullying is triggered, the use of apps, web-based interventions, and games can be an effective way to control electronic perpetration and victimization.

Keywords: anti bullying programs, cyber bullying, individualized trainings, teachers’ professional development, whole school interventions

Procedia PDF Downloads 151
22685 Design and Development of a Mechanical Force Gauge for the Square Watermelon Mold

Authors: Morteza Malek Yarand, Hadi Saebi Monfared

Abstract:

This study aimed at designing and developing a mechanical force gauge for the square watermelon mold for the first time. It also introduces the square watermelon's characteristics and its production limitations. The mechanical force gauge performance and the product itself are also described. There are three main designable gauge models: a. hydraulic gauge, b. strain gauge, and c. mechanical gauge. The advantage of the hydraulic model is that it instantly displays the pressure and thus the force exerted by the melon. However, considering the inability to measure forces in all directions, complicated development, high cost, possible hydraulic fluid leakage into the fruit chamber, and the possible influence of increased ambient temperature on the fluid pressure, the development of this gauge was ruled out. The second choice was to calculate pressure from the direct force measured by a strain gauge. The main advantage of strain gauges over spring types is their high measurement precision; however, because the working range of strain gauges does not conform to watermelon growth, the calculations faced problems. Finally, the mechanical pressure gauge has advantages, including the ability to measure forces and pressures on the mold surface during melon growth; the ability to display peak forces; the ability to produce a melon growth graph thanks to its continuous force measurements; the conformity of its manufacturing materials with the physical conditions required for melon growth; high air-conditioning capability; the ability to let sunlight reach the melon rind (avoiding yellowish skin and quality loss); fast and straightforward calibration; no damage to the product during assembly and disassembly; visual inspection of the product within the mold; applicability to all growth environments (field, greenhouse, etc.); a simple process; low costs; and so forth.

Keywords: mechanical force gauge, mold, reshaped fruit, square watermelon

Procedia PDF Downloads 273
22684 Testing Causal Model of Depression Based on the Components of Subscales Lifestyle with Mediation of Social Health

Authors: Abdolamir Gatezadeh, Jamal Daghaleh

Abstract:

The lifestyle of individuals is an important determinant of psychological and social health. Recently, especially in developed countries, the relationship between lifestyle and mental illnesses, including depression, has attracted considerable attention. To test the causal model of depression based on lifestyle with the mediation of social health, the study is basic in terms of its objective and descriptive-field in terms of its data collection. Methods: This study is basic research within the framework of correlational designs. The population includes all adults in Ahwaz city; a randomized, multistage sample of 384 subjects was selected. The data were collected and analyzed using structural equation modeling. Results: Path analysis confirmed the fit of the assumed research model. This means that the lifestyle subscales have a direct effect on depression and, through the mediation of social health, an indirect effect on depression. Discussion and conclusion: According to the results of the research, the components of lifestyle and social health can be used to explain depression.

Keywords: depression, subscales lifestyle, social health, causal model

Procedia PDF Downloads 163
22683 Landslide Susceptibility Analysis in the St. Lawrence Lowlands Using High Resolution Data and Failure Plane Analysis

Authors: Kevin Potoczny, Katsuichiro Goda

Abstract:

The St. Lawrence lowlands extend from Ottawa to Quebec City and are known for large deposits of sensitive Leda clay. Leda clay deposits are responsible for many large landslides, such as the 1993 Lemieux and 2010 St. Jude (4 fatalities) landslides. Due to the large extent and sensitivity of Leda clay, regional landslide hazard analysis is an important tool in risk management. A 2018 regional study by Farzam et al. on the susceptibility of Leda clay slopes to landslide hazard uses 1 arc second topographical data. A qualitative method known as Hazus is used to estimate susceptibility by checking various criteria at a location and determining a susceptibility rating on a scale of 0 (no susceptibility) to 10 (very high susceptibility). These criteria are slope angle, geological group, soil wetness, and distance from waterbodies. Given the flat nature of the St. Lawrence lowlands, the current assessment fails to capture local slopes, such as the St. Jude site, and the data did not allow failure planes to be analyzed accurately. This study improves the analysis performed by Farzam et al. in two major respects. First, regional assessment with high-resolution data allows the identification of local sites that may previously have been classified as low susceptibility. This then provides the opportunity to conduct a more refined analysis of the slope's failure plane. Slopes derived from 1 arc second data are relatively gentle (0-10 degrees) across the region; however, the 1- and 2-meter resolution 2022 HRDEM provided by NRCAN shows that short, steep slopes are present. At a regional level, 1 arc second data can underestimate the susceptibility of short, steep slopes, which can be dangerous, as Leda clay landslides behave retrogressively and travel upwards into flatter terrain. At the location of the St. Jude landslide, the slope differences are significant: the 1 arc second data show a maximum slope of 12.80 degrees and a mean slope of 4.72 degrees, while the HRDEM data show a maximum slope of 56.67 degrees and a mean slope of 10.72 degrees. This equates to a difference of three susceptibility levels when the soil is dry and one susceptibility level when wet. GIS software is used to create a regional susceptibility map across the St. Lawrence lowlands at 1- and 2-meter resolutions. Failure planes are necessary to differentiate between small and large landslides, which have so far been ignored in regional analysis. Leda clay failures can only retrogress as far as their failure planes, so the regional analysis must be able to transition smoothly into a more robust local analysis. It is expected that slopes within the region previously assessed with low susceptibility scores contain local areas of high susceptibility. The goal is to create opportunities for local failure plane analysis to be undertaken, which has not been possible before. Due to the low resolution of previous regional analyses, any slope near a waterbody could be considered hazardous; high-resolution regional analysis allows for a more precise determination of hazard sites.
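
A minimal sketch of the slope-angle part of such a regional assessment: slope is computed from a gridded DEM and mapped to a 0-10 susceptibility-style rating. The synthetic DEM and the rating thresholds are assumptions for illustration; the Hazus procedure also weighs geological group, wetness, and distance to waterbodies, which are omitted here.

```python
import numpy as np

def slope_degrees(dem, cell_size):
    """Slope angle (degrees) at each cell of a regular-grid DEM."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

def susceptibility_from_slope(max_slope_deg, wet=False):
    """Hypothetical 0-10 rating: steeper and wetter terrain rates higher."""
    thresholds = [5, 10, 15, 20, 25, 30, 35, 40, 45, 50]
    rating = sum(max_slope_deg > t for t in thresholds)
    return min(10, rating + (1 if wet else 0))

# synthetic 100 m x 100 m tile at 1 m resolution with a short, steep bank
x = np.linspace(0, 100, 101)
dem = np.tile(np.clip((x - 60) * 0.8, 0, 15), (101, 1))   # 15 m high river bank

slopes = slope_degrees(dem, cell_size=1.0)
print(f"max slope: {slopes.max():.1f} deg, "
      f"rating (wet): {susceptibility_from_slope(slopes.max(), wet=True)}")
```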

Keywords: hazus, high-resolution DEM, leda clay, regional analysis, susceptibility

Procedia PDF Downloads 78
22682 Integrative Omics-Portrayal Disentangles Molecular Heterogeneity and Progression Mechanisms of Cancer

Authors: Binder Hans

Abstract:

Cancer is no longer seen solely as a genetic disease in which genetic defects such as mutations and copy number variations affect gene regulation and eventually lead to aberrant cell functioning, which can be monitored by transcriptome analysis. It has become obvious that epigenetic alterations represent a further important layer of (de-)regulation of gene activity. For example, aberrant DNA methylation is a hallmark of many cancer types, and methylation patterns have been used successfully to subtype cancer heterogeneity. Hence, unraveling the interplay between different omics levels such as genome, transcriptome, and epigenome is indispensable for a mechanistic understanding of the molecular deregulation causing complex diseases such as cancer. This objective requires powerful downstream integrative bioinformatics methods as an essential prerequisite to discover the whole-genome mutational, transcriptome, and epigenome landscapes of cancer specimens and to characterize cancer genesis, progression, and heterogeneity. Basic challenges and tasks arise ‘beyond sequencing’ because of the large size of the data, their complexity, the need to search for hidden structures in the data, the need for knowledge mining to discover biological function, and the need for systems biology conceptual models to deduce developmental interrelations between different cancer states. These tasks are tightly related to cancer biology as an (epi-)genetic disease giving rise to aberrant genomic regulation under micro-environmental control and clonal evolution, which leads to heterogeneous cellular states. Machine learning algorithms such as self-organizing maps (SOM) represent one interesting option to tackle these bioinformatics tasks. The SOM method enables the recognition of complex patterns in large-scale data generated by high-throughput omics technologies. It portrays molecular phenotypes by generating individualized, easy-to-interpret images of the data landscape in combination with comprehensive analysis options. Our image-based, reductionist machine learning methods provide one interesting perspective on how to deal with massive data in the study of complex diseases such as gliomas, melanomas, and colon cancer at the molecular level. As an important new challenge, we address the combined portrayal of different omics data, such as genome-wide genomic, transcriptomic, and methylomic data. The integrative-omics portrayal approach is based on joint training on the data, and it provides separate personalized data portraits for each patient and data type, which can be analyzed by visual inspection as one option. The new method enables an integrative genome-wide view of the omics data types and the underlying regulatory modes. It is applied to high- and low-grade gliomas and to melanomas, where it disentangles transversal and longitudinal molecular heterogeneity in terms of distinct molecular subtypes and progression paths with prognostic impact.
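
A toy sketch of the SOM training loop mentioned above, applied to synthetic "omics" profiles: each sample is assigned to its best-matching map node and the node weights are updated with a shrinking neighborhood. It illustrates the portrayal idea only and is not the authors' pipeline; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features = 200, 30
data = rng.standard_normal((n_samples, n_features))       # synthetic profiles

grid = 10                                                  # 10 x 10 map
weights = rng.standard_normal((grid, grid, n_features)) * 0.1
coords = np.stack(np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij"), -1)

for epoch in range(30):
    lr = 0.5 * np.exp(-epoch / 10)                         # learning rate decay
    sigma = 3.0 * np.exp(-epoch / 10)                      # neighborhood decay
    for x in data:
        dist = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(dist.argmin(), dist.shape)  # best-matching unit
        grid_dist = np.linalg.norm(coords - np.array(bmu), axis=2)
        h = np.exp(-grid_dist**2 / (2 * sigma**2))         # neighborhood kernel
        weights += lr * h[..., None] * (x - weights)

# "portrait" of one sample: its distance to every map node
portrait = np.linalg.norm(weights - data[0], axis=2)
print("best-matching unit of sample 0:", np.unravel_index(portrait.argmin(), portrait.shape))
```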

Keywords: integrative bioinformatics, machine learning, molecular mechanisms of cancer, gliomas and melanomas

Procedia PDF Downloads 148
22681 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of this optimization, which is central to learning theory. Time series analysis is one approach to complex systems in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. First, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging. The quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that was applied to financial data. Second, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo have been used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour. This model may reveal further insight into quantum chaos.
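
A minimal sketch of the filtering step mentioned above: a standard scalar Kalman filter applied to a simulated AR(1) series. This illustrates the classical state-estimation machinery only; it is not the authors' QTS model, and all parameter values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)
n, phi, q, r = 300, 0.8, 0.1, 0.5          # length, AR coefficient, noise vars

x = np.zeros(n)                            # latent AR(1) state
y = np.zeros(n)                            # noisy observations
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = x[t] + rng.normal(0, np.sqrt(r))

x_hat, p = 0.0, 1.0                        # initial state estimate and variance
estimates = []
for t in range(n):
    # predict
    x_pred = phi * x_hat
    p_pred = phi**2 * p + q
    # update with observation y[t]
    k = p_pred / (p_pred + r)              # Kalman gain
    x_hat = x_pred + k * (y[t] - x_pred)
    p = (1 - k) * p_pred
    estimates.append(x_hat)

rmse = np.sqrt(np.mean((np.array(estimates) - x) ** 2))
print(f"filter RMSE: {rmse:.3f} vs observation noise std: {np.sqrt(r):.3f}")
```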

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 469
22680 Multi-Criteria Decision Approach to Performance Measurement Techniques Data Envelopment Analysis: Case Study of Kerman City’s Parks

Authors: Ali A. Abdollahi

Abstract:

During the last several decades, scientists have consistently applied Multiple Criteria Decision-Making methods when making decisions about multi-faceted, complicated subjects. While making such decisions, and in order to achieve more accurate evaluations, they have regularly used a variety of criteria instead of applying just one optimum evaluation criterion. The method presented here utilizes both ‘quantity’ and ‘quality’ to assess the performance of the multiple-criteria methods. Applying data envelopment analysis (DEA), weighted aggregated sum product assessment (WASPAS), the weighted sum approach (WSA), the analytic network process (ANP), and the Charnes-Cooper-Rhodes (CCR) method, we analyzed thirteen parks in Kerman city. The results indicate that WASPAS and WSA are compatible with each other but deviate extensively from DEA. Finally, the results for the CCR technique do not match the results of the DEA technique. Our study indicates that the ANP method, with an average rate of 1/51, ranks closest to the DEA method, which has an average rate of 1/49.
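
As an illustrative sketch of the CCR model referenced above, the snippet below solves the input-oriented CCR multiplier linear program with scipy for a toy set of decision-making units; the park inputs and outputs are invented, not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

# toy data: each row is one park (decision-making unit)
X = np.array([[5, 3], [8, 5], [4, 2], [6, 6]], dtype=float)   # inputs
Y = np.array([[9, 4], [10, 7], [6, 3], [11, 6]], dtype=float) # outputs
n, m_in = X.shape
m_out = Y.shape[1]

def ccr_efficiency(o):
    """Solve: max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0,  u, v >= 0."""
    c = np.concatenate([-Y[o], np.zeros(m_in)])       # minimize -u.y_o
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(m_out), X[o]]).reshape(1, -1)
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return -res.fun

for park in range(n):
    print(f"park {park}: CCR efficiency = {ccr_efficiency(park):.3f}")
```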

Keywords: multiple criteria decision making, Data envelopment analysis (DEA), Charnes Cooper Rhodes (CCR), Weighted Sum Approach (WSA)

Procedia PDF Downloads 219
22679 Simulation-Based Optimization Approach for an Electro-Plating Production Process Based on Theory of Constraints and Data Envelopment Analysis

Authors: Mayada Attia Ibrahim

Abstract:

Evaluating and improving the electroplating production process is a key challenge for this type of process. The process is influenced by several factors such as process parameters, process costs, and production environments. Analyzing and optimizing all these factors together requires extensive analytical techniques that are not available in real-case industrial entities. This paper presents a practice-based framework for the evaluation and optimization of some of the crucial factors that affect the costs and production times associated with this type of process: energy costs, material costs, and product flow times. In the proposed approach, design of experiments, discrete-event simulation, and the theory of constraints are used, respectively, to identify the most significant factors affecting the production process, to simulate a real production line and recognize the effect of these factors, and to locate possible bottlenecks. Several scenarios are generated as corrective strategies for improving the production line. Following that, the CCR input-oriented data envelopment analysis (DEA) model is used to evaluate and optimize the suggested scenarios.

Keywords: electroplating process, simulation, design of experiment, performance optimization, theory of constraints, data envelopment analysis

Procedia PDF Downloads 98
22678 E4D-MP: Time-Lapse Multiphysics Simulation and Joint Inversion Toolset for Large-Scale Subsurface Imaging

Authors: Zhuanfang Fred Zhang, Tim C. Johnson, Yilin Fang, Chris E. Strickland

Abstract:

A variety of geophysical techniques are available to image the opaque subsurface with little or no contact with the soil. It is common to conduct time-lapse surveys of different types for a given site to improve the results of subsurface imaging. Regardless of the chosen survey methods, it is often a challenge to process the massive amount of survey data. The currently available software applications are generally based on one-dimensional assumptions and designed for a desktop personal computer. Hence, they are usually incapable of imaging three-dimensional (3D) processes/variables in the subsurface at reasonable spatial scales, and the maximum amount of data that can be inverted simultaneously is often very small due to the limited capability of personal computers. Presently, high-performance, integrative software that enables real-time integration of multi-process geophysical methods is needed. E4D-MP enables the integration and inversion of time-lapse, large-scale survey data from geophysical methods. Using supercomputing capability and parallel computation algorithms, E4D-MP is capable of processing data across vast spatiotemporal scales and in near real time. The main code and the modules of E4D-MP for inverting individual or combined data sets of time-lapse 3D electrical resistivity, spectral induced polarization, and gravity surveys have been developed and demonstrated for subsurface imaging. E4D-MP provides the capability of imaging the processes (e.g., liquid or gas flow, solute transport, cavity development) and subsurface properties (e.g., rock/soil density, conductivity) critical for the successful control of environmental engineering efforts such as environmental remediation, carbon sequestration, geothermal exploration, and mine land reclamation, among others.

Keywords: gravity survey, high-performance computing, sub-surface monitoring, electrical resistivity tomography

Procedia PDF Downloads 157
22677 Dynamic Analysis and Vibration Response of Thermoplastic Rolling Elements in a Rotor Bearing System

Authors: Nesrine Gaaliche

Abstract:

This study provides a finite element dynamic model for analyzing the vibration response of a rolling bearing system. The vibration responses of polypropylene bearings with and without defects are studied using FE analysis and compared to experimental data. The viscoelastic behavior of the thermoplastic is investigated in this work to evaluate the influence of material flexibility and damping viscosity. The vibrations are detected using 3D dynamic analysis. According to the test data, peak vibrations are more noticeable for an inner ring defect than for an outer ring defect. The performance of thermoplastic bearings is compared to that of metal parts using vibration signals. Both the test and numerical results show that polypropylene bearings exhibit less vibration than their steel counterparts. Unlike bearings made from metal, polypropylene bearings absorb vibrations and accommodate shaft misalignments. Following validation of the overall vibration spectrum data, the von Mises stresses inside the rings are assessed under high loads. According to the simulation findings, stress is significantly high under the balls. For the test cases, the computational findings correspond closely to the experimental results.

Keywords: viscoelastic, FE analysis, polypropylene, bearings

Procedia PDF Downloads 106
22676 Airborne Pollutants and Lung Surfactant: Biophysical Impacts of Surface Oxidation Reactions

Authors: Sahana Selladurai, Christine DeWolf

Abstract:

Lung surfactant comprises a lipid-protein film that coats the alveolar surface and serves to prevent alveolar collapse upon repeated breathing cycles. Exposure of lung surfactant to high concentrations of airborne pollutants, for example tropospheric ozone in smog, can chemically modify the lipid and protein components. These chemical changes can impact film functionality by decreasing the film's collapse pressure (the minimum surface tension attainable), altering its mechanical and flow properties, and modifying the lipid reservoir formation essential for re-spreading of the film during inhalation. In this study, we use Langmuir monolayers spread at the air-water interface as model membranes, where the compression and expansion of the film mimic the breathing cycle. The impact of ozone exposure on model lung surfactant films is measured using a Langmuir film balance, Brewster angle microscopy, and a pendant drop tensiometer as a function of film and sub-phase composition. The oxidized films are analyzed using mass spectrometry, where lipid and protein oxidation products are observed. Oxidation is shown to reduce surface activity, alter line tension (and film morphology), and in some cases visibly reduce the viscoelastic properties of the film when compared to controls. These reductions in film functionality are highly dependent on film and sub-phase composition; for example, the effect of oxidation is more pronounced when using a physiologically relevant buffer as opposed to water as the sub-phase. These findings can lead to a better understanding of the impact of continuous exposure to high levels of ozone on the mechanical process of breathing, as well as of the roles of certain lung surfactant components in this process.

Keywords: lung surfactant, oxidation, ozone, viscoelasticity

Procedia PDF Downloads 311
22675 Mayan Culture and Attitudes towards Sustainability

Authors: Sarah Ryu

Abstract:

Agricultural methods and ecological approaches employed by the pre-colonial Mayans may provide valuable insights into forest management and viable alternatives for resource sustainability in the face of major deforestation across Central and South America. Using a combination of observation data collected from the modern indigenous inhabitants near Mixco in Guatemala and historical data, this study was able to create a holistic picture of how the Maya maintained their ecosystems. Surveys and observations were conducted in the field over a period of twelve weeks across two years. Geographic and archaeological data for this area were provided by Guatemalan organizations such as the Universidad de San Carlos de Guatemala. Observations of current indigenous populations around Mixco showed that they adhere to traditional Mayan methods of agriculture, such as terrace construction and arboriculture. Rather than planting one cash crop, as was done by the Spanish, indigenous peoples practice agroforestry, cultivating forests that provide trees for construction material, wild plant foods, habitat for game, and medicinal herbs. The emphasis on biodiversity prevented deforestation and created a sustainable balance between human consumption and forest regrowth. Historical data provided by MayaSim showed that the Mayans successfully maintained their ecosystems from about 800 BCE to 700 CE. When the Mayans practiced natural resource conservation and cultivated a harmonious relationship with the forest around them, they were able to thrive and prosper alongside nature. Having lasted over a thousand years, the Mayan empire provides a valuable lesson in sustainability and human attitudes towards the environment.

Keywords: biodiversity, forestry, mayan, sustainability

Procedia PDF Downloads 177
22674 Advanced Lithium Recovery from Brine: 2D-Based Ion Selectivity Membranes

Authors: Nour S. Abdelrahman, Seunghyun Hong, Hassan A. Arafat, Daniel Choi, Faisal Al Marzooqi

Abstract:

The advancement of lithium extraction methods from water sources, particularly saltwater brine, is gaining prominence in the lithium recovery industry due to its cost-effectiveness. Traditional techniques like recrystallization, chemical precipitation, and solvent extraction for metal recovery from seawater or brine are energy-intensive and exhibit low efficiency. Moreover, the extensive use of organic solvents poses environmental concerns. As a result, there is a growing demand for environmentally friendly lithium recovery methods. Membrane-based separation technology has emerged as a promising alternative, offering high energy efficiency and ease of continuous operation. In our study, we explored the potential of lithium-selective sieve channels constructed from layers of 2D graphene oxide and MXene (transition metal carbides and nitrides), integrated with surface –SO₃⁻ groups. The arrangement of these 2D sheets creates interplanar spacing ranging from 0.3 to 0.8 nm, which forms a barrier against multivalent ions while facilitating lithium-ion movement through nanocapillaries. The introduction of the sulfonate group provides an effective pathway for Li⁺ ions, with a calculated Li⁺–SO₃⁻ binding energy of -0.77 eV, the lowest among monovalent species. These modified membranes demonstrated remarkably rapid transport of Li⁺ ions, efficiently distinguishing them from other monovalent and divalent species. This selectivity is achieved through a combination of size exclusion and varying binding affinities. The graphene oxide channels in these membranes showed exceptional inter-cation selectivity, with a Li⁺/Mg²⁺ selectivity ratio exceeding 10⁴, surpassing commercial membranes. Additionally, these membranes achieved over 94% rejection of MgCl₂.

Keywords: ion permeation, lithium extraction, membrane-based separation, nanotechnology

Procedia PDF Downloads 73