Search results for: time series data mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38110


32920 Multi-Objective Simulated Annealing Algorithms for Scheduling Just-In-Time Assembly Lines

Authors: Ghorbanali Mohammadi

Abstract:

New approaches to sequencing mixed-model manufacturing systems are presented. These approaches have attracted considerable attention due to their potential to deal with difficult optimization problems. This paper presents Multi-Objective Simulated Annealing Algorithm (MOSAA) approaches to the Just-In-Time (JIT) sequencing problem, in which workload smoothing (WL) and the number of set-ups (St) are to be optimized simultaneously. Mixed-model assembly lines are production lines on which a variety of product models with similar product characteristics are assembled; the associated sequencing problem is NP-hard. Two annealing methods are proposed to solve the multi-objective problem and find an efficient frontier of all design configurations. The performance of the two methods is tested on several problems from the literature. Experimentation demonstrates the relatively favorable performance of the presented methodology.
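The abstract describes, but does not list, the annealing procedure. As an illustration only, the following Python sketch shows a multi-objective simulated annealing loop over a mixed-model sequence that keeps a Pareto archive; the objective functions (a squared-deviation-from-ideal-mix form for workload smoothing, and a count of adjacent model changes for set-ups), the scalarized acceptance rule, and all parameter values are assumptions, not taken from the paper.

```python
import math, random

def smoothing(seq, demand):
    # squared deviation of cumulative production from the ideal mix at
    # every position (a hypothetical workload-smoothing objective)
    total = sum(demand.values())
    count = {m: 0 for m in demand}
    cost = 0.0
    for k, m in enumerate(seq, start=1):
        count[m] += 1
        for j in demand:
            cost += (count[j] - k * demand[j] / total) ** 2
    return cost

def setups(seq):
    # number of adjacent model changes
    return sum(1 for a, b in zip(seq, seq[1:]) if a != b)

def dominates(f, g):
    return all(x <= y for x, y in zip(f, g)) and f != g

def mosa(demand, iters=2000, t0=10.0, alpha=0.999, seed=1):
    rng = random.Random(seed)
    seq = [m for m, d in demand.items() for _ in range(d)]
    rng.shuffle(seq)
    f = (smoothing(seq, demand), setups(seq))
    archive = [(f, list(seq))]           # Pareto archive of non-dominated points
    t = t0
    for _ in range(iters):
        i, j = rng.sample(range(len(seq)), 2)
        seq[i], seq[j] = seq[j], seq[i]            # swap neighbour
        g = (smoothing(seq, demand), setups(seq))
        delta = (g[0] - f[0]) + (g[1] - f[1])      # simple scalarization
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            f = g
            if not any(dominates(a, g) for a, _ in archive):
                archive = [(a, s) for a, s in archive if not dominates(g, a)]
                archive.append((g, list(seq)))
        else:
            seq[i], seq[j] = seq[j], seq[i]        # undo swap
        t *= alpha
    return archive
```

Calling `mosa({'A': 4, 'B': 3, 'C': 2})` returns an approximated efficient frontier; a real MOSAA would use a more careful acceptance criterion for multiple objectives.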

Keywords: scheduling, just-in-time, mixed-model assembly line, sequencing, simulated annealing

Procedia PDF Downloads 112
32919 A Quick Method for Seismic Vulnerability Evaluation of Offshore Structures by Static and Dynamic Nonlinear Analyses

Authors: Somayyeh Karimiyan

Abstract:

To evaluate the seismic vulnerability of vital offshore structures with the highest possible precision, Nonlinear Time History Analysis (NLTHA) is the most reliable method. However, since it is very time-consuming, a quicker procedure is greatly desired. This paper presents a quick method that combines Push-Over Analysis (POA) with NLTHA. The POA is performed first to identify the more critical members, and then NLTHA is performed to evaluate the vulnerability of those critical members more precisely. The proposed method has been applied to a jacket-type structure. Results show that combining POA and NLTHA is a reliable seismic evaluation method, and also that no single earthquake characteristic alone can be a dominant factor in vulnerability evaluation.

Keywords: jacket structure, seismic evaluation, push-over and nonlinear time history analyses, critical members

Procedia PDF Downloads 271
32918 A Mechanical Diagnosis Method Based on Vibration Fault Signal down-Sampling and the Improved One-Dimensional Convolutional Neural Network

Authors: Bowei Yuan, Shi Li, Liuyang Song, Huaqing Wang, Lingli Cui

Abstract:

Convolutional neural networks (CNNs) have received extensive attention in the field of fault diagnosis, and many fault diagnosis methods use a CNN for fault type identification. However, when the amount of raw data collected by sensors is massive, the neural network must perform a time-consuming classification task. In this paper, a mechanical fault diagnosis method based on vibration signal down-sampling and an improved one-dimensional convolutional neural network is proposed. Through robust principal component analysis, the low-rank feature matrix of a large amount of raw data can be separated, and down-sampling is then applied to reduce the amount of subsequent computation. In the improved one-dimensional CNN, a smaller convolution kernel is used to reduce the number of parameters and the computational complexity, and regularization is introduced before the fully connected layer to prevent overfitting. In addition, the multiple connected layers generalize classification results better without cumbersome parameter adjustment. The effectiveness of the method is verified on monitoring signals from a centrifugal pump test bench, with an average test accuracy above 98%. Compared with traditional deep belief network (DBN) and support vector machine (SVM) methods, this method performs better.
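To make the "smaller convolution kernel" point concrete, here is a minimal NumPy sketch of a 1D convolution layer applied to a down-sampled vibration frame; the frame length, channel counts, and kernel sizes are illustrative assumptions, not values from the paper. The point of the printout is the parameter count: 16 filters of size 3 need 48 weights, versus 1024 for filters of size 64.

```python
import numpy as np

def conv1d(x, kernels, stride=1):
    # x: (channels_in, length); kernels: (channels_out, channels_in, k)
    c_out, c_in, k = kernels.shape
    out_len = (x.shape[1] - k) // stride + 1
    out = np.zeros((c_out, out_len))
    for o in range(c_out):
        for t in range(out_len):
            out[o, t] = np.sum(kernels[o] * x[:, t*stride:t*stride + k])
    return np.maximum(out, 0.0)   # ReLU activation

x = np.random.randn(1, 1024)            # one down-sampled vibration frame
small = np.random.randn(16, 1, 3)       # 16 filters, kernel size 3
large = np.random.randn(16, 1, 64)      # same 16 filters, kernel size 64
print(small.size, large.size)           # 48 vs 1024 weights
y = conv1d(x, small, stride=2)
print(y.shape)                          # (16, 511)
```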

Keywords: fault diagnosis, vibration signal down-sampling, 1D-CNN

Procedia PDF Downloads 114
32917 Standardized Description and Modeling Methods of Semiconductor IP Interfaces

Authors: Seongsoo Lee

Abstract:

IP reuse is an effective design methodology in modern SoC design, reducing design effort and time. However, the description and modeling methods used for IP interfaces differ from designer to designer. In this paper, standardized description and modeling methods for IP interfaces are proposed. The proposed standard consists of 11 items: IP information, model provision, data type, description level, interface information, port information, signal information, protocol information, modeling level, modeling information, and source file. The proposed description and modeling methods enable easy understanding, simulation, verification, and modification during IP reuse.
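The 11 items listed in the abstract can be captured as a simple record type. The following Python dataclass is a sketch only: the field types, default values, and the example `uart16550`/APB content are assumptions for illustration, as the paper does not specify a concrete data model.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class IPInterfaceDescription:
    # the 11 standardized items named in the abstract; the field types
    # are illustrative assumptions, not part of the proposal itself
    ip_information: str = ""
    model_provision: str = ""
    data_type: str = ""
    description_level: str = ""
    interface_information: str = ""
    port_information: list = field(default_factory=list)
    signal_information: list = field(default_factory=list)
    protocol_information: str = ""
    modeling_level: str = ""
    modeling_information: str = ""
    source_file: str = ""

desc = IPInterfaceDescription(
    ip_information="uart16550",                       # hypothetical IP name
    protocol_information="AMBA APB",
    port_information=[{"name": "pclk", "dir": "in", "width": 1}],
)
print(len(asdict(desc)))   # 11 standardized fields
```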

Keywords: interface, standardization, description, modeling, semiconductor IP

Procedia PDF Downloads 487
32916 Viscoelastic Modeling of Hot Mix Asphalt (HMA) under Repeated Loading by Using Finite Element Method

Authors: S. A. Tabatabaei, S. Aarabi

Abstract:

Predicting the response and performance of hot mix asphalt (HMA) is a challenging task because of the susceptibility of HMA to complex loading and environmental conditions. The behavior of HMA is a function of the temperature and of the time and rate of loading, both of which directly affect the design criteria of the mixture; the velocity of the passing load determines the loading time and rate. Viscoelasticity describes the reaction of HMA to loading and to environmental conditions such as temperature and moisture, and this behavior has a direct effect on design criteria such as tensile strain and vertical deflection. In this paper, a computational framework for viscoelasticity and its implementation in a 3D HMA model are introduced for use in the finite element method. The model was subjected to various repeated loading conditions at constant temperature. The viscoelastic response of HMA is investigated under vehicle-speed loading conditions, its sensitivity to the range of speeds is examined, and the results are compared with those of an HMA assumed to behave elastically, as in conventional design methods. The results show the importance of the loading pulse time, the unloading time, and the vehicle speed for the design criteria, as well as the importance of the fading memory of the material in storing strain and stress under repeated loading. The model was simulated with the ABAQUS finite element package.
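Viscoelastic material behavior of the kind described here is commonly represented by a Prony series (generalized Maxwell model), in which the relaxation modulus is E(t) = E∞ + Σᵢ Eᵢ exp(−t/τᵢ). The sketch below evaluates this standard form; the modulus values and relaxation times are purely illustrative, not calibrated to any mixture in the paper.

```python
import numpy as np

def relaxation_modulus(t, e_inf, e_i, tau_i):
    # Prony series: E(t) = E_inf + sum_i E_i * exp(-t / tau_i)
    t = np.asarray(t, dtype=float)
    return e_inf + sum(e * np.exp(-t / tau) for e, tau in zip(e_i, tau_i))

t = np.logspace(-3, 2, 6)                     # loading times, s
E = relaxation_modulus(t, e_inf=200.0,        # long-term modulus, MPa (illustrative)
                       e_i=[3000.0, 1500.0],  # Prony coefficients, MPa
                       tau_i=[0.01, 1.0])     # relaxation times, s
print(E[0] > E[-1])                           # modulus relaxes with time: True
```

Short loading pulses (fast vehicles) see a stiffer response than long pulses, which is why the loading pulse time matters for the design criteria.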

Keywords: viscoelasticity, finite element method, repeated loading, HMA

Procedia PDF Downloads 385
32915 Mechanism for Network Security via Routing Protocols Estimated with Network Simulator 2 (NS-2)

Authors: Rashid Mahmood, Muhammad Sufyan, Nasir Ahmed

Abstract:

MANETs are decentralized, infrastructure-less networks. MANET routing protocols fall into three major categories: reactive, proactive, and hybrid. Of these protocols, we discuss only Destination-Sequenced Distance Vector (DSDV), Ad hoc On-Demand Distance Vector (AODV), and Dynamic Source Routing (DSR). AODV and DSR are both reactive protocols, whereas DSDV is a proactive protocol. We compare these routing protocols with respect to network security, as estimated with Network Simulator 2 (NS-2). The parameters discussed include simulation time, packet size, number of nodes, packet delivery fraction, pause time, and speed. All these parameters are evaluated for the routing protocols under conditions suitable for network security measurement.
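One of the parameters named above, packet delivery fraction (PDF), is simply received packets divided by sent packets at the application layer. As an illustration only, the sketch below computes it from simplified old-format NS-2 trace lines; the assumption that the event flag is the first field and that the `AGT` tag marks application-level packets reflects the common old trace format, but real traces carry more fields and should be matched on packet IDs.

```python
def packet_delivery_fraction(trace_lines, agent="AGT"):
    # Simplified old-format NS-2 trace line: "<event> <time> <node> <layer> ..."
    sent = received = 0
    for line in trace_lines:
        fields = line.split()
        if len(fields) < 4 or fields[3] != agent:
            continue
        if fields[0] == "s":       # application-level send event
            sent += 1
        elif fields[0] == "r":     # application-level receive event
            received += 1
    return received / sent if sent else 0.0

trace = [
    "s 1.000 _0_ AGT --- 0 cbr 512",
    "r 1.042 _5_ AGT --- 0 cbr 512",
    "s 2.000 _0_ AGT --- 1 cbr 512",   # this packet was never delivered
]
print(packet_delivery_fraction(trace))   # 0.5
```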

Keywords: DSDV, AODV, DSR, NS-2, PDF, pause time

Procedia PDF Downloads 419
32914 Design and Implementation of a Geodatabase and WebGIS

Authors: Sajid Ali, Dietrich Schröder

Abstract:

The merging of the internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing effectively and proficiently with geospatial data. Web GIS technologies provide easy access to and sharing of geospatial data over the internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft, EKG) lacks a single platform for easy, multi-user access to its data to assist its members and the wider research community. The technique presented in this paper covers the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, together with a Web GIS built with the OpenGeo Suite for fast sharing and distribution of the data over the internet. The characteristics of the required geodatabase design have been studied, and a specific methodology is given for designing the Web GIS. Finally, this Web-based geodatabase has been validated with two desktop GIS applications and a web map application, and it is discussed how the contribution provides all the modules needed to expedite further research in the area, as per the requirements.

Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application

Procedia PDF Downloads 322
32913 Brain-Computer Interface Based Real-Time Control of Fixed Wing and Multi-Rotor Unmanned Aerial Vehicles

Authors: Ravi Vishwanath, Saumya Kumaar, S. N. Omkar

Abstract:

Brain-computer interfacing (BCI) is a technology that is almost four decades old, developed originally to create and enhance the impact of neuroprosthetics. In recent times, however, with the commercialization of non-invasive electroencephalogram (EEG) headsets, the technology has seen a wide variety of applications, such as home automation, wheelchair control, and vehicle steering. One of the latest applications is the mind-controlled quadrotor unmanned aerial vehicle. These applications do not require a very high-speed response and give satisfactory results when standard classification methods such as Support Vector Machines (SVM) and Multi-Layer Perceptrons (MLP) are used. Issues arise when high-speed control is required, as in the case of fixed-wing unmanned aerial vehicles, where such methods are rendered unreliable by their low classification speed. Such an application requires the system to classify data at high speed in order to retain controllability of the vehicle. This paper proposes a novel classification method that combines Common Spatial Patterns and Linear Discriminant Analysis to provide improved classification accuracy in real time. A non-linear SVM-based classification technique is also discussed. Further, the paper discusses the implementation of the proposed method on fixed-wing and VTOL unmanned aerial vehicles.
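The CSP-plus-LDA pipeline mentioned above has a standard textbook form: learn spatial filters from a generalized eigendecomposition of the two class covariance matrices, take log-variance features of the filtered trials, and classify with a linear discriminant. The sketch below is that textbook pipeline on synthetic data, not the authors' implementation; the channel counts, trial counts, and the simple class-mean LDA are all assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(X1, X2, n_pairs=2):
    # X1, X2: (trials, channels, samples) band-passed EEG for two classes
    def mean_cov(X):
        return np.mean([x @ x.T / np.trace(x @ x.T) for x in X], axis=0)
    C1, C2 = mean_cov(X1), mean_cov(X2)
    vals, vecs = eigh(C1, C1 + C2)        # generalized eigenproblem
    order = np.argsort(vals)
    pick = np.r_[order[:n_pairs], order[-n_pairs:]]   # extreme filters
    return vecs[:, pick].T                # (filters, channels)

def log_var_features(W, X):
    Z = np.einsum("fc,tcs->tfs", W, X)    # spatially filter each trial
    v = Z.var(axis=2)
    return np.log(v / v.sum(axis=1, keepdims=True))

rng = np.random.default_rng(0)
X1 = rng.normal(size=(30, 8, 128)); X1[:, 0] *= 3.0   # class 1: channel 0 strong
X2 = rng.normal(size=(30, 8, 128)); X2[:, 7] *= 3.0   # class 2: channel 7 strong
W = csp_filters(X1, X2)
F = np.vstack([log_var_features(W, X1), log_var_features(W, X2)])
y = np.r_[np.zeros(30), np.ones(30)]

# a simple Fisher LDA stands in for the paper's classifier
mu0, mu1 = F[y == 0].mean(0), F[y == 1].mean(0)
Sw = np.cov(F[y == 0].T) + np.cov(F[y == 1].T)
w = np.linalg.solve(Sw, mu1 - mu0)
pred = (F @ w > (mu0 + mu1) @ w / 2).astype(float)
acc = (pred == y).mean()
print(round(acc, 2))
```

Because both CSP and LDA reduce inference to a few matrix-vector products, classification latency is far lower than a kernel SVM, which is the motivation stated in the abstract.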

Keywords: brain-computer interface, classification, machine learning, unmanned aerial vehicles

Procedia PDF Downloads 266
32912 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis

Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi

Abstract:

The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population levels, limitations in data and research still exist in LMICs, which pose a risk of underrepresentation of mental disorders. It is vital to examine the existing longitudinal MH data, focusing on how FAIR the datasets are. This landscape analysis aimed to provide both an overall level of evidence of the availability of longitudinal datasets and the degree of consistency in the longitudinal studies conducted. Utilizing prompts proved instrumental in streamlining the analysis process, facilitating access, crafting code snippets, and categorizing and analyzing extensive data repositories related to depression, anxiety, and psychosis in Africa. Leveraging artificial intelligence (AI), we filtered through over 18,000 scientific papers spanning from 1970 to 2023. This AI-driven approach enabled the identification of 228 longitudinal research papers meeting the inclusion criteria. Quality assurance revealed 10% incorrectly identified articles and 2 duplicates, underscoring the prevalence of longitudinal MH research in South Africa, with a focus on depression. From the analysis, evaluating data and metadata adherence to FAIR principles remains crucial for enhancing the accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed, and ethical and equity considerations in data sharing and reuse are also vital. There is a need for collaborative efforts across disciplinary and national boundaries to improve the findability and accessibility of data, and current efforts should also focus on creating integrated data resources and tools to improve the interoperability and reusability of MH data. Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH research.

Keywords: longitudinal mental health research, data sharing, FAIR data principles, Africa, landscape analysis

Procedia PDF Downloads 56
32911 Seismic Behavior of Existing Reinforced Concrete Buildings in California under Mainshock-Aftershock Scenarios

Authors: Ahmed Mantawy, James C. Anderson

Abstract:

Numerous cases of earthquakes (main-shocks) followed by aftershocks have been recorded in California. In 1992, a pair of strong earthquakes occurred within three hours of each other in Southern California: the first shock occurred near the community of Landers and was assigned a magnitude of 7.3, and the second occurred near the city of Big Bear, about 20 miles west of the initial shock, and was assigned a magnitude of 6.2. In the same year, a series of three earthquakes occurred over two days in the Cape Mendocino area of Northern California; the main-shock was assigned a magnitude of 7.0, while the second and third shocks were both assigned a value of 6.6. This paper investigates the effect of a main-shock accompanied by aftershocks of significant intensity on reinforced concrete (RC) frame buildings, characterizing their nonlinear behavior using the PERFORM-3D software. A 6-story building in San Bruno and a 20-story building in North Hollywood were selected for the study, as both have RC moment-resisting frame systems. The buildings are also instrumented at multiple floor levels as part of the California Strong Motion Instrumentation Program (CSMIP). Both buildings have recorded responses from past events, such as the Loma Prieta and Northridge earthquakes, which were used to verify the response parameters of the numerical models in PERFORM-3D. The verification of the numerical models shows good agreement between the calculated and recorded response values. Different scenarios of a main-shock followed by a series of aftershocks, taken from real cases in California, were then applied to the building models in order to investigate the structural behavior of the moment-resisting frame system. The behavior was evaluated in terms of the lateral floor displacements, the ductility demands, and the inelastic behavior at critical locations. The analysis results showed that permanent displacements may occur due to plastic deformation during the main-shock, which can lead to higher displacements during aftershocks. The inelastic response at plastic hinges during the main-shock can also change the hysteretic behavior during the aftershocks, and higher ductility demands can occur when buildings are subjected to trains of ground motions rather than individual ground motions. A general conclusion is that the occurrence of aftershocks following an earthquake can lead to increased damage within the elements of RC frame buildings. Current code provisions for seismic design do not consider the probability of significant aftershocks when designing a new building in zones of high seismic activity.

Keywords: reinforced concrete, existing buildings, aftershocks, damage accumulation

Procedia PDF Downloads 273
32910 Acid Mine Drainage Remediation Using Silane and Phosphate Coatings

Authors: M. Chiliza, H. P. Mbukwane, P. Masita, H. Rutto

Abstract:

Acid mine drainage (AMD) is one of the main pollutants of water in many countries with mining activities. AMD results from the oxidation of pyrite and other metal sulfides: when these minerals are exposed to moisture and oxygen, leaching takes place, releasing sulphate and iron. Acid drainage is often marked by 'yellow boy,' an orange-yellow substance that forms when the pH of acidic mine-influenced water rises above 3, so that the previously dissolved iron precipitates out. The possibility of using environmentally friendly silane- and phosphate-based coatings on pyrite to remediate acid mine drainage and prevent it at the source was investigated. The results showed that both coatings reduced the chemical oxidation of pyrite, as measured by Fe and sulphate release. Furthermore, it was found that the silane-based coating performs better when the coating is synthesized under basic rather than acidic hydrolysis conditions.

Keywords: acid mine drainage, pyrite, silane, phosphate

Procedia PDF Downloads 331
32909 Impact of Natural Degradation of Low Density Polyethylene on Its Morphology

Authors: Meryem Imane Babaghayou, Asma Abdelhafidi, Salem Fouad Chabira, Mohammed Sebaa

Abstract:

A challenge for the plastics industry is the production of materials that resist degradation in their application environment, guaranteeing a longer lifetime and therefore an optimal time of use. Blown-extruded films of low-density polyethylene (LDPE), supplied by SABIC (Saudi Arabia) and processed at the SOFIPLAST company in Setif, Algeria, were subjected to climatic ageing in a sub-Saharan facility at Laghouat (Algeria) with direct exposure to the sun. Samples were characterized by X-ray diffraction (XRD) and differential scanning calorimetry (DSC) after prescribed amounts of time, up to 8 months. These two techniques reveal the impact of UV irradiation on the morphological development of a plastic material, especially the degree of crystallinity, which increases with exposure time. These morphological changes stem from photooxidative reactions leading to crosslinking in the early stages of ageing and to chain scissions at more advanced stages, the latter being chiefly responsible. The change in the degree of crystallinity is essentially controlled by the secondary crystallization of the amorphous chains, whose mobility is enhanced by the chain scission processes; the diffusion of these short segments to the surface of the lamellae increases the lamellar thickness. The results presented highlight the complexity of the phenomena involved.
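For context on the crystallinity measurements mentioned above: with DSC, the degree of crystallinity is conventionally computed as Xc = 100 · ΔHm / ΔH°m, where ΔHm is the measured melting enthalpy and ΔH°m is the melting enthalpy of a 100% crystalline sample (about 293 J/g is a commonly cited literature value for polyethylene). The enthalpy values in the example below are illustrative, not measurements from this study.

```python
def crystallinity_dsc(dh_m, dh_100=293.0):
    # X_c (%) = 100 * measured melting enthalpy / enthalpy of a fully
    # crystalline sample; 293 J/g is a commonly cited value for polyethylene
    return 100.0 * dh_m / dh_100

# illustrative (not measured) melting enthalpies before and after ageing
for months, dh in [(0, 105.0), (8, 128.0)]:
    print(months, round(crystallinity_dsc(dh), 1))
```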

Keywords: low-density polyethylene (LDPE), crystallinity, ageing, XRD, DSC

Procedia PDF Downloads 394
32908 Design of Evaluation for Ehealth Intervention: A Participatory Study in Italy, Israel, Spain and Sweden

Authors: Monika Jurkeviciute, Amia Enam, Johanna Torres Bonilla, Henrik Eriksson

Abstract:

Introduction: Many evaluations of eHealth interventions conclude that the evidence for improved clinical outcomes is limited, especially when the intervention is short, such as one year. Often, the evaluation design does not address the feasibility of achieving clinical outcomes; evaluations are designed to reflect the clinical goals of the intervention without utilizing the opportunity to illuminate effects on organizations and cost. A comprehensive evaluation design can better support decision-making regarding the effectiveness and potential transferability of eHealth. Hence, the purpose of this paper is to present a feasible and comprehensive evaluation design for an eHealth intervention, including the design process in different contexts. Methodology: The situation of limited feasibility of clinical outcomes was foreseen in the European Union-funded project "DECI" ("Digital Environment for Cognitive Inclusion"), run under the Horizon 2020 program with the aim of defining and testing a digital environment platform, within corresponding care models, that helps elderly people live independently. A complex intervention of eHealth implementation into elaborate care models in four different countries was planned for one year. To design the evaluation, a participative approach was undertaken using Pettigrew's lens of change and transformation, covering context, process, and content. Through a series of workshops, observations, interviews, and document analyses, as well as a review of the scientific literature, a comprehensive evaluation design was created. Findings: The findings indicate that in order to obtain evidence on clinical outcomes, eHealth interventions should last longer than one year. The content of the comprehensive evaluation design includes a collection of qualitative and quantitative methods for data gathering that illuminate non-medical aspects. Furthermore, it contains communication arrangements to discuss the results and continuously improve the evaluation design, as well as procedures for monitoring and improving the data collection during the intervention. The process of the comprehensive evaluation design consists of four stages: (1) analysis of the current state in the different contexts, including measurement systems, stakeholder expectations and profiles, organizational ambitions to change due to eHealth integration, and the organizational capacity to collect data for evaluation; (2) a workshop with project partners to discuss the as-is situation in relation to the project goals; (3) development of general and customized sets of relevant performance measures, questionnaires, and interview questions; (4) setting up procedures and monitoring systems for the interventions. Lastly, strategies are presented for handling challenges during the evaluation design process in four different countries. The evaluation design needs to consider contextual factors such as project limitations and differences between pilot sites in terms of eHealth solutions, patient groups, care models, and national and organizational cultures and settings. This implies the need for a flexible approach to evaluation design to enable judgment of the effectiveness and the potential for adoption and transferability of eHealth. In summary, this paper provides learning opportunities for future evaluation designs of eHealth interventions in different national and organizational settings.

Keywords: ehealth, elderly, evaluation, intervention, multi-cultural

Procedia PDF Downloads 308
32907 The Effects of Drying Technology on Rehydration Time and Quality of Mung Bean Vermicelli

Authors: N. P. Tien, S. Songsermpong, T. H. Quan

Abstract:

Mung bean vermicelli is a popular food in Asian countries, made from mung bean starch. The preparation process involves several steps, including drying, which affects the structure and quality of the vermicelli. This study examines the effects of different drying technologies on the rehydration time and quality of mung bean vermicelli. Three drying technologies were used: hot air drying, continuous microwave drying, and microwave vacuum drying. The vermicelli strands were dried at 45°C for 12 h in a hot air dryer, at a conveyor belt speed inverter setting of 70 Hz in a continuous microwave dryer, and at a microwave power density of 30 W.g⁻¹ in a microwave vacuum dryer. The results showed that vermicelli dried by hot air had the longest rehydration time, 12.69 minutes, whereas vermicelli dried by continuous microwave and microwave vacuum drying had shorter rehydration times of 2.79 and 2.14 minutes, respectively. Microwave vacuum drying also resulted in larger porosity, higher water absorption, and higher cooking loss. The tensile strength and elasticity of vermicelli dried by hot air were higher than those obtained with the microwave drying technologies. The sensory evaluation did not reveal significant differences among the treatments in most attributes. Overall, microwave drying technology proved effective in reducing rehydration time while producing good-quality mung bean vermicelli.

Keywords: mung bean vermicelli, drying, hot air, microwave continuous, microwave vacuum

Procedia PDF Downloads 65
32906 Linking Work-Family Enrichment and Innovative Workplace Behavior: The Mediating Role of Positive Emotions

Authors: Nidhi Bansal, Upasna Agarwal

Abstract:

Innovation is a key driver of economic growth and well-being in developed as well as emerging economies like India, yet very few studies have examined the relationship between innovative workplace behavior (IWB) and work-family enrichment (WFE). The present study therefore examines the relationship between WFE and IWB and whether it is mediated by positive emotions; social exchange theory and the broaden-and-build theory explain the proposed relationships. Data were collected from 250 full-time dual-working parents in different Indian organizations through a survey questionnaire, with a snowball technique used to approach respondents. Mediation was assessed through the PROCESS macro (Hayes, 2012) in SPSS. Correlational analysis showed that all three variables were significantly and positively related. Further analysis suggests that work-family enrichment is significantly related to innovative workplace behavior and that this relationship is partially mediated by positive emotions. A cross-sectional design, the use of self-reported questions, and data collected only from dual-working parents are limitations of the study. This is one of the few studies to examine innovative workplace behavior in response to work-family enrichment, and the first to examine the mediating effect of emotions between these two variables.
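The mediation test behind the PROCESS macro is, at its core, a bootstrap confidence interval for the indirect effect a×b (the X→M slope times the M→Y slope controlling for X). The sketch below reproduces that logic on synthetic data with n = 250 (the study's sample size); the variable names, effect sizes, and data are invented for illustration, and the study's actual data are not available.

```python
import numpy as np

def ols_slope(x, y):
    # slope of y on x (with intercept), via least squares
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def indirect_effect(x, m, y):
    a = ols_slope(x, m)                           # X -> M path
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]   # M -> Y, controlling for X
    return a * b

def bootstrap_ci(x, m, y, n_boot=2000, seed=0):
    # percentile bootstrap CI for the indirect effect, as in PROCESS
    rng = np.random.default_rng(seed)
    n = len(x)
    est = [indirect_effect(*(v[idx] for v in (x, m, y)))
           for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    return np.percentile(est, [2.5, 97.5])

rng = np.random.default_rng(42)
n = 250                                   # sample size matching the study
x = rng.normal(size=n)                    # work-family enrichment (synthetic)
m = 0.5 * x + rng.normal(size=n)          # positive emotions (mediator)
y = 0.4 * m + 0.2 * x + rng.normal(size=n)  # innovative workplace behavior
lo, hi = bootstrap_ci(x, m, y)
print(lo > 0)   # a CI excluding zero indicates mediation
```

The direct path coefficient (0.2 here) remaining nonzero alongside a significant indirect effect corresponds to the partial mediation reported in the abstract.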

Keywords: dual working parents, emotions, innovative workplace behavior, work-family enrichment

Procedia PDF Downloads 240
32905 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures.

The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows.

Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 53
32904 MiR-103 Inhibits Osteoblast Proliferation Mainly through Suppressing Cav 1.2 Expression in Simulated Microgravity

Authors: Zhongyang Sun, Shu Zhang, Manjiang Xie

Abstract:

Emerging evidence indicates that microRNAs (miRNAs) play important roles in modulating osteoblast function and bone formation. However, the influence of miRNAs on osteoblast proliferation and the possible underlying mechanisms remain to be defined. In this study, we aimed to investigate whether miR-103 regulates osteoblast proliferation under simulated microgravity through regulation of Cav1.2, the primary subunit of L-type voltage-sensitive calcium channels (LTCCs). We first investigated the effect of simulated microgravity on osteoblast proliferation, and the outcomes clearly demonstrated that this mechanical unloading inhibits the proliferation of MC3T3-E1 osteoblast-like cells. Using quantitative real-time PCR (qRT-PCR), we provided data showing that miR-103 was up-regulated in response to simulated microgravity. In addition, we observed that up-regulation of miR-103 inhibited, and down-regulation of miR-103 promoted, osteoblast proliferation under simulated microgravity. Furthermore, knocking down or over-expressing miR-103 respectively up- or down-regulated the level of Cav1.2 expression and the LTCC currents, suggesting that miR-103 acts as an endogenous attenuator of Cav1.2 in osteoblasts under simulated microgravity. More importantly, we showed that the effect of miR-103 on osteoblast proliferation under simulated microgravity was diminished when the miR-103 mimic or inhibitor was co-transfected with Cav1.2 siRNA. Taken together, our data suggest that miR-103 inhibits osteoblast proliferation mainly through suppression of Cav1.2 expression under simulated microgravity. This work may provide a novel mechanism for the detrimental effects of microgravity on osteoblasts, identifying miR-103 as a possible novel therapeutic target in bone remodeling disorders under mechanical unloading.

Keywords: microRNA, osteoblasts, cell proliferation, Cav1.2, simulated microgravity

Procedia PDF Downloads 350
32903 The Effect of the Earthworm (Lumbricus rubellus) as the Source of Protein Feed and Pathogen Antibacterial for Broiler

Authors: Waode Nurmayani, Nikmatul Riswanda

Abstract:

Broilers are chickens raised to reach a good body weight in the most efficient time possible. To achieve this, producers improve the feed and use antibiotics. Feed is the largest expense: nearly 80% of production cost is spent on feed alone. The earthworm (Lumbricus rubellus) is a good choice for reducing the cost of protein feed sources. The earthworm has a high crude protein content of about 48.5%-61.9% and is rich in the amino acid proline, which makes up about 15% of its 62 amino acids. Beyond protein, the earthworm also plays a role in disease prevention. Disease prevention in livestock is usually achieved with feed supplements, and the earthworm (Lumbricus rubellus) is one natural material used as feed. Several earthworm species are known to contain active substances against pathogenic bacteria, notably Lumbricus rubellus, which can act as an antibiotic because it contains the active substance lumbricin. Feed made from Lumbricus rubellus could therefore improve broiler performance. The antibacterial bioactive lumbricin is able to inhibit the growth of pathogenic bacteria on the intestinal wall, thereby reducing the population of pathogenic bacteria. The method used in this scientific paper comprises three stages: data collection, data analysis, and synthesis drawn from the literature on the earthworm (Lumbricus rubellus) as broiler feed. It is expected that this earthworm (Lumbricus rubellus) feed innovation could reduce both the cost of protein feed and the use of chemical antibiotics.

Keywords: earthworm, broiler, protein, antibiotic

Procedia PDF Downloads 140
32902 A Case-Series Analysis of Tuberculosis in Patients at an Internal Medicine Department

Authors: Cherif Y., Ghariani R., Derbal S., Farhati S., Ben Dahmen F., Abdallah M.

Abstract:

Introduction: Tuberculosis (TBC) is a frequent infection and still a major public health problem in Tunisia. The aim of this work is to describe the diagnostic and therapeutic characteristics of TBC in patients referred to our internal medicine department. Patients and Methods: This was a retrospective, descriptive study of a cohort of consecutive cases treated from January 2016 to December 2019, including patients with latent or patent TBC. Twenty-eight medical records of adults diagnosed with TBC were reviewed. Results: Twenty-eight patients, 18 women and 10 men, were diagnosed with TBC. Their mean age was 48 years (range: 22-78 years). Five patients had a medical history of diabetes mellitus, one patient was followed for systemic lupus erythematosus treated with corticosteroids and immunosuppressant drugs, and another was treated with corticosteroids for Mac Duffy syndrome. TBC was latent in 12 cases and patent in 16 cases. The most common symptoms were fever and weight loss, found in 10 cases, followed by cough in 2 cases, sputum in 3 cases, lymph nodes in 4 cases, erythema nodosum in 2 cases, and neurological signs in 3 cases. Lymphopenia was noted in 3 cases and a biological inflammatory syndrome in 18 cases. The purified protein derivative reaction was positive in 17 cases, anergic in 3 cases, negative in 5 cases, and not done in 3 cases. Acid-fast bacilli culture was strongly positive in one patient. The histopathological study was conclusive in 11 patients and showed granulomatosis with caseous necrosis. TBC was pulmonary in 7 patients, lymph node in 7 cases, peritoneal in 7 cases, digestive in 1 case, neuromeningeal in 3 cases, and thyroid in 1 case. Seven patients had multifocal TBC. All patients received anti-tuberculosis treatment for a mean duration of 8 months, with no failure or relapse over an average follow-up of 10.58 months.
Conclusion: Diagnosis and management of TBC remain essential to avoid serious complications. Surveillance is necessary to ensure timely detection and treatment of infected adults and to decrease incidence. The best treatment remains prevention, through vaccination and the improvement of social and economic conditions.

Keywords: tuberculosis, infection, autoimmune disease, granulomatosis

Procedia PDF Downloads 173
32901 Scattered Places in Stories: Singularity and Pattern in Geographic Information

Authors: I. Pina, M. Painho

Abstract:

Increased knowledge about the nature of place, and the conditions under which space becomes place, is a key factor for better urban planning and place-making. Although there is broad consensus on the relevance of this knowledge, difficulties remain in relating the theoretical framework on place to urban management. Issues related to the representation of places are among the greatest obstacles to overcoming this gap. In this critical discussion, based on a literature review, we explore, within a common framework for geographical analysis, the potential of stories to spell out place meanings, bringing together qualitative text analysis and text mining in order to capture and represent both the singularity contained in each person's life history and the patterns of social processes that shape places. The development of this reasoning is grounded in the extensive geographical thought about place and in theoretical advances in the field of Geographic Information Science (GISc).

Keywords: discourse analysis, geographic information science, place, place-making, stories

Procedia PDF Downloads 177
32900 Artificial Intelligence and Governance in Relevance to Satellites in Space

Authors: Anwesha Pathak

Abstract:

With the increasing number of satellites and space debris, space traffic management (STM) becomes crucial. AI can aid STM by predicting and preventing potential collisions, optimizing satellite trajectories, and managing orbital slots. Governance frameworks need to address the integration of AI algorithms in STM to ensure safe and sustainable satellite activities. AI technologies, such as machine learning and computer vision, can be utilized to process the vast amounts of data received from satellites: AI algorithms can analyze satellite imagery, detect patterns, and extract valuable information for applications like weather forecasting, urban planning, agriculture, disaster management, and environmental monitoring. AI can also assist in automating and optimizing satellite operations. Autonomous decision-making systems can be developed to handle routine tasks like orbit control, collision avoidance, and antenna pointing; these systems can improve efficiency, reduce human error, and enable real-time responsiveness in satellite operations. AI technologies can further be leveraged to enhance the security of satellite systems, analyzing satellite telemetry data to detect anomalies, identify potential cyber threats, and mitigate vulnerabilities. Governance frameworks should encompass regulations and standards for securing satellite systems against cyberattacks and ensuring data privacy. AI can optimize resource allocation and utilization in satellite constellations: by analyzing user demands, traffic patterns, and satellite performance data, AI algorithms can dynamically adjust the deployment and routing of satellites to maximize coverage and minimize latency. Governance frameworks need to ensure fair and efficient resource allocation among satellite operators to avoid monopolistic practices.
Satellite activities involve multiple countries and organizations. Governance frameworks should therefore encourage international cooperation, information sharing, and standardization to address common challenges, ensure interoperability, and prevent conflicts. AI can facilitate cross-border collaboration by providing data analytics and decision-support tools for shared satellite missions and data-sharing initiatives. In short, AI and governance are critical aspects of satellite activities in space: together they enable efficient and secure operations, ensure responsible and ethical use of AI technologies, and promote international cooperation for the benefit of all stakeholders in the satellite industry.

Keywords: satellite, space debris, traffic, threats, cyber security

Procedia PDF Downloads 54
32899 Achieving Sustainable Rapid Construction Using Lean Principles

Authors: Muhamad Azani Yahya, Vikneswaran Munikanan, Mohammed Alias Yusof

Abstract:

There is a need for a holistic approach to achieving sustainable construction in contemporary practice. Sustainable construction is a practice that preserves the environment, both economically and socially, through responsibility, resource management, and supported maintenance. This paper shows the correlation between achieving rapid construction and sustainability concepts using lean principles. Lean principles are widely used in the manufacturing industry, but this research demonstrates the principles in building construction. Lean principles offer the benefits of stabilizing workflow and eliminating unnecessary work, thereby contributing to time and waste reduction. The correlation shows that the pull factor improves the progress curve and stabilizes the time-quality relation. The findings show that lean principles deliver the elements of rapid construction synchronized with the elements of sustainability.

Keywords: sustainable construction, rapid construction, time reduction, lean construction

Procedia PDF Downloads 225
32898 Applying Participatory Design for the Reuse of Deserted Community Spaces

Authors: Wei-Chieh Yeh, Yung-Tang Shen

Abstract:

The concept of community building started in 1994 in Taiwan. After years of development, it fostered the notion of active local resident participation in community issues as co-operators, rather than subordinates. Participatory design gives participants more control in the decision-making process, helps to reduce the friction caused by arguments, and assists in bringing different parties to consensus, increasing the efficiency of projects run in the community. The participation of local residents is therefore key to the success of community building. This study applied participatory design to develop plans for the reuse of deserted spaces in the community, from the first stage of brainstorming design ideas and making creative models, through to the final stage of construction. Through a series of participatory design activities, it aimed to integrate the different opinions of residents, develop a sense of belonging, and reach a consensus. Beyond this, it also aimed at building residents' awareness of their responsibilities for the environment and related issues of sustainable development. By reviewing relevant literature and the history of related studies, the study formulated a theory, and took the "2012-2014 Changhua County Community Planner Counseling Program" as a case study to investigate the implementation process of participatory design. Research data were collected by document analysis, participant observation, and in-depth interviews. After examining the three elements of "Design Participation", "Construction Participation", and "Follow-up Maintenance Participation" in the case, the study reached a promising conclusion: maintenance works were carried out better than in common public works, and at lower cost. Moreover, the works in which residents were involved were more creative and, most importantly, the community's character could be easily recognized in them.

Keywords: participatory design, deserted space, community building, reuse

Procedia PDF Downloads 350
32897 Development of an in vitro Fermentation Chicken Ileum Microbiota Model

Authors: Bello Gonzalez, Setten Van M., Brouwer M.

Abstract:

The chicken small intestine is a dynamic and complex organ in which enzymatic digestion and absorption of nutrients take place. An in vitro fermentation model of the chicken small intestine could be used as an alternative to explore the interaction between the microbiota and nutrient metabolism and to enhance the efficacy of targeted interventions to improve animal health. In the present study, we developed an in vitro fermentation chicken ileum microbiota model for unraveling the complex interactions of the ileum microbial community under physiological conditions. A two-vessel continuous fermentation process simulating in real time the physiological conditions of the ileum content (pH, temperature, microaerophilic/anoxic conditions, and peristaltic movements) has been standardized as a proof of concept. As inoculum, we used a pooled ileum microbial community obtained from broiler chickens at 14 days of age. The development and validation of the model provide insight into the initial characterization of the ileum microbial community and its dynamics over time in relation to nutrient assimilation and fermentation. Samples can be collected at different time points and used to determine microbial compositional structure, dynamics, and diversity over time. The results of studies using this in vitro model will serve as the foundation for the development of a whole small-intestine in vitro fermentation model of the chicken gastrointestinal tract, complementing our established in vitro fermentation chicken caeca model. The insight gained from this model could inform nutritional strategies to restore and maintain chicken gut homeostasis. Moreover, the model will also allow us to study relationships between gut microbiota composition and its dynamics over time in association with nutrients, antimicrobial compounds, and disease modelling.

Keywords: broilers, in vitro model, ileum microbiota, fermentation

Procedia PDF Downloads 32
32896 Argos-Linked Fastloc GPS Reveals the Resting Activity of Migrating Sea Turtles

Authors: Gail Schofield, Antoine M. Dujon, Nicole Esteban, Rebecca M. Lester, Graeme C. Hays

Abstract:

Variation in diel movement patterns during migration provides information on the strategies used by animals to maximize energy efficiency and ensure the successful completion of migration. For instance, many flying and land-based terrestrial species stop to rest and refuel at regular intervals along the migratory route, or at transitory 'stopover' sites, depending on resource availability. However, in cases where stopping is not possible (such as over or through deep open oceans, or over deserts and mountains), non-stop travel is required, and animals need to develop strategies to rest while actively traveling. Recent advances in biologging technologies have identified mid-flight micro-sleeps in swifts over Africa during the 10-month non-breeding period, and the use of lateralized sleep behavior in orca and bottlenose dolphins during migration. Here, highly accurate locations obtained by Argos-linked Fastloc-GPS transmitters on adult green (n=8 turtles, 9487 locations) and loggerhead (n=46 turtles, 47,588 locations) sea turtles migrating around a thousand kilometers (over several weeks) from breeding to foraging grounds across the Indian and Mediterranean oceans were used to identify potential resting strategies. Stopovers were documented for only seven turtles, lasting up to 6 days; thus, this strategy was not commonly used, possibly due to the lack of potential 'shallow' (< 100 m seabed depth) sites along the routes. However, observations of day versus night travel speed indicated that turtles might use other mechanisms to rest. For instance, turtles traveled on average 31% slower at night than by day during oceanic crossings. Slower night-time travel might be explained by turtles swimming in a less direct line at night and/or making deeper dives that reduce their forward motion, as indicated by studies using Argos-linked transmitters and accelerometers.
Furthermore, within the first 24 h of entering waters shallower than 100 m towards the end of migration (the depth at which sea turtles can swim to and rest on the seabed), some individuals travelled 72% slower at night, repeating this behavior intermittently (each time for a one-night duration, at 3-6-day intervals) until reaching the foraging grounds. If the turtles were in fact resting on the seabed at this point, they could be inactive for up to 8 hours, allowing protracted periods of rest after several weeks of constant swimming. Turtles might not rest every night once within these shallower depths, due to the time constraints of reaching the foraging grounds and restoring depleted energetic reserves (as capital breeders, sea turtles tend not to feed for several months during migration to and from the breeding grounds and while breeding). In conclusion, access to data-rich, highly accurate Argos-linked Fastloc-GPS provided information about differences in day versus night activity at different stages of migration, allowing us, for the first time, to compare the strategies used by a marine vertebrate with those of terrestrial land-based and flying species. However, the question of what resting strategies are used by individuals that remain in oceanic waters to forage remains open; answering it will require combinations of highly accurate Argos-linked Fastloc-GPS transmitters and accelerometry or time-depth recorders deployed on sufficient numbers of individuals.
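The day-versus-night speed contrast reported here can be computed directly from time-stamped GPS fixes. The following is a minimal illustrative sketch; the synthetic track, the fixed 20:00-06:00 night window, and the drift rates are assumptions for demonstration, not data or definitions from the study:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in degrees."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def day_night_speeds(fixes):
    """fixes: time-ordered (hours_since_start, lat, lon) tuples.
    A segment counts as 'night' if its midpoint falls in 20:00-06:00."""
    day, night = [], []
    for (t1, la1, lo1), (t2, la2, lo2) in zip(fixes, fixes[1:]):
        speed = haversine_km(la1, lo1, la2, lo2) / (t2 - t1)  # km/h
        mid = ((t1 + t2) / 2) % 24
        (night if (mid >= 20 or mid < 6) else day).append(speed)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(day), mean(night)

# Synthetic 48 h track heading due north: ~2 km/h by day, slower by night.
fixes, lat = [], 35.0
for h in range(49):
    fixes.append((float(h), lat, 25.0))
    lat += 0.012 if (h % 24 >= 20 or h % 24 < 6) else 0.018  # deg latitude/hour

day_speed, night_speed = day_night_speeds(fixes)
```

With real tracks, the same per-segment calculation, binned by local solar time rather than a fixed clock window, yields the percentage differences quoted in the abstract.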

Keywords: argos-linked fastloc GPS, data loggers, migration, resting strategy, telemetry

Procedia PDF Downloads 137
32895 Hedgerow Detection and Characterization Using Very High Spatial Resolution SAR Data

Authors: Saeid Gharechelou, Stuart Green, Fiona Cawkwell

Abstract:

Hedgerows play an important role in a wide range of ecological habitats, in landscape and agricultural management, in carbon sequestration, and in wood production. Detecting hedgerows accurately from satellite imagery is a challenging remote sensing problem: spatially, a hedge is very similar to a linear object such as a road, while from a spectral viewpoint, a hedge is very similar to a forest. Remote sensors with very high spatial resolution (VHR) now enable the automatic detection of hedges through images with sufficient spectral and spatial resolution. Indeed, VHR remote sensing data have recently provided the opportunity to detect hedgerows as line features, but difficulties remain in characterizing them at the landscape scale. This research uses TerraSAR-X Spotlight and Staring mode data with 3-5 m resolution, acquired in the wet and dry seasons of 2014-2015, over a test site at Fermoy, Ireland, to detect hedgerows. Dual-polarization (HH/VV) Spotlight data are used for the detection. Varied SAR image techniques are explored by trial and error, integrating classification algorithms such as texture analysis, support vector machines, k-means, and random forest, to detect hedgerows and characterize them. We apply Shannon entropy (ShE) and backscattering analysis of single and double bounce in the polarimetric analysis to drive the object-oriented classification and finally extract the hedgerow network. The work is still in progress, and other methods remain to be applied to find the best approach for the study area. Here we present preliminary results indicating that polarimetric TSX imagery can potentially detect hedgerows.
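One of the discriminators used here, Shannon entropy, measures how mixed the scattering within an image object is. The snippet below is a generic illustrative sketch of that measure only (not the full polarimetric ShE decomposition of TerraSAR-X products), and the example eigenvalue sets are assumed values:

```python
import math

def shannon_entropy(values, base=2.0):
    """Shannon entropy of non-negative values (e.g. polarimetric eigenvalues
    or channel intensities), normalized to pseudo-probabilities."""
    total = sum(values)
    probs = [v / total for v in values if v > 0]
    return -sum(p * math.log(p, base) for p in probs)

# A single dominant scattering mechanism (e.g. a smooth road) gives low entropy,
low = shannon_entropy([0.98, 0.01, 0.01])
# while mixed, volume-like scattering (e.g. a hedge canopy) gives high entropy.
high = shannon_entropy([0.40, 0.30, 0.30])
```

Thresholding an entropy image of this kind, alongside texture and backscatter features, is one way a linear woody feature can be separated from a road of similar shape.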

Keywords: TerraSAR-X, hedgerow detection, high resolution SAR image, dual polarization, polarimetric analysis

Procedia PDF Downloads 222
32894 Minimization of Denial of Services Attacks in Vehicular Adhoc Networking by Applying Different Constraints

Authors: Amjad Khan

Abstract:

The security of vehicular ad hoc networking is of great importance, as failures can involve serious threats to life. Conventional security systems are not enough to provide secure communication among vehicles on the road. It is necessary to prevent the waste of network resources and to protect them against malicious nodes, so as to ensure data and bandwidth availability to the legitimate nodes of the network. This work provides a non-conventional security system by introducing constraints that minimize denial of services (DoS) attacks, especially against data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any test fails, the node drops those data packets and does not forward them any further. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively identify whether the claim is true or false by using these constraints. Consequently, the DoS attack is minimized by the instant availability of data without wasting network resources.
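The sequence of per-packet tests described here can be sketched as a chain of constraint checks that drops a packet on the first failure. This is a hypothetical, minimal sketch: the specific constraints (hop limit, packet size, and consistency of a claimed "nearest node" distance) and their thresholds are illustrative assumptions, not the paper's actual rule set:

```python
from dataclasses import dataclass

@dataclass
class Packet:
    sender: str
    hop_count: int
    claimed_distance_m: float   # sender's claimed distance to the destination
    measured_distance_m: float  # distance estimated independently (signal/GPS)
    size_bytes: int

MAX_HOPS = 10        # assumed hop budget
MAX_SIZE = 2048      # assumed maximum packet size in bytes
TOLERANCE_M = 50.0   # assumed tolerance on position claims

def check_hop_limit(p):
    return p.hop_count <= MAX_HOPS

def check_size(p):
    return p.size_bytes <= MAX_SIZE

def check_position_claim(p):
    # A node claiming to be "nearest" must have a claim consistent
    # with the independently measured distance.
    return abs(p.claimed_distance_m - p.measured_distance_m) <= TOLERANCE_M

CHECKS = [check_hop_limit, check_size, check_position_claim]

def forward(packet):
    """Forward only if every test passes; drop on the first failure."""
    return all(check(packet) for check in CHECKS)
```

Dropping on the first failed check keeps bogus packets from consuming downstream bandwidth, which is precisely the resource the constraints are meant to protect.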

Keywords: black hole attack, grey hole attack, intransient traffic tempering, networking

Procedia PDF Downloads 269
32893 Restoration and Conservation of Historical Textiles Using Covalently Immobilized Enzymes on Nanoparticles

Authors: Mohamed Elbehery

Abstract:

Historical textiles, whether in the burial environment or in museums, are exposed to many types of stains and dirt that bind to the textiles through multiple chemical bonds and cause damage. The cleaning process must therefore be carried out with great care, causing no irreversible damage, and deposits must be removed without affecting the original material of the surface being cleaned. Science and technology continue to provide innovative systems for the bio-cleaning (using pure enzymes) of historical textiles and artistic surfaces. In this work, lipase and α-amylase were covalently immobilized on an alginate/κ-carrageenan nanoparticle complex and used in cleaning historical textiles. Nanoparticle preparation, activation, and enzyme immobilization were characterized, and the loading time and units of the two enzymes were optimized. The optimum loading time and units for α-amylase were 4 h and 25 U, respectively, while for lipase they were 3 h and 15 U. The fibers were examined using a scanning electron microscope equipped with an energy-dispersive X-ray unit (SEM-EDX).

Keywords: nanoparticles, enzymes, immobilization, textiles

Procedia PDF Downloads 83
32892 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya

Authors: Abdalla Abdelnabi, Yousf Abushalah

Abstract:

The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation in the SM Field area. The data include a three-dimensional seismic survey acquired in 2009, covering approximately 75 mi², with more than 9 wells penetrating the reservoir. The seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data are used for the petrophysical analysis of the SM Field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie's formula; the average water saturation for the whole field is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells using well data and structural maps created from the seismic data revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwest and south parts. The gas-water contact is found at 4860 ft using the resistivity log. Both the trapezoidal and pyramidal rules were applied to the net isopach map to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 billion standard cubic feet (BSCF) and 630 BSCF, respectively.
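The petrophysical and volumetric steps named here (Archie water saturation, trapezoidal and pyramidal bulk-volume rules, and volumetric gas in place) can be sketched as below. The input values in the example are illustrative assumptions, not the SM Field parameters; only the 24% porosity echoes the abstract:

```python
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Water saturation from Archie's equation: Sw = ((a*Rw) / (phi**m * Rt))**(1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def trapezoidal_volume(a1, a2, h):
    """Bulk volume between two isopach contour areas by the trapezoidal rule."""
    return h * (a1 + a2) / 2.0

def pyramidal_volume(a1, a2, h):
    """Pyramidal rule, used when successive contour areas differ strongly."""
    return h * (a1 + a2 + (a1 * a2) ** 0.5) / 3.0

def gas_in_place_scf(area_acres, thickness_ft, phi, sw, bg):
    """Volumetric OGIP in standard cubic feet: 43,560 * A * h * phi * (1 - Sw) / Bg."""
    return 43560.0 * area_acres * thickness_ft * phi * (1.0 - sw) / bg

# Illustrative inputs only: Rw = 0.03 ohm-m, Rt = 20 ohm-m, porosity 24%.
sw = archie_sw(rw=0.03, rt=20.0, phi=0.24)
ogip = gas_in_place_scf(area_acres=1000.0, thickness_ft=100.0, phi=0.24, sw=sw, bg=0.004)
```

With areas taken from the net isopach map and a gas formation volume factor Bg evaluated at reservoir conditions, the same calls replace these placeholder inputs.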

Keywords: 3D seismic data, well logging, petrel, kingdom suite

Procedia PDF Downloads 138
32891 A New Approach for Generalized First Derivative of Nonsmooth Functions Using Optimization

Authors: Mohammad Mehdi Mazarei, Ali Asghar Behroozpoor

Abstract:

In this paper, we define an optimization problem corresponding to smooth and nonsmooth functions whose optimal solution is the first derivative of these functions on a domain. For this purpose, a linear programming problem corresponding to the optimization problem is obtained. The optimal solution of this linear programming problem is the approximate generalized first derivative. In effect, we approximate the generalized first derivative of nonsmooth functions as a Taylor series. We show the efficiency of our approach on some smooth and nonsmooth functions in several examples.
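As a hedged illustration of the general idea of recovering a derivative as the optimum of a linear program, the one-dimensional case can be reduced to a least-absolute-deviation slope fit, whose LP optimum is the weighted median of the difference quotients. This is our own minimal reconstruction of the idea, not the authors' formulation:

```python
def lad_slope(f, x0, h=0.05, n=20):
    """Approximate a (generalized) first derivative of f at x0 by minimizing
    sum_i |f(x_i) - f(x0) - c*(x_i - x0)| over the slope c.  This objective
    is a linear program in c; its optimum is the weighted median of the
    difference quotients q_i, with weights |x_i - x0|."""
    xs = [x0 + h * k / n for k in range(-n, n + 1) if k != 0]
    quotients = [((f(x) - f(x0)) / (x - x0), abs(x - x0)) for x in xs]
    quotients.sort()  # sort by quotient value
    half = sum(w for _, w in quotients) / 2.0
    acc = 0.0
    for q, w in quotients:
        acc += w
        if acc >= half:
            return q
    return quotients[-1][0]

# Smooth case: d/dx x^2 at x = 1 is 2; the fit recovers it to within O(h/n).
slope_smooth = lad_slope(lambda x: x * x, 1.0)
# Nonsmooth case: |x| at x = 0.5 (away from the kink) has derivative 1.
slope_abs = lad_slope(abs, 0.5)
```

At a kink (e.g. |x| at x = 0), any slope in the subdifferential minimizes the objective, which is why the LP view naturally yields a *generalized* derivative.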

Keywords: general derivative, linear programming, optimization problem, smooth and nonsmooth functions

Procedia PDF Downloads 541