Search results for: decentralized data platform
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26131

22171 Multi-Class Text Classification Using Ensembles of Classifiers

Authors: Syed Basit Ali Shah Bukhari, Yan Qiang, Saad Abdul Rauf, Syed Saqlaina Bukhari

Abstract:

Text classification is the task of assigning a given text to the appropriate category from a predefined set of categories. Achieving this requires a suitable combination of pre-processing, feature selection, and classification techniques. In this paper, we apply different ensemble techniques, while varying the feature selection parameters, to observe the effect on overall accuracy as well as on class-level measures such as the precision of each individual category. After subjecting the data to pre-processing and feature selection, individual classifiers were tested first; the classifiers were then combined into ensembles to increase accuracy. We also studied the impact of reducing the number of classification categories on overall accuracy. Text classification is widely used in sentiment analysis on social media sites such as Twitter to gauge people's opinions about a cause, and to analyze customer reviews of products or services. Opinion mining is a vital task in data mining, and text categorization is its backbone.
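
A minimal sketch of such an ensemble pipeline, in Python with scikit-learn. The tiny corpus, labels, and parameter values are illustrative placeholders, not the paper's data; varying k in SelectKBest corresponds to the feature selection variation described above:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.pipeline import Pipeline

docs = ["great product, works well", "terrible service, very slow",
        "awful quality, broke fast", "excellent support, love it"]
labels = [1, 0, 0, 1]  # placeholder categories (1 = positive, 0 = negative)

for name, ensemble in [("bagging", BaggingClassifier(MultinomialNB(), n_estimators=25)),
                       ("adaboost", AdaBoostClassifier(n_estimators=50))]:
    pipe = Pipeline([
        ("tfidf", TfidfVectorizer()),        # pre-processing
        ("select", SelectKBest(chi2, k=5)),  # feature selection; vary k
        (name, ensemble),
    ])
    pipe.fit(docs, labels)
    print(name, pipe.predict(["slow and terrible", "works great"]))
```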

Keywords: Natural Language Processing, Ensemble Classifier, Bagging Classifier, AdaBoost

Procedia PDF Downloads 227
22170 Investigating the Relationship between Growth, Beta and Liquidity

Authors: Zahra Amirhosseini, Mahtab Nameni

Abstract:

The aim of this study was to investigate the relationship between growth, beta, and a company's cash holdings. Cash holdings serve as the dependent variable, with growth opportunity and beta as independent variables. The study is based on an analysis of panel data. The population comprises companies listed on the Tehran Stock Exchange; financial data of 215 companies over the period 2010 to 2015 were selected as the sample through systematic sampling. The results for the first hypothesis showed a significant relationship between growth opportunities and cash holdings. The analysis for the second hypothesis showed an inverse relationship between company risk and cash holdings.
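
A hedged sketch of this kind of panel regression, using statsmodels with firm dummies as a simple fixed-effects specification. The data frame and variable values are invented placeholders, not the study's data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# placeholder panel: firm id, year, and the three study variables
df = pd.DataFrame({
    "firm":   ["A", "A", "B", "B", "C", "C"],
    "year":   [2010, 2011, 2010, 2011, 2010, 2011],
    "cash":   [0.12, 0.15, 0.08, 0.07, 0.20, 0.22],
    "growth": [1.4, 1.6, 0.9, 0.8, 2.1, 2.3],
    "beta":   [1.1, 1.0, 1.3, 1.4, 0.7, 0.6],
})

# C(firm) adds firm dummies, i.e. a basic fixed-effects panel regression
model = smf.ols("cash ~ growth + beta + C(firm)", data=df).fit()
print(model.params)  # signs on growth and beta mirror the two hypotheses
```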

Keywords: growth, beta, liquidity, company

Procedia PDF Downloads 391
22169 Rejuvenating a Space into World Class Environment through Conservation of Heritage Architecture

Authors: Abhimanyu Sharma

Abstract:

India is known for its cultural heritage and is rich in diversity along its length and breadth. The state of Jammu & Kashmir is world famous for the beautiful tourist destinations of its Kashmir region, yet equally deserving destinations are located in the Jammu region of the state. For most of the last 50-60 years, the prime focus of development was centered on the Kashmir region, but with ever-increasing globalization the focus is now decentralizing throughout the country, and the potential of the Jammu region needs to be placed on the world tourist map. One such spot is 'Mubarak Mandi', the palace and royal residence of the Maharajas of Jammu & Kashmir of the Dogra dynasty, located in the heart of Jammu city (the winter capital of the state). Although the place is of heritage importance, it lacks the supporting infrastructure to attract national tourists in general and international tourists at large. For such places, conservation and restoration of the existing structures are the principal tools for overcoming their present limitations. This paper targets the rejuvenation of the site through effective and dynamic conservation techniques. It deals with developing and restoring the areas within the whole campus with appropriate building materials, conservation techniques, etc., to attract a greater number of visitors by developing it into a priority tourist attraction. The major thrust is on studying the criteria for developing the place, considering the psychological effect needed to create a socially interactive environment. Additionally, the thrust is on the spatial elements that will aid in creating a common platform for all kinds of tourists. Accordingly, conservation guidelines (or a model) are proposed through this paper so that the Jammu region contributes to the country's tourist numbers as much as the Kashmir region does.

Keywords: conservation, heritage architecture, rejuvenating, restoration

Procedia PDF Downloads 293
22168 Social Media Mining with R: Twitter Analyses

Authors: Diana Codat

Abstract:

Tweets' analysis is part of text mining. Each document is a written text. It's possible to apply the usual text search techniques, in particular by switching to the bag-of-words representation. But the tweets induce peculiarities. Some may enrich the analysis. Thus, their length is calibrated (at least as far as public messages are concerned), special characters make it possible to identify authors (@) and themes (#), the tweet and retweet mechanisms make it possible to follow the diffusion of the information. Conversely, other characteristics may disrupt the analyzes. Because space is limited, authors often use abbreviations, emoticons to express feelings, and they do not pay much attention to spelling. All this creates noise that can complicate the task. The tweets carry a lot of potentially interesting information. Their exploitation is one of the main axes of the analysis of the social networks. We show how to access Twitter-related messages. We will initiate a study of the properties of the tweets, and we will follow up on the exploitation of the content of the messages. We will work under R with the package 'twitteR'. The study of tweets is a strong focus of analysis of social networks because Twitter has become an important vector of communication. This example shows that it is easy to initiate an analysis from data extracted directly online. The data preparation phase is of great importance.

Keywords: data mining, language R, social networks, Twitter

Procedia PDF Downloads 179
22167 Study on Seismic Assessment of Earthquake-Damaged Reinforced Concrete Buildings

Authors: Fu-Pei Hsiao, Fung-Chung Tu, Chien-Kuo Chiu

Abstract:

In this work, to develop a method for the detailed assessment of the post-earthquake seismic performance of RC buildings in Taiwan, experimental data for several column specimens with various failure modes (flexural failure, flexural-shear failure, and shear failure) are used to derive seismic capacity reduction factors for specified damage states. According to the damage states of RC columns and the corresponding seismic reduction factors suggested by the experimental data, this work applies the detailed seismic performance assessment method to identify the seismic capacity of earthquake-damaged RC buildings. Additionally, a post-earthquake emergency assessment procedure is proposed that can provide the data needed for decisions about earthquake-damaged buildings in regions with high seismic hazard. Finally, three earthquake-damaged school buildings in Taiwan are used as a case study to demonstrate the application of the proposed assessment method.
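
An illustrative sketch of the reduction-factor idea described above: each damaged column's seismic capacity is scaled by a factor tied to its failure mode and damage state, and a building-level residual capacity ratio follows. The factor values and capacities below are invented placeholders, not the experimentally derived values of the paper:

```python
REDUCTION = {  # (failure mode, damage state) -> assumed residual-capacity factor
    ("flexural", "moderate"): 0.8,
    ("flexural-shear", "moderate"): 0.6,
    ("shear", "severe"): 0.3,
}

columns = [  # (failure mode, damage state, pre-earthquake capacity in kN)
    ("flexural", "moderate", 400.0),
    ("flexural-shear", "moderate", 350.0),
    ("shear", "severe", 300.0),
]

original = sum(c for _, _, c in columns)
residual = sum(REDUCTION[(m, s)] * c for m, s, c in columns)
print("residual seismic capacity ratio:", residual / original)
```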

Keywords: seismic assessment, seismic reduction factor, residual seismic ratio, post-earthquake, reinforced concrete, building

Procedia PDF Downloads 395
22166 Deep Learning for Recommender System: Principles, Methods and Evaluation

Authors: Basiliyos Tilahun Betru, Charles Awono Onana, Bernabe Batchakui

Abstract:

Recommender systems have become increasingly popular in recent years and are utilized in numerous areas. Nowadays, many web services provide a wealth of information to users, and recommender systems have been developed as a critical element of these web applications to predict user preferences and provide meaningful recommendations. Given the strength of deep learning in modeling different types of data, and because user preferences change dynamically, a deep model can better capture user demand and further improve the quality of recommendation. In this paper, deep neural network models for recommender systems are evaluated. Most deep neural network models for recommender systems focus on the classical collaborative filtering user-item setting. Deep learning models have demonstrated that high-level features of complex data can be learned instead of relying on metadata, which can significantly improve the accuracy of recommendation. Even though deep learning has made a great impact in various areas, its application to recommender systems has not been fully exploited, and many improvements remain possible in both collaborative and content-based approaches, particularly when different contextual factors are considered.
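
A minimal sketch of the classical collaborative-filtering user-item setting mentioned above: matrix factorization trained by stochastic gradient descent. A deep model would replace the dot product with a neural network; the ratings here are toy data:

```python
import numpy as np

R = np.array([[5, 3, 0, 1],   # user-item ratings, 0 = unobserved
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)

n_users, n_items, k = R.shape[0], R.shape[1], 2
rng = np.random.default_rng(0)
P = rng.normal(scale=0.1, size=(n_users, k))   # user factors
Q = rng.normal(scale=0.1, size=(n_items, k))   # item factors

lr, reg = 0.01, 0.02
for _ in range(2000):
    for u, i in zip(*R.nonzero()):             # observed entries only
        err = R[u, i] - P[u] @ Q[i]
        P[u] += lr * (err * Q[i] - reg * P[u])
        Q[i] += lr * (err * P[u] - reg * Q[i])

print(np.round(P @ Q.T, 1))                    # predicted ratings, incl. the 0s
```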

Keywords: big data, decision making, deep learning, recommender system

Procedia PDF Downloads 471
22165 Performance Analysis of Scalable Secure Multicasting in Social Networking

Authors: R. Venkatesan, A. Sabari

Abstract:

The growth of social networking on the internet calls for a scalable, authenticated, and secure group communication model such as multicasting. Multicasting is an internetwork service that offers efficient delivery of data from a source to multiple destinations. Even though multicast has been very successful at providing efficient, best-effort data delivery for large groups, extending other features to multicast in a scalable way has proven to be a complex task. Separately, the requirement for secure electronic information has become steadily more apparent. As multicast applications are deployed for mainstream purposes, the need to secure multicast communications becomes significant.

Keywords: multicasting, scalability, security, social network

Procedia PDF Downloads 288
22164 Vulnerability Risk Assessment of Non-Engineered Houses Based on Damage Data of the 2009 Padang Earthquake in Padang City, Indonesia

Authors: Rusnardi Rahmat Putra, Junji Kiyono, Aiko Furukawa

Abstract:

Several powerful earthquakes have struck Padang in recent years, one of the largest of which was an M 7.6 event that occurred on September 30, 2009 and caused more than 1000 casualties. Following the event, we conducted a 12-site microtremor array investigation to gain a representative determination of the subsurface soil conditions in Padang. From the dispersion curves of the array observations, the central business district of Padang corresponds to relatively soft soil conditions, with Vs30 less than 400 m/s. Because only one accelerometer station existed, we simulated the 2009 Padang earthquake to obtain the peak ground acceleration at all sites in Padang city. Using the damage data of the 2009 Padang earthquake, we produced seismic risk vulnerability estimates for non-engineered houses on rock, medium, and soft soil conditions. We estimated the loss ratio based on the ground response, the seismic hazard of Padang, and the observed damage to non-engineered houses in the 2009 Padang earthquake for several earthquake return periods.
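
A sketch of the standard Vs30 computation used to classify site stiffness from a shear-wave velocity profile, such as one inverted from a microtremor array dispersion curve. The layer values are illustrative only:

```python
layers = [  # (thickness in m, shear-wave velocity in m/s) down to 30 m depth
    (5.0, 180.0),
    (10.0, 300.0),
    (15.0, 450.0),
]

# Vs30 = 30 / sum(h_i / Vs_i): time-averaged velocity over the top 30 m
travel_time = sum(h / vs for h, vs in layers)
vs30 = 30.0 / travel_time
print(f"Vs30 = {vs30:.0f} m/s")  # < 400 m/s indicates relatively soft soil here
```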

Keywords: profile, Padang earthquake, microtremor array, seismic vulnerability

Procedia PDF Downloads 404
22163 Hybrid Data-Driven Drilling Rate of Penetration Optimization Scheme Guided by Geological Formation and Historical Data

Authors: Ammar Alali, Mahmoud Abughaban, William Contreras Otalvora

Abstract:

Optimizing the drilling process for cost and efficiency requires the optimization of the rate of penetration (ROP). ROP is the speed at which the wellbore is created, in units of feet per hour, and is the primary indicator of drilling efficiency. Maximizing the ROP can indicate fast and cost-efficient drilling operations; however, high ROPs may induce unintended events, which may lead to nonproductive time (NPT) and higher net costs. The proposed ROP optimization solution is a hybrid, data-driven system that aims to improve the drilling process, maximize the ROP, and minimize NPT. The system consists of two phases: (1) utilizing existing geological and drilling data to train the model prior to drilling, and (2) real-time adjustment of the controllable dynamic drilling parameters [weight on bit (WOB), rotary speed (RPM), and pump flow rate (GPM)] that directly influence the ROP. During the first phase of the system, geological and historical drilling data are aggregated. The top-rated wells, ranked by high ROP instances, are then distinguished. Those wells are filtered based on NPT incidents, and a cross-plot is generated for the controllable dynamic drilling parameters per ROP value. Subsequently, the parameter values (WOB, GPM, RPM) are calculated as a conditioned mean based on physical distance, following the Inverse Distance Weighting (IDW) interpolation methodology. The first phase concludes by producing a model of drilling best practices from the offset wells, prioritizing the optimum ROP value; this phase is performed before drilling commences. Starting with the model produced in phase one, the second phase runs an automated drill-off test, delivering adjustments in real time. Those adjustments are made by directing the driller to deviate two of the controllable parameters (WOB and RPM) by a small percentage (0-5%), following the Constrained Random Search (CRS) methodology. These minor incremental variations reveal new drilling conditions not explored before through offset wells. The data are then consolidated into a heat-map as a function of ROP; a more optimum ROP performance is identified through the heat-map and amended in the model. The validation process involved the selection of a planned well in an onshore oil field with hundreds of offset wells. The first-phase model was built by utilizing the data points from the top-performing historical wells (20 wells). The model allows drillers to enhance decision-making by leveraging existing data and blending it with live data in real time. An empirical relationship between the controllable dynamic parameters and ROP was derived using Artificial Neural Networks (ANN). The adjustments resulted in improved ROP efficiency by over 20%, translating to at least a 10% saving in drilling costs. The novelty of the proposed system lies in its ability to integrate historical data, calibrate against geological formations, and run real-time global optimization through CRS. Those factors position the system to work for any newly drilled well in a developing field.
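
A sketch of the Inverse Distance Weighting step described in phase one: the controllable parameters (WOB, RPM, GPM) at a new depth are taken as the distance-weighted mean of offset-well data points. The well records are placeholders:

```python
import numpy as np

# offset-well records: measured depth (ft) and [WOB (klb), RPM, GPM]
depths = np.array([5000.0, 5200.0, 5450.0, 5700.0])
params = np.array([[25.0, 120.0, 650.0],
                   [27.0, 130.0, 700.0],
                   [30.0, 140.0, 720.0],
                   [28.0, 135.0, 710.0]])

def idw(target_depth, depths, params, power=2.0):
    """Conditioned mean of drilling parameters by inverse distance weighting."""
    d = np.abs(depths - target_depth)
    if np.any(d == 0):                       # exact match: return it directly
        return params[np.argmin(d)]
    w = 1.0 / d ** power
    return (w[:, None] * params).sum(axis=0) / w.sum()

print(idw(5300.0, depths, params))           # suggested [WOB, RPM, GPM]
```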

Keywords: drilling optimization, geological formations, machine learning, rate of penetration

Procedia PDF Downloads 127
22162 Social Networking Applications: What Is Their Quality and How Can They Be Adopted in Open Distance Learning Environments?

Authors: Asteria Nsamba

Abstract:

Social networking applications and tools have become compelling platforms for generating and sharing knowledge across the world. They comprise a variety of social media platforms, including Facebook, Twitter, WhatsApp, blogs, and wikis. The most popular of these platforms is Facebook, with 2.41 billion monthly active users, followed by WhatsApp with 1.6 billion users and Twitter with 330 million users. These communication platforms have impacted not only social lives but also students' learning across the different delivery modes in higher education: distance, conventional, and blended learning. With this amount of interest in these platforms, knowledge sharing has gained importance within the context in which it is required. In open distance learning (ODL) contexts, social networking platforms can offer students and teachers a platform on which to create and share knowledge and to form learning collaborations. Thus, they can serve as support mechanisms that increase interaction and reduce the isolation and loneliness inherent in ODL. Despite this potential and opportunity, research indicates that many ODL teachers are not inclined to use social media tools in learning. Although it is unclear why these tools are uncommon in these environments, concerns raised in the literature indicate that many teachers have not mastered the art of teaching with technology. Using technological pedagogical content knowledge (TPACK), product quality theory, and Bloom's Taxonomy as lenses, this paper aims, firstly, to assess the quality of three social media applications (Facebook, Twitter, and WhatsApp) in order to determine the extent to which they are suitable platforms for teaching and learning in terms of content generation, information sharing, and learning collaboration. Secondly, the paper demonstrates the application of teaching, learning, and assessment using Bloom's Taxonomy.

Keywords: distance education, quality, social networking tools, TPACK

Procedia PDF Downloads 119
22161 Geometric Imperfections in Lattice Structures: A Simulation Strategy to Predict Strength Variability

Authors: Xavier Lorang, Ahmadali Tahmasebimoradi, Chetra Mang, Sylvain Girard

Abstract:

Additive manufacturing processes (e.g., selective laser melting) allow us to produce lattice structures which have less weight, higher impact absorption capacity, and better thermal exchange properties than classical structures. Unfortunately, geometric imperfections (defects) in the lattice structures are by-products of the manufacturing process. These imperfections decrease the lifetime and the strength of the lattice structures and alter their mechanical responses. The objective of the paper is to present a simulation strategy which allows us to take into account the effect of the geometric imperfections on the mechanical response of the lattice structure. In the first part, an identification method for the geometric imperfection parameters of the lattice structure, based on point clouds, is presented. These point clouds are based on tomography measurements. The point clouds are fed into the platform LATANA (LATtice ANAlysis), developed by IRT-SystemX, to characterize the geometric imperfections. This is done by projecting the point cloud of each microbeam along the beam axis onto a 2D surface. Then, by fitting an ellipse to the 2D projections of the points, the geometric imperfections are characterized by three ellipse parameters: the semi-major and semi-minor axes and the angle of rotation. From the calculated parameters of the microbeam geometric imperfections, a statistical analysis is carried out to determine a probability density law based on a statistical hypothesis. Microbeam samples are randomly drawn from the density law and used to generate lattice structures. In the second part, a finite element model for the lattice structure with the simplified geometric imperfections (ellipse parameters) is presented. This numerical model is used to simulate the generated lattice structures. The propagation of the uncertainties of the geometric imperfections is shown through the distribution of the computed mechanical responses of the lattice structures.
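
A sketch of the imperfection-characterization step: after projecting a microbeam's points onto a plane perpendicular to the beam axis, an ellipse is reduced to three parameters. Here the fit uses second moments (PCA), one simple option rather than the paper's exact fitting method; the point cloud is synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic 2D projection of a slightly elliptical, rotated cross-section
t = rng.uniform(0, 2 * np.pi, 500)
rot = np.array([[np.cos(0.4), -np.sin(0.4)], [np.sin(0.4), np.cos(0.4)]])
pts = np.c_[1.2 * np.cos(t), 0.8 * np.sin(t)] @ rot.T
pts += rng.normal(scale=0.02, size=pts.shape)  # measurement noise

centered = pts - pts.mean(axis=0)
cov = np.cov(centered.T)
eigval, eigvec = np.linalg.eigh(cov)             # ascending eigenvalues

semi_minor, semi_major = np.sqrt(2.0 * eigval)   # moment-matched axes
angle = np.arctan2(eigvec[1, 1], eigvec[0, 1])   # rotation of the major axis
print(semi_major, semi_minor, angle)             # the three ellipse parameters
```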

Keywords: additive manufacturing, finite element model, geometric imperfections, lattice structures, propagation of uncertainty

Procedia PDF Downloads 180
22160 A Sub-Scalar Approach to the MIPS Architecture

Authors: Kumar Sambhav Pandey, Anamika Singh

Abstract:

Continuous research in the field of computer architecture basically aims at accelerating computational speed and gaining enhanced performance. Unlike the widely studied superscalar concept, the sub-scalar concept has not gained enough attention for improving computational performance. In this paper, we present a sub-scalar approach that utilizes the parallelism present within the data during processing. The main idea is to split the data into individual smaller entities, each processed with a defined, known set of instructions. This sub-scalar approach to the MIPS architecture can bring significant improvement in computational speedup. MIPS-I is the base design taken into consideration for the development of a sub-scalar MIPS64 that increases instruction-level parallelism (ILP) and resource utilization.
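
A purely conceptual sketch of the splitting idea: a 64-bit dataword is divided into smaller independent entities (here, eight 8-bit lanes) and the same instruction is applied to every lane. In the proposed architecture this parallelism would be exploited by the datapath; Python is used only to illustrate the decomposition:

```python
WORD = 0x0102030405060708

def split_lanes(word, lane_bits=8, lanes=8):
    """Split a dataword into independent lane-sized entities (LSB first)."""
    return [(word >> (lane_bits * i)) & ((1 << lane_bits) - 1)
            for i in range(lanes)]

def merge_lanes(values, lane_bits=8):
    """Reassemble lane values into a single dataword."""
    out = 0
    for i, v in enumerate(values):
        out |= (v & ((1 << lane_bits) - 1)) << (lane_bits * i)
    return out

# apply one known instruction (add 1, modulo lane width) to each entity
lanes = split_lanes(WORD)
result = merge_lanes([(v + 1) & 0xFF for v in lanes])
print(hex(result))  # 0x203040506070809
```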

Keywords: dataword, MIPS, processor, sub-scalar

Procedia PDF Downloads 541
22159 Mobile-Assisted Language Learning (MALL) Applications for Interactive and Engaging Classrooms: APPsolutely!

Authors: Ajda Osifo, Amanda Radwan

Abstract:

Mobile-assisted language learning (MALL), or m-learning, defined as learning with mobile devices in any place equipped with uninterrupted transmission signals, has created new opportunities and challenges for educational use. It introduced a new learning model that combines new types of mobile devices and wireless communication services and technologies with teaching and learning. Recent advancements in the mobile world, such as Apple iOS devices (iPhone, iPod Touch, and iPad), Android devices, and other smartphone environments (such as Windows Phone 7 and BlackBerry), have allowed learning to be more flexible inside and outside the classroom, making the learning experience unique, adaptable, and tailored to each user. Creativity, learner autonomy, collaboration, and the digital practices of language learners are encouraged, and innovative pedagogical applications of such practices in classroom contexts, like the flipped classroom, are enhanced. These developments are gradually becoming embedded in daily life, and they also seem to herald a sustainable move to paperless classrooms. Since mobile technologies are increasingly viewed as a main platform for delivery, we as educators need to design our activities, materials, and learning environments in such a way as to ensure that learners are engaged and feel comfortable. For the purposes of our session, several core MALL applications that work on the Apple iPad/iPhone will be explored; the rationale and steps needed to successfully implement these applications will be discussed, and student examples will be showcased. The focus of the session will be on the following points: 1) our current pedagogical approach, 2) the rationale and several core MALL apps, 3) possible challenges for teachers and learners, 4) future implications. This session is aimed at instructors who are interested in integrating MALL apps into their own classroom planning.

Keywords: MALL, educational technology, iPads, apps

Procedia PDF Downloads 388
22158 Association of ApoB, CETP and GALNT2 Genetic Variants with Type 2 Diabetes-Related Traits in Population from Bosnia and Herzegovina

Authors: Anida Causevic-Ramosevac, Sabina Semiz

Abstract:

The aim of this study was to investigate the association of four single nucleotide polymorphisms (SNPs), namely rs673548 and rs693 in the ApoB gene, rs1800775 in the CETP gene, and rs4846914 in the GALNT2 gene, with parameters of type 2 diabetes (T2D) and diabetic dyslipidemia in the population of Bosnia and Herzegovina (BH). Materials and methods: Our study involved 352 patients with T2D and 156 healthy subjects. Biochemical and anthropometric parameters were measured in all participants. DNA was extracted from peripheral blood for genetic testing. Polymorphisms in the ApoB (rs673548, rs693), CETP (rs1800775), and GALNT2 (rs4846914) genes were analyzed on the Sequenom iPLEX platform. Results: Our results demonstrated significant associations of the rs1800775 polymorphism in the CETP gene with levels of fasting insulin (p = 0.020; p = 0.027; p = 0.044), triglycerides (p = 0.046), and ALT activity (p = 0.031) in the control group. In the group of diabetic patients, the results showed a significant association of rs673548 in the ApoB gene with levels of fasting insulin (p = 0.008), HOMA-IR (p = 0.013), VLDL-C (p = 0.037), and CRP (p = 0.029), and of rs693 in the ApoB gene with BMI (p = 0.025), systolic blood pressure (p = 0.027), fasting insulin (p = 0.037), and HOMA-IR (p = 0.023) levels. Significant associations were also observed for rs1800775 in the CETP gene with triglyceride (p = 0.023) levels and for rs4846914 in the GALNT2 gene with HbA1c (p = 0.013) and triglyceride (p = 0.043) levels. Conclusion: This is the first study to examine the impact of variations in these candidate genes on a wide range of metabolic parameters in the BH population. Our results suggest an association of variations in the ApoB, CETP, and GALNT2 genes with specific markers of T2D and dyslipidemia. Further studies are needed to confirm these genetic effects in other ethnic groups as well.

Keywords: ApoB, CETP, dyslipidemia, GALNT2, type 2 diabetes

Procedia PDF Downloads 240
22157 Disagreement in Spousal Report of Current Contraceptive Use in India and Its Determinants

Authors: Dipti Govil, Nidhi Khosla

Abstract:

Couple-level reports of contraception are important because wives and husbands may give different reports about contraceptive use. Using matched couple data (N = 62,910) from India's NFHS-IV (2015-16), this paper examines concordance in spousal reports of current contraceptive use and its differentials. Reporting of contraceptive use was higher among wives (59%) than husbands (25%). Concordance was low: 16.5% of couples reported the use of the same method, while 21% reported the use of any method. Husbands showed a strong tendency to deny the use of female sterilization. Reconstructing contraceptive use among men increased concordance by 10%. Multivariate analysis shows that concordance was lower in urban and Southern India, among younger women, and among women with a lower wealth index. Men's control over household decision-making and negative attitudes towards contraception were associated with lower concordance. The findings highlight the importance of using couple-level data to estimate contraceptive prevalence, the role of education programs in inculcating positive attitudes towards contraception, fostering gender equality, and involving men in family planning efforts. The results also raise the issue of data quality, as the questions were asked differently of men and women, which may have contributed to the wide discordance.
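
A sketch of a standard concordance measure for matched couple reports: Cohen's kappa on wife versus husband binary reports of current use. The counts below are invented, not the NFHS figures quoted above:

```python
import numpy as np

# 2x2 agreement table: rows = wife (use, no use), cols = husband (use, no use)
table = np.array([[9500, 27500],    # wife reports use
                  [6200, 19710]])   # wife reports no use

n = table.sum()
p_observed = np.trace(table) / n                      # raw agreement
p_expected = (table.sum(1) @ table.sum(0)) / n**2     # chance agreement
kappa = (p_observed - p_expected) / (1 - p_expected)  # chance-corrected
print(f"observed agreement {p_observed:.2f}, kappa {kappa:.2f}")
```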

Keywords: concordance, contraceptive use, couple, female sterilisation, India

Procedia PDF Downloads 125
22156 Towards the Production of Least Contaminant Grade Biosolids and Biochar via Mild Acid Pre-treatment

Authors: Ibrahim Hakeem

Abstract:

Biosolids are stabilised sewage sludge produced by wastewater treatment processes. Biosolids contain valuable plant nutrients, which facilitates their beneficial reuse on agricultural land. However, increasing levels of legacy and emerging contaminants, such as heavy metals (HMs), PFAS, microplastics, pharmaceuticals, and microbial pathogens, are restraining the direct land application of biosolids. Pyrolysis of biosolids can effectively degrade microbial and organic contaminants; however, HMs remain a persistent problem in biosolids and their pyrolysis-derived biochar. In this work, we demonstrated the integrated processing of biosolids, involving acid pre-treatment for HM removal and selective reduction of ash-forming elements, followed by bench-scale pyrolysis of the treated biosolids to produce quality biochar and bio-oil enriched with valuable platform chemicals. Pre-treatment of biosolids using 3% v/v H₂SO₄ at room conditions for 30 min reduced the ash content from 30 wt% in raw biosolids to 15 wt% in the treated sample, while removing about 80% of the limiting HMs without degrading the organic matter. The preservation of nutrients and the reduction of HM concentration and mobility via the developed hydrometallurgical process improved the grade of the treated biosolids for beneficial land reuse. The co-removal of ash-forming elements from biosolids markedly enhanced the fluidised bed pyrolysis of the acid-treated biosolids at 700 ℃. Organic matter devolatilisation was improved by 40%, and the produced biochar had a higher surface area (107 m²/g), heating value (15 MJ/kg), fixed carbon (35 wt%), and organic carbon retention (66%, dry-ash free) compared with the raw biosolids biochar (surface area 56 m²/g, heating value 9 MJ/kg, fixed carbon 20 wt%, and organic carbon retention 50%). Pre-treatment also improved the development of the biochar's microporous structure and substantially decreased HM concentration and bioavailability, by at least 50% relative to the raw biosolids biochar. The integrated process is a viable approach to enhancing value recovery from biosolids.

Keywords: biosolids, pyrolysis, biochar, heavy metals

Procedia PDF Downloads 73
22155 Cloud Computing in Jordanian Libraries: An Overview

Authors: Mohammad A. Al-Madi, Nagham A. Al-Madi, Fanan A. Al-Madi

Abstract:

The adoption of cloud computing technology in libraries has been increasing; users can store their data in a virtual space and retrieve it from anywhere over the network. By using cloud computing technology, industries and individuals save money, time, and space. Moreover, data and information about libraries can be placed in the cloud. This paper discusses the meaning of cloud computing along with its types. Particular focus is given to the application of cloud computing in modern libraries. Additionally, the advantages of cloud computing and the areas in which it is currently applied are discussed. Finally, the present situation of Jordanian libraries is considered and discussed in further detail.

Keywords: cloud computing, community cloud, hybrid cloud, private cloud, public cloud

Procedia PDF Downloads 214
22154 A New Approach to Achieve the Regime Equations in Sand-Bed Rivers

Authors: Farhad Imanshoar

Abstract:

The regime, or equilibrium, geometry of alluvial rivers remains a topic of fundamental scientific and engineering interest. There are several approaches to the problem, namely empirical formulas, semi-theoretical methods, and rational (extremal) procedures. However, none of them is widely accepted at present, due to incomplete knowledge of some physical processes associated with channel formation and the simplifying hypotheses imposed to reduce the large number of variables involved. The study presented in this paper shows a new approach to estimating the stable width and depth of sand-bed rivers by using a developed stream power equation (DSPE). First, a new procedure was developed based on theoretical analysis of the DSPE and the ultimate sediment concentration. Then, experimental data for regime conditions in sand-bed rivers (flow depth, flow width, and sediment feed rate for several cases) were gathered. Finally, the results of this research (the regime equations) are compared with the field data and with other regime equations. Good agreement was observed between the field data and the values resulting from the developed regime equations.

Keywords: regime equations, developed stream power equation, sand-bed rivers, semi-theoretical methods

Procedia PDF Downloads 264
22153 RFID Logistic Management with Cold Chain Monitoring: Cold Store Case Study

Authors: Mira Trebar

Abstract:

Logistics processes for perishable food in the supply chain include distribution activities and real-time temperature monitoring to fulfil cold chain requirements. The paper presents the use of RFID (Radio Frequency Identification) technology as an identification tool for receiving and shipping activities in the cold store. At the same time, the use of RFID data loggers with temperature sensors is presented for observing and storing temperatures, so that processes can be analyzed and historical data are available for traceability and efficient recall management.

Keywords: logistics, warehouse, RFID device, cold chain

Procedia PDF Downloads 622
22152 Assessing the Social Comfort of the Russian Population with Big Data

Authors: Marina Shakleina, Konstantin Shaklein, Stanislav Yakiro

Abstract:

The digitalization of modern human life over the last decade has facilitated the acquisition, storage, and processing of data, which are used to detect changes in consumer preferences and to improve the internal efficiency of production processes. This emerging trend has attracted academic interest in the use of big data in research. The study focuses on modeling the social comfort of the Russian population for the period 2010-2021 using big data, which provides enormous opportunities for understanding human interactions at the scale of society, with rich spatial and temporal dynamics. One of the most popular big data sources is Google Trends. The methodology for assessing social comfort using big data involves several steps: 1. 574 words were selected based on the Harvard IV-4 Dictionary, adjusted to fit the reality of everyday Russian life. The set of keywords was further cleansed by excluding queries consisting of verbs and words with several lexical meanings. 2. Search queries were processed to ensure comparability of results: transformation of the data to a 10-point scale, elimination of popularity peaks, detrending, and deseasoning. The proposed methodology for keyword search and Google Trends processing was implemented as a script in the Python programming language. 3. Block and summary integral indicators of social comfort were constructed using the first modified principal component, which yields the weighting coefficients of the block components. According to the study, social comfort is described by 12 blocks: 'health', 'education', 'social support', 'financial situation', 'employment', 'housing', 'ethical norms', 'security', 'political stability', 'leisure', 'environment', and 'infrastructure'. According to the model, the summary integral indicator increased by 54% to 4.631 points; the average annual growth rate was 3.6%, which is 2.7 p.p. higher than the rate of economic growth. The value of the indicator describing social comfort in Russia is determined 26% by 'social support', 24% by 'education', 12% by 'infrastructure', 10% by 'leisure', and the remaining 28% by the other blocks. Among the 25% most popular searches, 85% are negative in nature and mainly relate to the blocks 'security', 'political stability', and 'health', for example, 'crime rate' and 'vulnerability'. Among the 25% least popular queries, 99% are positive and mostly relate to the blocks 'ethical norms', 'education', and 'employment', for example, 'social package' and 'recycling'. In conclusion, the introduction of the latent category 'social comfort' into the scientific vocabulary deepens the theory of the quality of life of the population by studying the involvement of the individual in society and by expanding the subjective aspect of the measurement of various indicators. The integral assessment of social comfort gives an overall picture of the development of the phenomenon over time and space and quantitatively evaluates ongoing socio-economic policy. The application of big data to the assessment of latent categories gives stable results, which opens up possibilities for practical implementation.
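
A hedged sketch of the query-processing steps listed above: rescale a raw Google Trends series to a 10-point scale, clip popularity peaks, detrend, and derive block weights from the first principal component (deseasoning is omitted for brevity). The series are synthetic stand-ins for the 574 keyword series:

```python
import numpy as np

rng = np.random.default_rng(0)
T, K = 144, 6                                  # 12 years monthly, 6 keywords
raw = rng.uniform(0, 100, size=(T, K)) + np.linspace(0, 20, T)[:, None]

def preprocess(x):
    x = np.clip(x, None, np.percentile(x, 95))        # eliminate peaks
    x = 10.0 * (x - x.min()) / (x.max() - x.min())    # 10-point scale
    t = np.arange(len(x))
    return x - np.polyval(np.polyfit(t, x, 1), t)     # remove linear trend

series = np.column_stack([preprocess(raw[:, k]) for k in range(K)])

# first principal component -> weighting coefficients of the block components
eigval, eigvec = np.linalg.eigh(np.cov(series.T))
weights = np.abs(eigvec[:, -1]) / np.abs(eigvec[:, -1]).sum()
indicator = series @ weights                   # block integral indicator
print(weights.round(3))
```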

Keywords: big data, Google trends, integral indicator, social comfort

Procedia PDF Downloads 198
22151 Remote Sensing of Aerated Flows at Large Dams: Proof of Concept

Authors: Ahmed El Naggar, Homyan Saleh

Abstract:

Dams are crucial for flood control, water supply, and the generation of hydroelectric power. Every dam has a water conveyance system, such as a spillway, providing the safe discharge of catastrophic floods when necessary. Spillway design has historically been investigated in laboratory research owing to safety concerns and the absence of suitable full-scale flow monitoring equipment. Prototype measurements of aerated flows are urgently needed to quantify projected scale effects and to provide missing validation data for design guidelines and numerical simulations. In this work, an image-based investigation of free-surface flows on a stepped spillway was undertaken at the laboratory scale (fixed camera installation) and at the prototype scale (drone footage). The drone footage was obtained through citizen science. The analyses permitted remote measurement of the free-surface aeration inception point, air-water surface velocities and their fluctuations, and the residual energy at the chute's downstream end. The prototype observations offered full-scale proof of concept, while the laboratory results were efficiently validated against invasive phase-detection probe data. This paper stresses the efficacy of image-based analyses at prototype spillways, highlights how citizen science data may help researchers better understand real-world air-water flow dynamics, and offers a framework for a small collection of long-missing prototype data.
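
A sketch of the image-based velocimetry idea: dense optical flow between two consecutive frames of spillway footage yields surface velocities in pixels per frame (a length scale and the frame rate convert these to m/s). The file names are placeholders, and this uses OpenCV's Farneback method as one representative technique, not necessarily the paper's exact algorithm:

```python
import cv2
import numpy as np

prev = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2,
                                    flags=0)

speed = np.linalg.norm(flow, axis=2)          # pixel displacement per frame
print("mean surface speed (px/frame):", speed.mean())
# the spatial spread of speeds hints at fluctuations near the free surface
print("speed std (px/frame):", speed.std())
```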

Keywords: remote sensing, aerated flows, large dams, proof of concept, dam spillways, air-water flows, prototype operation, inception point, optical flow, turbulence, residual energy

Procedia PDF Downloads 83
22150 Sentiment Mapping through Social Media and Its Implications

Authors: G. C. Joshi, M. Paul, B. K. Kalita, V. Ranga, J. S. Rawat, P. S. Rawat

Abstract:

As a habitat of the global village, every place has established connections through the strength and power of social media, which pierces political boundaries. Social media are digital platforms where people across the world can interact; their advantages include universality, anonymity, easy access, indirect interaction, and the gathering and sharing of information. The power of social media lies in the intensity with which extreme opinions or feelings are shared, in contrast to personal interactions, and these can be readily mapped in the form of sentiment mapping. Easy access to social networking sites such as Facebook, Twitter, and blogs has created unprecedented opportunities for citizens to voice their opinions, loaded with emotional dynamics, and social media plays a very active role in shaping human thought. A recent incident of public importance was selected as a case study to map the sentiments of people through Twitter. Understanding those dynamics through the eyes of ordinary people can be challenging. Sentiment maps were produced with the help of the R programming language and GIS techniques. The emotions flowing worldwide in the form of tweets were extracted and analyzed. The number of tweets diminished by 91% from 25/08/2017 to 31/08/2017. A surge of sentiment emerged near the origin of the case, i.e., Delhi, Haryana, and Punjab; the capital showed the maximum influence, resulting in a spillover effect near Delhi. After the sentiment scores of the tweets were calculated, the prevailing trend was neutral (45.37%), followed by negative (28.6%) and positive (21.6%). The results can be used to map the spatial distribution of digital penetration in India, where the highest concentration lies in Mumbai and the lowest in North East India and Jammu and Kashmir.

Keywords: sentiment mapping, digital literacy, GIS, R statistical language, spatio-temporal

Procedia PDF Downloads 148
22149 Testing Causal Model of Depression Based on the Components of Subscales Lifestyle with Mediation of Social Health

Authors: Abdolamir Gatezadeh, Jamal Daghaleh

Abstract:

The lifestyle of individuals is an important determinant of psychological and social health. Recently, especially in developed countries, the relationship between lifestyle and mental illnesses, including depression, has attracted considerable attention. The study, which tests a causal model of depression based on lifestyle with social health as mediator, is basic in terms of its objective and descriptive-field in terms of its data collection. Methods: This study is basic research within the framework of correlational designs. The population includes all adults in Ahwaz city; a randomized, multistage sample of 384 subjects was selected. The data were collected and analyzed using structural equation modeling. Results: In the data analysis, path analysis confirmed the fit of the hypothesized model. This means that the lifestyle subscales have a direct effect on depression and, through the mediation of social health, also an indirect effect on depression. Discussion and conclusion: According to the results of the research, the components of lifestyle and social health can be used to explain depression.

Keywords: depression, subscales lifestyle, social health, causal model

Procedia PDF Downloads 160
22148 Landslide Susceptibility Analysis in the St. Lawrence Lowlands Using High Resolution Data and Failure Plane Analysis

Authors: Kevin Potoczny, Katsuichiro Goda

Abstract:

The St. Lawrence lowlands extend from Ottawa to Quebec City and are known for large deposits of sensitive Leda clay. Leda clay deposits are responsible for many large landslides, such as the 1993 Lemieux and 2010 St. Jude (4 fatalities) landslides. Because of the large extent and sensitivity of Leda clay, regional hazard analysis for landslides is an important risk management tool. A 2018 regional study by Farzam et al. on the susceptibility of Leda clay slopes to landslide hazard uses 1 arc second topographical data. A qualitative method known as Hazus estimates susceptibility by checking various criteria at a location and determining a susceptibility rating on a scale of 0 (no susceptibility) to 10 (very high susceptibility). These criteria are slope angle, geological group, soil wetness, and distance from waterbodies. Given the flat nature of the St. Lawrence lowlands, the existing assessment fails to capture local slopes, such as the St. Jude site, and its data do not allow failure planes to be analyzed accurately. This study improves the analysis of Farzam et al. in two major respects. First, regional assessment with high-resolution data allows identification of local sites that may previously have been classed as low susceptibility. This then provides the opportunity to conduct a more refined analysis of the failure plane of the slope. Slopes derived from 1 arc second data are relatively gentle (0-10 degrees) across the region; however, the 1- and 2-meter resolution 2022 HRDEM provided by NRCan shows that short, steep slopes are present. At the regional level, 1 arc second data can underestimate the susceptibility of short, steep slopes, which is dangerous because Leda clay landslides behave retrogressively and travel upwards into flatter terrain. At the location of the St. Jude landslide, the slope differences are significant: the 1 arc second data show a maximum slope of 12.80 degrees and a mean slope of 4.72 degrees, while the HRDEM data show a maximum slope of 56.67 degrees and a mean slope of 10.72 degrees. This equates to a difference of three susceptibility levels when the soil is dry and one susceptibility level when wet. GIS software is used to create a regional susceptibility map across the St. Lawrence lowlands at 1- and 2-meter resolution. Failure planes are necessary to differentiate between small and large landslides, which have so far been ignored in regional analysis. Leda clay failures can only retrogress as far as their failure planes, so the regional analysis must transition smoothly into a more robust local analysis. It is expected that slopes within the region previously assessed at low susceptibility scores contain local areas of high susceptibility. The goal is to create opportunities for local failure plane analysis, which has not been possible before. Due to the low resolution of previous regional analyses, any slope near a waterbody could be considered hazardous; high-resolution regional analysis allows a more precise determination of hazard sites.
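
A sketch of the slope computation at the core of the resolution contrast discussed above, using a finite-difference gradient on a DEM grid. The elevation array and the 2 m cell size are placeholders for the NRCan HRDEM tiles:

```python
import numpy as np

dem = np.array([[100.0, 100.0, 100.0, 100.0],   # elevations in metres
                [100.0,  99.0,  97.0,  96.0],
                [100.0,  97.0,  90.0,  86.0],
                [100.0,  96.0,  86.0,  80.0]])
cell = 2.0                                       # HRDEM cell size in metres

dz_dy, dz_dx = np.gradient(dem, cell)            # rise per metre in y and x
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
print(slope_deg.round(1))
# On a ~30 m (1 arc second) grid the same relief collapses into a single
# cell, so the short, steep bank controlling retrogression disappears.
```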

Keywords: hazus, high-resolution DEM, leda clay, regional analysis, susceptibility

Procedia PDF Downloads 69
22147 Integrative Omics-Portrayal Disentangles Molecular Heterogeneity and Progression Mechanisms of Cancer

Authors: Binder Hans

Abstract:

Cancer is no longer seen as solely a genetic disease in which genetic defects such as mutations and copy number variations affect gene regulation and eventually lead to aberrant cell functioning, which can be monitored by transcriptome analysis. It has become obvious that epigenetic alterations represent a further important layer of (de-)regulation of gene activity. For example, aberrant DNA methylation is a hallmark of many cancer types, and methylation patterns have been used successfully to subtype cancer heterogeneity. Hence, unraveling the interplay between different omics levels, such as genome, transcriptome, and epigenome, is essential for a mechanistic understanding of the molecular deregulation causing complex diseases such as cancer. This objective requires powerful downstream integrative bioinformatics methods as an essential prerequisite to discover the whole-genome mutational, transcriptome, and epigenome landscapes of cancer specimens and to characterize cancer genesis, progression, and heterogeneity. Basic challenges and tasks arise 'beyond sequencing' because of the large size and complexity of the data, the need to search for hidden structures in the data, the need for knowledge mining to discover biological function, and the need for systems-biology conceptual models to deduce developmental interrelations between different cancer states. These tasks are tightly related to cancer biology as an (epi-)genetic disease giving rise to aberrant genomic regulation under micro-environmental control and clonal evolution, which leads to heterogeneous cellular states. Machine learning algorithms such as self-organizing maps (SOMs) represent one interesting option for tackling these bioinformatics tasks. The SOM method enables the recognition of complex patterns in large-scale data generated by high-throughput omics technologies. It portrays molecular phenotypes by generating individualized, easy-to-interpret images of the data landscape in combination with comprehensive analysis options. Our image-based, reductionist machine learning methods provide one interesting perspective on how to deal with massive data in the molecular-level study of complex diseases, applied here to gliomas, melanomas, and colon cancer. As an important new challenge, we address the combined portrayal of different omics data, such as genome-wide genomic, transcriptomic, and methylomic data. The integrative-omics portrayal approach is based on joint training of the data, and it provides separate personalized data portraits for each patient and data type, which can be analyzed by visual inspection as one option. The new method enables an integrative genome-wide view of the omics data types and the underlying regulatory modes. It is applied to high- and low-grade gliomas and to melanomas, where it disentangles transversal and longitudinal molecular heterogeneity in terms of distinct molecular subtypes and progression paths with prognostic impact.
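
A minimal self-organizing map sketch of the portrayal idea: high-dimensional omics profiles are mapped onto a 2D grid whose trained units form the "portrait". The data are random stand-ins for expression or methylation values:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))            # 200 samples x 50 features
grid = 8                                  # 8x8 map
W = rng.normal(size=(grid * grid, 50))    # unit weight vectors
coords = np.array([(i, j) for i in range(grid) for j in range(grid)])

for epoch in range(30):
    lr = 0.5 * np.exp(-epoch / 10)                   # decaying learning rate
    radius = max(grid / 2 * np.exp(-epoch / 10), 1.0)
    for x in X:
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))  # best matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * radius ** 2))          # neighborhood kernel
        W += lr * h[:, None] * (x - W)

# each sample's "portrait" position: its best matching unit on the 2D grid
positions = [coords[np.argmin(((W - x) ** 2).sum(axis=1))] for x in X]
print(positions[:5])
```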

Keywords: integrative bioinformatics, machine learning, molecular mechanisms of cancer, gliomas and melanomas

Procedia PDF Downloads 146
22146 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. Time series analysis is one approach to complex systems, in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of applying QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; this model may reveal further insight into quantum chaos.
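
A sketch of the Kalman-filter estimation step mentioned above: a scalar filter tracking the latent level of an AR(1)-type series. The coefficients, noise variances, and data are illustrative, not the paper's quantum model:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, q, r = 0.9, 0.1, 0.5                # AR coefficient, process/obs. noise
true = np.zeros(100)
for t in range(1, 100):
    true[t] = phi * true[t - 1] + rng.normal(scale=np.sqrt(q))
y = true + rng.normal(scale=np.sqrt(r), size=100)   # noisy observations

x_hat, P = 0.0, 1.0                      # state estimate and its variance
estimates = []
for obs in y:
    x_pred, P_pred = phi * x_hat, phi**2 * P + q    # predict
    K = P_pred / (P_pred + r)                       # Kalman gain
    x_hat = x_pred + K * (obs - x_pred)             # update
    P = (1 - K) * P_pred
    estimates.append(x_hat)

print(np.corrcoef(true, estimates)[0, 1])  # filtered track vs. latent state
```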

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 465
22145 IRIS: An Interactive Video Game for Children with Long-Term Illness in Hospitals

Authors: Ganetsou Evanthia, Koutsikos Emmanouil, Austin Anna Maria

Abstract:

Information technology has long served individuals' needs for learning and entertainment, but much less so for children in sickness. The aim of the proposed online video game is to provide immersive learning opportunities as well as essential social and emotional scenarios for hospital-bound children with long-term illness. Online self-paced courses on chosen school subjects, including specialised software and multisensory assessments, aim at enhancing children's academic achievement and sense of inclusion, while doctor minigames familiarise young patients with, and educate them about, their medical conditions. Online ethical dilemmas offer children opportunities to contemplate the importance of medical procedures and of following assigned medication, which is often challenging for young patients; they can therefore reflect on their condition, re-evaluate their perceptions of hospitalisation, and assume greater personal responsibility for their progress. Children's emotional and psychosocial needs are addressed through social activities, such as daily interactive, collaborative minigames with other hospitalised peers, including virtual competitive sports games, weekly group psychodrama sessions, and online birthday parties or sleepovers. Social bonding is also fostered by a virtual pet to interact with and take care of, as well as a virtual nurse with whom to discuss and reflect on the mood of the day, engage in constructive dialogue and perspective taking, and receive reminders. Access to the platform will be available throughout the day, depending on the patient's health status. The program is designed to minimise escapism and feelings of exclusion, and it can be adapted flexibly to offer post-treatment care and an online support system at home.

Keywords: long-term illness, children, hospital, interactive games, cognitive, socioemotional development

Procedia PDF Downloads 74
22144 Multi-Criteria Decision Approach to Performance Measurement Techniques Data Envelopment Analysis: Case Study of Kerman City’s Parks

Authors: Ali A. Abdollahi

Abstract:

During the last several decades, scientists have consistently applied multiple criteria decision-making (MCDM) methods to decisions about multi-faceted, complicated subjects. While making such decisions, and in order to achieve more accurate evaluations, they have regularly used a variety of criteria instead of applying just one optimum evaluation criterion. The method presented here uses both 'quantity' and 'quality' to assess the performance of the multiple-criteria method. Applying data envelopment analysis (DEA), weighted aggregated sum product assessment (WASPAS), the weighted sum approach (WSA), the analytic network process (ANP), and the Charnes, Cooper, Rhodes (CCR) method, we have analyzed thirteen parks in Kerman city. The analysis indicates that WASPAS and WSA are compatible with each other, but that their deviation from DEA is extensive; the results of the CCR technique do not match those of the DEA technique. Our study indicates that the ANP method, with an average rate of 1.51, ranks closest to the DEA method, which has an average rate of 1.49.
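
A hedged sketch of the input-oriented CCR DEA envelopment model used to score each decision-making unit (here, parks). The inputs and outputs are invented stand-ins; a real study would use measured quantities such as area, budget, and visitor numbers:

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]]).T  # inputs x DMUs
Y = np.array([[1.0], [1.0], [1.0], [1.0]]).T                      # outputs x DMUs
m, n, s = X.shape[0], X.shape[1], Y.shape[0]

def ccr_efficiency(o):
    # variables: [theta, lambda_1..lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.c_[-X[:, [o]], X]            # sum lambda*x <= theta * x_o
    A_out = np.c_[np.zeros((s, 1)), -Y]    # sum lambda*y >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n   # lambdas non-negative
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).x[0]

for o in range(n):
    print(f"DMU {o}: efficiency {ccr_efficiency(o):.3f}")  # 1.0 = efficient
```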

Keywords: multiple criteria decision making, Data envelopment analysis (DEA), Charnes Cooper Rhodes (CCR), Weighted Sum Approach (WSA)

Procedia PDF Downloads 208
22143 Simulation-Based Optimization Approach for an Electro-Plating Production Process Based on Theory of Constraints and Data Envelopment Analysis

Authors: Mayada Attia Ibrahim

Abstract:

Evaluating and improving the electroplating production process is a key challenge, as the process is influenced by several factors such as process parameters, process costs, and production environments. Analyzing and optimizing all these factors together requires extensive analytical techniques that are not commonly available in real industrial settings. This paper presents a practice-based framework for the evaluation and optimization of some of the crucial factors that affect the costs and production times associated with this type of process: energy costs, material costs, and product flow times. In the proposed approach, design of experiments, discrete-event simulation, and the theory of constraints are used, respectively, to identify the most significant factors affecting the production process, to simulate a real production line and quantify the effect of these factors, and to locate possible bottlenecks. Several scenarios are generated as corrective strategies for improving the production line. Following that, the input-oriented CCR data envelopment analysis (DEA) model is used to evaluate and optimize the suggested scenarios.

Keywords: electroplating process, simulation, design of experiment, performance optimization, theory of constraints, data envelopment analysis

Procedia PDF Downloads 94
22142 E4D-MP: Time-Lapse Multiphysics Simulation and Joint Inversion Toolset for Large-Scale Subsurface Imaging

Authors: Zhuanfang Fred Zhang, Tim C. Johnson, Yilin Fang, Chris E. Strickland

Abstract:

A variety of geophysical techniques are available to image the opaque subsurface with little or no contact with the soil. It is common to conduct time-lapse surveys of different types at a given site for improved subsurface imaging. Regardless of the chosen survey methods, processing the massive amount of survey data is often a challenge. Currently available software applications are generally designed for desktop personal computers and based on one-dimensional assumptions. Hence, they are usually incapable of imaging three-dimensional (3D) subsurface processes and variables at reasonable spatial scales, and the maximum amount of data that can be inverted simultaneously is often very small due to the limited capability of personal computers. High-performance, integrative software that enables real-time integration of multi-process geophysical methods is therefore needed. E4D-MP enables the integration and inversion of time-lapse, large-scale survey data from geophysical methods. Using supercomputing capabilities and parallel computation algorithms, E4D-MP is capable of processing data across vast spatiotemporal scales and in near real time. The main code and the modules of E4D-MP for inverting individual or combined data sets of time-lapse 3D electrical resistivity, spectral induced polarization, and gravity surveys have been developed and demonstrated for subsurface imaging. E4D-MP provides the capability to image the processes (e.g., liquid or gas flow, solute transport, cavity development) and subsurface properties (e.g., rock/soil density, conductivity) critical for the successful control of environmental engineering efforts such as environmental remediation, carbon sequestration, geothermal exploration, and mine land reclamation, among others.

Keywords: gravity survey, high-performance computing, sub-surface monitoring, electrical resistivity tomography

Procedia PDF Downloads 151