Search results for: distance based analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 46404
43614 Queer Lesbian Experience within Chinese Girls' Love Manga Fandom: A Qualitative Study of Sexuality among Chinese Yuri Fans

Authors: Ka Yi Yeung

Abstract:

Yuri is a manga culture comprising works (manga, literature, TV shows) that depict intimacy between two girls. It originated in Japan and was later transplanted into Chinese fandom after the airing of Maria-sama ga Miteru. The Yuri fanbase has been growing, and most fans are attracted by the subtle, sentimental relationships between girls; the culture is characterized by spiritual bonding and interaction between them. The Chinese Yuri community has a high proportion of female fans, and the Yamibo forum is their major site for socializing and for discussion of Yuri works. Female Yuri fans show a strong tendency to engage in homosexual relationships. However, they seldom directly identify as lesbian, preferring the label non-heterosexual, because the Yuri fan community largely rejects the butch-femme roles of mainstream lesbianism. Within the Chinese Yuri community, femininity is highly valued, and members with markedly feminine characteristics are popular. Moreover, since the community is based in an online forum, members often develop long-distance relationships. The in-depth interviews conducted for this research show that Yuri fans are mostly pessimistic about their relationships because of social and geographical barriers, yet at the same time they do not lose hope of finding true love. This research explores how Chinese Yuri fans challenge the homonormativity of mainstream lesbianism and how they construct their sexual identity through various discourses on sexuality and homosexual experience.

Keywords: Chinese fandom, femininity, gender, homonormativity, Japanese manga, lesbianism, sexuality, queer culture

Procedia PDF Downloads 385
43613 Modeling and Analysis of Solar Assisted Adsorption Cooling System Using TRNSYS

Authors: M. Wajahat, M. Shoaib, A. Waheed

Abstract:

As a result of the increase in world energy demand, as well as the demand for heating, refrigeration and air conditioning, energy engineers are now more inclined towards renewable energy, especially solar thermal driven refrigeration and air conditioning systems. This research focuses on a solar assisted adsorption refrigeration system to provide comfort conditions for a building in Islamabad. The adsorption chiller can be driven by low grade heat in a low temperature range (50–80 °C), lower than that required by the generator of an absorption refrigeration system, and this heat can be furnished by common flat plate solar collectors (FPCs). The aim is to offset the total energy required for the building's heating and cooling demand by using FPCs, thus reducing dependency on the primary energy source and hence saving energy. TRNSYS is a dynamic modeling and simulation tool which can be used to simulate a complete solar based adsorption chiller meeting the desired cooling and heating demand during the summer and winter seasons, respectively. Modeling and a detailed parametric analysis of the whole system are carried out to determine the optimal system configuration, keeping in view various design constraints. The main focus of the study is on the solar thermal loop of the adsorption chiller, with the goal of reducing the contribution from the auxiliary devices.
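The flat plate collector component that such a TRNSYS model contains can be approximated by the standard Hottel-Whillier efficiency equation; the sketch below is a minimal illustration of that energy balance, not the authors' model, and all parameter values are purely illustrative.

```python
def fpc_useful_gain(area_m2, irradiance_w_m2, t_inlet_c, t_ambient_c,
                    fr_tau_alpha=0.75, fr_ul=5.0):
    """Useful heat gain of a flat plate collector (Hottel-Whillier form):
    Q_u = A * [FR(tau*alpha) * G - FR*UL * (T_in - T_amb)], floored at zero.
    fr_tau_alpha and fr_ul are typical illustrative FPC coefficients."""
    q = area_m2 * (fr_tau_alpha * irradiance_w_m2
                   - fr_ul * (t_inlet_c - t_ambient_c))
    return max(q, 0.0)

# A 2 m^2 collector at 800 W/m^2, 60 C inlet (adsorption-chiller range), 30 C ambient:
q_u = fpc_useful_gain(2.0, 800.0, 60.0, 30.0)
efficiency = q_u / (2.0 * 800.0)
```

The floor at zero mimics a controller that stops the loop when losses exceed gains, which is why the low-temperature driving range matters for collector sizing.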

Keywords: flat plate collector, energy saving, solar assisted adsorption chiller, TRNSYS

Procedia PDF Downloads 649
43612 Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter

Authors: Hamidreza Saberkari, Mousa Shamsi, Hossein Ahmadi, Saeed Vaali, MohammadHossein Sedaaghi

Abstract:

In recent years, using signal processing tools for accurate identification of protein coding regions has become a challenge in bioinformatics. Most genomic signal processing methods are based on the period-3 characteristic of the nucleotides in DNA strands; consequently, spectral analysis is applied to numerical representations of DNA to locate the periodic components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences is presented, based on the combination of the Goertzel algorithm and an anti-notch filter (ANF). The proposed algorithm has several advantages over conventional methods. First, it identifies protein coding regions more accurately because the Goertzel algorithm is tuned to the desired frequency. Second, it achieves faster detection. The proposed algorithm is applied to several genes, including genes available in the BG570 and HMR195 databases, and the results are compared to other methods using nucleotide-level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, specifically in the identification of small-scale gene areas.
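As a rough illustration of the spectral step described above, the period-3 component of a DNA sequence can be measured by running the Goertzel recurrence, tuned to f = 1/3, over the four binary indicator sequences. This is a minimal sketch of that idea only, not the authors' ensemble method (which also involves the anti-notch filter).

```python
import math

def goertzel_power(x, freq):
    """Power of sequence x at normalized frequency `freq` via the Goertzel recurrence."""
    coeff = 2.0 * math.cos(2.0 * math.pi * freq)
    s_prev, s_prev2 = 0.0, 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Standard Goertzel magnitude-squared at the tuned frequency
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def period3_power(dna):
    """Sum the Goertzel power at f = 1/3 over the four binary indicator sequences."""
    total = 0.0
    for base in "ACGT":
        indicator = [1.0 if b == base else 0.0 for b in dna]
        total += goertzel_power(indicator, 1.0 / 3.0)
    return total
```

A perfectly period-3 sequence such as "ATG" repeated scores a large period-3 power, whereas a homogeneous sequence scores zero, which is the contrast coding-region detectors exploit.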

Keywords: protein coding regions, period-3, anti-notch filter, Goertzel algorithm

Procedia PDF Downloads 384
43611 Evaluation of Model-Based Code Generation for Embedded Systems: A Mature Approach for Development in Evolution

Authors: Nikolay P. Brayanov, Anna V. Stoynova

Abstract:

The model-based development approach is gaining support and acceptance. Its higher abstraction level simplifies system description, allowing domain experts to do their best work without specific programming knowledge. The different levels of simulation support rapid prototyping and the verification and validation of the product even before it exists physically. Nowadays the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, bringing extra automation to the expensive device certification process, especially software qualification. Some companies using it report cost savings and quality improvements, but others claim no major changes or even cost increases. This publication examines the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using tools from The MathWorks, Inc. The model, created with Simulink, Stateflow and MATLAB, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, where applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of automatically generated embedded code with that of manually developed code. The measurements show that, in general, the code generated by the automatic approach is no worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.

Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development

Procedia PDF Downloads 240
43610 Expectation-Confirmation Model of Information System Continuance: A Meta-Analysis

Authors: Hui-Min Lai, Chin-Pin Chen, Yung-Fu Chang

Abstract:

The expectation-confirmation model (ECM) is one of the most widely used models for evaluating information system continuance, and it has been extended to other study contexts or expanded with other theoretical perspectives. However, combining the ECM with other theories, or differences in study context, may produce disparities and thus inaccurate conclusions. Habit is considered an important factor influencing users' continuance behavior. This paper therefore critically examines seven pairs of relationships from the original ECM together with the habit variable. A meta-analysis was used to track the development of ECM research over the last 10 years, drawing on journal and conference papers published in 2005–2014; forty-six journal articles and 19 conference papers were selected for analysis. The results confirm our prediction that high effect sizes are obtained for the seven pairs of relationships (ranging from r = 0.386 to r = 0.588). Furthermore, meta-analytic structural equation modeling was performed to test all relationships simultaneously. The results show that habit had a significant positive effect on continuance intention at p ≤ 0.05 and that the six other pairs of relationships were significant at p < 0.10. Based on the findings, we refined our original research model, and an alternative model is proposed for understanding and predicting information system continuance. Theoretical implications are also discussed.
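Pooled correlation effect sizes of the kind reported above (r = 0.386 to r = 0.588) are commonly obtained by averaging Fisher z-transformed correlations weighted by sample size. The sketch below shows that standard fixed-effect pooling step with invented numbers; it is not the paper's data or its exact procedure.

```python
import math

def pool_correlations(studies):
    """Fixed-effect pooling of correlations via Fisher's z transform.
    `studies` is a list of (r, n) pairs; each study is weighted by n - 3,
    the inverse variance of its z-transformed correlation."""
    num = sum((n - 3) * math.atanh(r) for r, n in studies)
    den = sum(n - 3 for _, n in studies)
    z_bar = num / den
    return math.tanh(z_bar)  # back-transform the mean z to an r

# Three hypothetical studies (r, sample size):
r_pooled = pool_correlations([(0.40, 120), (0.55, 200), (0.50, 80)])
```

The back-transformed pooled r always lies within the range of the study-level correlations, with larger samples pulling it toward their estimates.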

Keywords: expectation-confirmation theory, expectation-confirmation model, meta-analysis, meta-analytic structural equation modeling

Procedia PDF Downloads 293
43609 Saudi Twitter Corpus for Sentiment Analysis

Authors: Adel Assiri, Ahmed Emam, Hmood Al-Dossari

Abstract:

Sentiment analysis (SA) has received growing attention in Arabic language research. However, few studies have applied SA directly to Arabic due to the lack of publicly available datasets for this language. This paper partially bridges this gap through its focus on one of the Arabic dialects, the Saudi dialect. It presents an annotated dataset of 4700 entries for Saudi dialect sentiment analysis, with an inter-annotator agreement of K = 0.807. Our next step is to extend this corpus and to create a large-scale lexicon for the Saudi dialect from it.
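Assuming the reported K = 0.807 is an inter-annotator agreement statistic such as Cohen's kappa (the usual choice for two annotators), it can be computed as in this minimal sketch; the label sequences are invented, not the corpus data.

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa between two annotators' equal-length label sequences:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    categories = set(labels_a) | set(labels_b)
    # Chance agreement from each annotator's marginal label proportions
    expected = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)
```

Values above roughly 0.8, like the one reported, are conventionally read as almost-perfect agreement, which supports the reliability of the annotation.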

Keywords: Arabic, sentiment analysis, Twitter, annotation

Procedia PDF Downloads 626
43608 Document-level Sentiment Analysis: An Exploratory Case Study of Low-resource Language Urdu

Authors: Ammarah Irum, Muhammad Ali Tahir

Abstract:

Document-level sentiment analysis in Urdu is a challenging Natural Language Processing (NLP) task due to the difficulty of working with lengthy texts in a language with constrained resources. Deep learning models, which are complex neural network architectures, are well suited to text-based applications in addition to data formats like audio, image, and video. To investigate the potential of deep learning for Urdu sentiment analysis, we implemented five deep learning models, including Bidirectional Long Short-Term Memory (BiLSTM), Convolutional Neural Network (CNN), Convolutional Neural Network with Bidirectional Long Short-Term Memory (CNN-BiLSTM), and Bidirectional Encoder Representations from Transformers (BERT). In this study, we developed a hybrid deep learning model called BiLSTM-Single Layer Multi Filter Convolutional Neural Network (BiLSTM-SLMFCNN) by fusing the BiLSTM and CNN architectures. The proposed and baseline techniques were applied to the Urdu Customer Support dataset and the IMDB Urdu movie review dataset using pre-trained Urdu word embeddings suitable for document-level sentiment analysis. The results of these techniques were evaluated, and our proposed model outperforms all other deep learning techniques for Urdu sentiment analysis. BiLSTM-SLMFCNN outperformed the baseline deep learning models and achieved 83%, 79%, 83% and 94% accuracy on the small, medium and large IMDB Urdu movie review datasets and the Urdu Customer Support dataset, respectively.

Keywords: urdu sentiment analysis, deep learning, natural language processing, opinion mining, low-resource language

Procedia PDF Downloads 66
43607 How Holton’s Thematic Analysis Can Help to Understand Why Fred Hoyle Never Accepted Big Bang Cosmology

Authors: Joao Barbosa

Abstract:

After an intense dispute between big bang cosmology and its great rival, steady-state cosmology, important experimental observations, such as the determination of the helium abundance in the universe and the discovery of the cosmic background radiation in the 1960s, were decisive for the progressive and wide acceptance of big bang cosmology and the inevitable abandonment of the steady-state theory. But despite solid theoretical support and these experimental observations favorable to big bang cosmology, Fred Hoyle, one of the proponents of the steady-state theory and the main opponent of the idea of the big bang (which, paradoxically, he himself had named), never gave up, and continued to fight for the idea of a stationary (or quasi-stationary) universe until the end of his life, even after decades of widespread consensus around big bang cosmology. We can try to understand this persistent attitude by applying Holton's thematic analysis to cosmology. Holton recognizes in scientific activity a dimension that, even when unconscious or unacknowledged, is nevertheless very important in the work of scientists, implicitly articulated with the experimental and theoretical dimensions of science. This is the thematic dimension, constituted by themata: concepts, methodologies, and hypotheses of a metaphysical, aesthetic, logical, or epistemological nature, associated both with the cultural context and with the individual psychology of scientists. In practice, themata are expressed through personal preferences and choices that guide the individual and collective work of scientists. Thematic analysis shows that big bang cosmology is mainly based on a set of themata consisting of evolution, finitude, life cycle, and change; steady-state cosmology is based on the opposite themata: steady state, infinity, continuous existence, and constancy.
The passionate controversy between these cosmological views is part of an old cosmological opposition: the thematic opposition between an evolutionary view of the world (associated with Heraclitus) and a stationary view (associated with Parmenides). Personal preferences seem to have been important in this (thematic) controversy, and the thematic analysis developed here shows that Hoyle is a very illustrative example of a life-long personal commitment to certain themata, in this case the themata opposite to those of big bang cosmology. His struggle against the big bang idea was strongly based on philosophical and even religious reasons, which, in a certain sense and from a Holtonian perspective, relates to thematic preferences. In this personal and persistent struggle, Hoyle always refused to accept the way some experimental observations were considered decisive in favor of the big bang idea, arguing that the success of this idea rested on sociological and cultural prejudices. This attitude of Hoyle's is a personal thematic attitude, in which the acceptance or rejection of what is presented as proof or scientific fact is conditioned by themata: what is a proof or a scientific fact for one scientist is something yet to be established for another scientist who defends different or even opposite themata.

Keywords: cosmology, experimental observations, Fred Hoyle, interpretation, life-long personal commitment, themata

Procedia PDF Downloads 163
43606 Effects of Plant Densities on Seed Yield and Some Agricultural Characteristics of Jofs Pea Variety

Authors: Ayhan Aydoğdu, Ercan Ceyhan, Ali Kahraman, Nursel Çöl

Abstract:

This research was conducted to determine the effects of plant density on seed yield and some agricultural characteristics of the pea variety Jofs under Konya ecological conditions during the 2012 vegetation period. The trial was set up according to a "Randomized Blocks Design" with three replications. The Jofs pea variety was grown at three row spacings (30, 40 and 50 cm) and three intra-row distances (5, 10 and 15 cm). According to the results, the effects of row spacing and intra-row distance on seed yield were statistically significant. The highest seed yield was 2582.1 kg ha-1 at a row spacing of 30 cm, and 2562.2 kg ha-1 at an intra-row distance of 15 cm. Consequently, the optimum planting density for the Jofs pea variety grown in Konya was determined to be 30 x 15 cm.

Keywords: pea, row space, row distance, seed yield

Procedia PDF Downloads 567
43605 Institutional Segmentation and Country Clustering: Implications for Multinational Enterprises over Standardized Management

Authors: Jung-Hoon Han, Jooyoung Kwak

Abstract:

Distances between cultures and institutions are gaining academic attention once again since the classical debate on the validity of globalization. Despite incessant efforts to define international segments with various concepts, no significant attempts have been made to consider the institutional dimensions. Resource-based theory and institutional theory provide useful insights for assessing market environments and understanding when and how MNEs lose or gain advantages. This study consists of two parts: identifying institutional clusters and predicting the effect of MNEs' country of origin on the applicability of their competitive advantages. MNEs in the same country cluster are expected to be able to use similar management systems.

Keywords: institutional theory, resource-based theory, institutional environment, cultural dimensions, cluster analysis, standardized management

Procedia PDF Downloads 481
43604 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences

Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng

Abstract:

Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during interaction, a Kalman filter was used to retain a complete trajectory for each human object. Finally, motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved classification accuracies of 80% for interaction events and 95% for non-interaction events. In summary, we have explored the idea of a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated into an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting, etc.).
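The occlusion-bridging role of the Kalman filter described above can be sketched with a minimal constant-velocity filter for one trajectory coordinate: while the detector loses the person, the filter is stepped without a measurement and coasts on its velocity estimate. The one-frame time step and the noise values are illustrative assumptions, not the authors' settings.

```python
class Kalman1D:
    """Constant-velocity Kalman filter for one trajectory coordinate (dt = 1 frame).
    During occlusion, call step(None) to predict through the gap."""

    def __init__(self, q=1e-3, r=0.25):
        self.x = [0.0, 0.0]                # state: position, velocity
        self.p = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
        self.q, self.r = q, r              # process / measurement noise (illustrative)

    def step(self, z):
        # Predict: x <- F x, P <- F P F^T + Q, with F = [[1, 1], [0, 1]]
        x0 = self.x[0] + self.x[1]
        x1 = self.x[1]
        (p00, p01), (p10, p11) = self.p
        p00, p01 = p00 + p01 + p10 + p11 + self.q, p01 + p11
        p10, p11 = p10 + p11, p11 + self.q
        if z is not None:
            # Update with a position measurement (H = [1, 0])
            s = p00 + self.r
            k0, k1 = p00 / s, p10 / s
            y = z - x0
            x0, x1 = x0 + k0 * y, x1 + k1 * y
            p00, p01, p10, p11 = ((1 - k0) * p00, (1 - k0) * p01,
                                  p10 - k1 * p00, p11 - k1 * p01)
        self.x = [x0, x1]
        self.p = [[p00, p01], [p10, p11]]
        return x0

f = Kalman1D()
for t in range(20):      # observed part of the track, 1 px/frame
    f.step(float(t))
for _ in range(3):       # three occluded frames: predict only
    f.step(None)
```

After the three blind predictions the position estimate continues along the learned velocity, giving a complete trajectory for the later derivative-based speed analysis.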

Keywords: motion detection, motion tracking, trajectory analysis, video surveillance

Procedia PDF Downloads 544
43603 Dissatisfaction as a Cause of Social Uprisings: An Empirical Analysis Utilizing the Social Uprisings Composite Indicator

Authors: Sondos Shaheen

Abstract:

This paper employs a newly constructed composite indicator of social uprisings (SUCI) to analyze the causes of their occurrence. The empirical study is based on an unbalanced panel of 45 countries over the period 1982–2007. The paper's contribution to the literature is to distinguish between the causes of violent and nonviolent uprisings. The analysis shows that certain variables have a significant impact on both violent and nonviolent uprisings in terms of relative SUCI values, for example, ethnic fractionalization and mountainous terrain. Nevertheless, differences between the causes of violent and nonviolent uprisings can be found. For example, life dissatisfaction is related to nonviolent social uprisings, but when life dissatisfaction is accompanied by democratic dissatisfaction, violent social uprisings are more likely.

Keywords: social uprisings, relative deprivation, dissatisfaction, mobilization, anti-government movements, causes

Procedia PDF Downloads 223
43602 Decoding the Natural Hazards: The Data Paradox, Juggling Data Flows, Transparency and Secrets, Analysis of Khuzestan and Lorestan Floods of Iran

Authors: Kiyanoush Ghalavand

Abstract:

We face a complex paradox in the agriculture and environment sectors in the age of technology. On the one hand, the achievements of the science and information ages are shaping a future that may be more dangerous than in past decades. On the other, the progress of the past decades is historic, connecting people, empowering individuals, groups, and states, and lifting thousands of people out of poverty in the process. Floods are the most frequent, damaging, and recurrent of all natural disasters in Iran, and in recent years they have been morphing into new and even more devastating forms. Khuzestan and Lorestan Provinces experienced heavy rains that began on March 28, 2019, and led to unprecedented, widespread flooding and landslides across both provinces. The study was based on both secondary and primary data. A questionnaire-based primary survey was conducted; data were collected using a specially designed questionnaire and other instruments, such as focus groups, interview schedules, inception workshops, and roundtable discussions with stakeholders at different levels. Farmers in Khuzestan and Lorestan Provinces formed the statistical population for this study. Data were analyzed with several software packages, including ATLAS.ti, NVivo, SPSS, and EViews. A factor analysis conducted for the present study categorized ten groups of factors: climatic, economic, cultural, supportive, instructive, planning, military, policymaking, geographical, and human. Together they explained 71.6 percent of the obstacles to flood management in the agricultural sector in Lorestan and Khuzestan Provinces. Several recommendations are made based on the study findings.

Keywords: chaos theory, natural hazards, risks, environmental risks, paradox

Procedia PDF Downloads 141
43601 Reference Management Software: Comparative Analysis of RefWorks and Zotero

Authors: Sujit K. Basak

Abstract:

This paper presents a comparison of the reference management software packages RefWorks and Zotero. The results were drawn by comparing the two packages, and the comparative analysis shows that RefWorks can import more information from Google Scholar for researchers. This finding can help researchers decide which reference management software to use.

Keywords: analysis, comparative analysis, reference management software, researchers

Procedia PDF Downloads 535
43600 Minimizing Vehicular Traffic via Integrated Land Use Development: A Heuristic Optimization Approach

Authors: Babu Veeregowda, Rongfang Liu

Abstract:

The current traffic impact assessment methodology and environmental quality review process for the approval of land development projects are conventional, stagnant, and one-dimensional. The environmental review policy and procedure lack direction on regulating, or seeking alternatives to, land uses and sizes that exploit the existing or surrounding elements of the built environment (the '4 Ds' of development: Density, Diversity, Design, and Distance to Transit) or smart growth principles, which influence travel behavior and have a significant effect in reducing vehicular traffic. Additionally, environmental review policy gives no direction on how to incorporate urban planning into development, for example through non-motorized roadway elements such as sidewalks, bus shelters, and access to community facilities. This research developed a methodology to optimize the mix of land uses and sizes using a heuristic optimization process, so as to minimize auto-dependent development while meeting the interests of key stakeholders. A case study of the Willets Point Mixed-Use Development in Queens, New York, was used to assess the benefits of the methodology. The approved Willets Point project was based on the maximum envelope of size and land use types allowed by the current conventional urban renewal plans. This paper also evaluates parking accumulation for various land uses to explore the potential for shared parking to further optimize the mix of land uses and sizes. This research is timely and useful to the many stakeholders interested in understanding the benefits of integrated land use development.
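A heuristic optimization over a land-use mix of the kind described can be sketched as a simple hill climb that reallocates floor area to minimize generated trips. The trip rates, land-use categories, and single fixed-area constraint below are purely illustrative (a real study would use calibrated trip generation rates and stakeholder constraints); without minimum-area constraints per use, the toy search simply drifts to the lowest-rate use.

```python
import random

TRIP_RATES = {"residential": 0.6, "retail": 3.8, "office": 1.5}  # trips per ksf, hypothetical
TOTAL_KSF = 1000.0  # total floor area to allocate, in thousands of sq ft

def trips(mix):
    """Total trips generated by a land-use mix {use: ksf}."""
    return sum(TRIP_RATES[use] * ksf for use, ksf in mix.items())

def hill_climb(steps=2000, seed=0):
    """Greedy hill climb: shift floor area between uses, keep improving moves."""
    rng = random.Random(seed)
    uses = list(TRIP_RATES)
    mix = {u: TOTAL_KSF / len(uses) for u in uses}  # start from an even split
    best = trips(mix)
    for _ in range(steps):
        a, b = rng.sample(uses, 2)
        delta = min(10.0, mix[a])      # shift up to 10 ksf from use a to use b
        mix[a] -= delta
        mix[b] += delta
        current = trips(mix)
        if current <= best:
            best = current             # keep the improving (or neutral) move
        else:
            mix[a] += delta            # revert the worsening move
            mix[b] -= delta
    return mix, best

mix, best = hill_climb()
```

Adding constraints (e.g. a minimum retail share, or a shared-parking term in the objective) is what turns this toy into the multi-stakeholder optimization the abstract describes.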

Keywords: traffic impact, mixed use, optimization, trip generation

Procedia PDF Downloads 209
43599 The Analysis of Secondary Case Studies as a Starting Point for Grounded Theory Studies: An Example from the Enterprise Software Industry

Authors: Abilio Avila, Orestis Terzidis

Abstract:

A fundamental principle of Grounded Theory (GT) is to prevent the formation of preconceived theories. This implies the need to start a research study with an open mind and to avoid being absorbed by the existing literature. However, starting a new study without an understanding of the research domain and its context can be extremely challenging. This paper presents a research approach that supports a researcher in identifying and focusing on critical areas of a research project while preventing the formation of concepts prejudiced by the current body of literature. The approach comprises four stages: selection of secondary case studies, analysis of secondary case studies, development of an initial conceptual framework, and development of an initial interview guide. Analyzing secondary case studies as the starting point of a research project allows a researcher to form a first understanding of a research area based on real-world cases without being influenced by the existing body of theory. It enables a researcher to develop, through a structured course of action, a firm guide that establishes a solid starting point for further investigation. The described approach may therefore have significant implications for GT researchers who aim to start a study within a given research area.

Keywords: grounded theory, interview guide, qualitative research, secondary case studies, secondary data analysis

Procedia PDF Downloads 261
43598 Derivation of a Risk-Based Level of Service Index for Surface Street Network Using Reliability Analysis

Authors: Chang-Jen Lan

Abstract:

The current Level of Service (LOS) index adopted in the Highway Capacity Manual (HCM) for signalized intersections on surface streets is based on the average intersection delay. The delay thresholds defining the LOS grades are subjective and unrelated to critical traffic conditions. For example, an intersection delay of 80 sec per vehicle for the failing LOS grade F does not necessarily correspond to the intersection capacity. Also, a specific value of average delay may result from delay minimization, delay equalization, or other meaningful optimization criteria. To that end, a reliability version of the intersection critical degree of saturation (v/c) is introduced as the LOS index. Traditionally, the degree of saturation at a signalized intersection is defined as the ratio of the critical volume sum (per lane) to the average saturation flow (per lane) during all available effective green time within a cycle. The critical sum is the sum of the maximal conflicting movement-pair volumes in the northbound-southbound and eastbound-westbound rights of way. In this study, both movement volume and saturation flow are assumed to follow log-normal distributions. Because, when the conditions of the central limit theorem hold, the product of independent positive random variables tends toward a log-normal distribution in the limit, the critical degree of saturation is expected to be log-normally distributed as well. Derivation of the risk index predictive limits is complex due to the maximum and absolute value operators, as well as the ratio of random variables. A fairly accurate functional form for the predictive limit at a user-specified significance level is derived. The predictive limit is then compared with the designated LOS thresholds for the intersection critical degree of saturation (denoted as X
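Setting aside the maximum and absolute-value operators that complicate the full derivation, the core log-normal composition can be sketched as follows: if critical volume V and saturation flow S are independent log-normals, then ln(V/S) is normal and an upper predictive limit at significance level alpha follows directly. The parameter values below are illustrative, not the paper's calibration.

```python
import math
from statistics import NormalDist

def vc_upper_limit(mu_v, sig_v, mu_s, sig_s, alpha=0.05):
    """Upper predictive limit of X = V/S for independent log-normal V and S:
    ln X ~ Normal(mu_v - mu_s, sqrt(sig_v^2 + sig_s^2)),
    so the (1 - alpha) limit is exp(mu + z_{1-alpha} * sigma)."""
    mu = mu_v - mu_s
    sigma = math.sqrt(sig_v ** 2 + sig_s ** 2)
    z = NormalDist().inv_cdf(1.0 - alpha)
    return math.exp(mu + z * sigma)

# Illustrative: median critical volume 1500 veh/h, median saturation flow 1800 veh/h
limit = vc_upper_limit(math.log(1500), 0.10, math.log(1800), 0.05, alpha=0.05)
```

At alpha = 0.5 the limit collapses to the median ratio (1500/1800 here), while smaller alpha inflates it, which is how the risk-based index penalizes intersections whose demand uncertainty pushes v/c toward capacity.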

Keywords: reliability analysis, level of service, intersection critical degree of saturation, risk based index

Procedia PDF Downloads 128
43597 Environmental Life Cycle Assessment of Circular, Bio-Based and Industrialized Building Envelope Systems

Authors: N. Cihan Kayaçetin, Stijn Verdoodt, Alexis Versele

Abstract:

The construction industry accounts for one-third of all waste generated in the countries of the European Union (EU). The Circular Economy Action Plan of the EU aims to tackle this issue and aspires to enhance the sustainability of the construction industry by adopting more circular principles and bio-based material use. The Interreg Circular Bio-Based Construction Industry (CBCI) project was conceived to research how this adoption can be facilitated. For this purpose, an approach was developed that integrates technical, legal and social aspects and provides business models for circular design and building with bio-based materials. Within the project, the research outputs are displayed in a real-life setting by constructing a demo terraced single-family house, the living lab (LL), located in Ghent (Belgium). The realization of the LL follows a step-wise approach that includes iterative processes of design, description, criteria definition and multi-criteria assessment of building components. The essence of the research lies in the exploratory approach to state-of-the-art building envelope and technical system options for achieving an optimal combination for circular and bio-based construction. For this purpose, nine preliminary designs (PDs) for the building envelope were generated, comprising three basic construction methods, masonry, lightweight steel construction and wood framing construction, supplemented with bio-based construction methods such as cross-laminated timber (CLT) and massive wood framing. A comparative analysis of the PDs was conducted using several complementary tools to assess circularity. This paper focuses on the life cycle assessment (LCA) approach for evaluating the environmental impact of the LL Ghent. The adoption of an LCA methodology was considered critical for providing a comprehensive set of environmental indicators.
The PDs were developed at the component level, in particular for the (i) inclined roof, (ii-iii) front and side facades, (iv) internal walls and (v-vi) floors. The assessment was conducted at two levels: component and building. The options for each component were compared in a first iteration, and the PDs, as assemblies of components, were then further analyzed. The LCA was based on a functional unit of one square meter of each component, and CEN indicators were used for impact assessment over a reference study period of 60 years. A total of 54 building components composed of 31 distinct materials were evaluated in the study. The results indicate that wood framing construction supplemented with bio-based construction methods performs better environmentally than the masonry or steel construction options. An analysis of the correlation between the total weight of components and environmental impact was also conducted. Masonry structures display high environmental impact and high weight; steel structures display low weight but relatively high environmental impact; and wood framing construction displays both low weight and low environmental impact. The study provided valuable outputs at two levels: (i) several improvement options at the component level through substitution of materials with critical weight and/or impact per unit, and (ii) feedback on environmental performance for the decision-making process during the design phase of a circular single-family house.
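The component-level aggregation over a functional unit of one square meter and a 60-year reference study period can be sketched as below; the material layers, impact factors and service lives are purely illustrative, not the study's inventory data or its full CEN indicator set.

```python
import math

REFERENCE_PERIOD = 60  # reference study period in years

def component_gwp(layers):
    """Global warming potential (kg CO2-eq per m^2) of one building component.
    Each layer is (mass_kg_per_m2, gwp_factor_kg_co2_per_kg, service_life_years);
    layers with a service life shorter than the period are counted multiple times
    (initial installation plus replacements)."""
    total = 0.0
    for mass, factor, life in layers:
        installations = math.ceil(REFERENCE_PERIOD / life)
        total += mass * factor * installations
    return total

# Hypothetical wall build-ups: (mass, GWP factor, service life)
timber_wall = [(25.0, 0.4, 60), (8.0, 1.2, 30)]    # e.g. CLT panel + insulation
masonry_wall = [(180.0, 0.2, 60), (15.0, 0.9, 60)]  # e.g. brick + render
```

Even with a lower impact factor per kilogram, the masonry option scores worse here because of its much larger mass, mirroring the weight-impact correlation the study reports.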

Keywords: circular and bio-based materials, comparative analysis, life cycle assessment (LCA), living lab

Procedia PDF Downloads 182
43596 Hormones and Mineral Elements Associated with Osteoporosis in Postmenopausal Women in Eastern Slovakia

Authors: M. Mydlárová Blaščáková, J. Poráčová, Z. Tomková, Ľ. Blaščáková, M. Nagy, M. Konečná, E. Petrejčíková, Z. Gogaľová, V. Sedlák, J. Mydlár, M. Zahatňanská, K. Hricová

Abstract:

Osteoporosis is a multifactorial disease that reduces quality of life, decreases bone strength and changes bone microarchitecture. Postmenopausal women are most at risk. In our study, we measured the anthropometric parameters of postmenopausal women (104 women in a control group, CG, and 105 women in an osteoporotic group, OG) and determined the levels of the hormones TSH and PTH as well as the mineral elements Ca, P and Mg and the enzyme alkaline phosphatase. Through correlation analysis, in the CG we found associations between age and BMI, P and Ca, and Mg and Ca; in the OG we determined associations between age and BMI and between age and Ca. Using Student's t-test, we found statistically significant differences in the biochemical parameters Mg (p < 0.001) and TSH (p < 0.05) between the CG and OG.
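
The between-group comparison with Student's t-test can be sketched in a few lines; the serum values below are hypothetical illustrations, not the study's measurements.

```python
import math

def students_t(a, b):
    """Two-sample Student's t statistic with pooled variance."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # unbiased variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# hypothetical serum Mg values (mmol/L) for control vs. osteoporotic groups
cg = [0.85, 0.88, 0.90, 0.84, 0.87]
og = [0.78, 0.80, 0.76, 0.79, 0.81]
t = students_t(cg, og)   # a large |t| suggests a significant group difference
```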

Keywords: factors, bone mass density, Central Europe, biomarkers

Procedia PDF Downloads 192
43595 Structure and Activity Research of Hydrocarbons Refining Catalysts Based on Wastes of Ferroalloy Production

Authors: Zhanat Shomanova, Ruslan Safarov, Yuri Nosenko, Zheneta Tashmuchambetova, Alima Zharmagambetova

Abstract:

An effective way to utilize ferroalloy production wastes is to prepare hydrocarbon refining catalysts from them, which is possible owing to the transition metals contained in the wastes. In this work, we present the results of elemental analysis of sludge samples from the Aksu ferroalloy plant (Aksu, Kazakhstan), the catalyst preparation method, the results of physical-chemical analysis of the obtained catalysts (X-ray analysis, electron microscopy, the BET method, etc.) and the results of using the catalysts in hydrocarbon refining processes such as hydrocracking of rubber waste, cracking of gasoil and oxidation of cyclohexane. The main catalytic activity results are: (a) in hydrocracking of rubber waste, 64.9% of the liquid products were fuel fractions; (b) in cracking of gasoil, the conversion was 51% and the selectivity to liquid products was 99%; (c) in oxidation of cyclohexane, a maximal product yield of 87.9% and a selectivity to cyclohexanol of 93.0% were achieved.

Keywords: catalyst, cyclohexane oxidation, ferroalloy production waste, gasoil cracking

Procedia PDF Downloads 264
43594 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process

Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma

Abstract:

As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes which offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when scaling up the manufacturing process. This project evaluates the techno-economic value proposition of an integrated industrial scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. This is done by developing a process model of a co-precipitation synthesis route using Aspen Plus software, validated against experimental data. The reaction chemistry and equilibrium conditions were established based on previous literature and HSC-Chemistry software. This is followed by integrating the energy streams, adding waste recovery and treatment processes, and testing the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs and raw materials) to calculate the cost of CAM ($/kg and $/kWh), the total plant cost ($) and the net present value (NPV). This work sets the foundational blueprint for future research into sustainable industrial scale processes for CAM manufacturing.
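
The NPV step of the economic analysis is a standard discounted-cash-flow calculation; a minimal sketch, with a purely hypothetical capital cost, cash-flow profile and discount rate (not results from the study):

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the year-0 flow (e.g. -capex)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# hypothetical CAM plant: $120M capex, then $25M net cash flow for 10 years
flows = [-120e6] + [25e6] * 10
project_npv = npv(0.08, flows)   # 8% discount rate, assumed
viable = project_npv > 0         # a positive NPV supports the investment case
```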

Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis

Procedia PDF Downloads 96
43593 Magnetic Navigation of Nanoparticles inside a 3D Carotid Model

Authors: E. G. Karvelas, C. Liosis, A. Theodorakakos, T. E. Karakasidis

Abstract:

Magnetic navigation of a drug inside the human vessels is a very important concept, since the drug is delivered to the desired area. Consequently, the quantity of drug required to reach therapeutic levels is reduced while the drug concentration at targeted sites is increased. Magnetic navigation of drug agents can be achieved with the use of magnetic nanoparticles, where anti-tumor agents are loaded on the surface of the nanoparticles. The magnetic field that is required to navigate the particles inside the human arteries is produced by a magnetic resonance imaging (MRI) device. The main factors which influence the efficiency of magnetic nanoparticles in biomedical magnetic-driving applications are the size and the magnetization of the biocompatible nanoparticles. In this study, a computational platform for the simulation of the optimal gradient magnetic fields for the navigation of magnetic nanoparticles inside a carotid artery is presented. For the propulsion model of the particles, seven major forces are considered, i.e., the magnetic force from the MRI's main magnet static field as well as the magnetic field gradient force from the special propulsion gradient coils. The static field is responsible for the aggregation of nanoparticles, while the magnetic gradient contributes to the navigation of the agglomerates that are formed. Moreover, the contact forces between the aggregated nanoparticles and the wall and the Stokes drag force on each particle are considered, while only spherical particles are used in this study. In addition, the gravitational force and the buoyancy force are included. Finally, the van der Waals force and Brownian motion are taken into account in the simulation. The OpenFOAM platform is used for the calculation of the flow field and the uncoupled equations of the particles' motion.
To determine the optimal gradient magnetic fields, a covariance matrix adaptation evolution strategy (CMA-ES) is used to navigate the particles into the desired area. A desired trajectory, along which the particles are to be navigated, is inserted into the computational geometry. Initially, the CMA-ES optimization strategy provides the OpenFOAM program with random values of the gradient magnetic field. At the end of each simulation, the computational platform evaluates the distance between the particles and the desired trajectory. The present model can simulate the motion of particles when they are navigated by the magnetic field produced by the MRI device. Under the influence of fluid flow, the model investigates the effect of different gradient magnetic fields in order to minimize the distance of the particles from the desired trajectory. The platform can navigate the particles along the desired trajectory with an efficiency of 80-90%. On the other hand, a small number of particles get stuck to the walls and remain there for the rest of the simulation.
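
The optimization loop can be illustrated with a toy derivative-free (1+λ) evolution strategy, a much-simplified stand-in for CMA-ES, tuning a constant gradient value so a 1D particle's end position reaches a target; the particle model and all constants are purely illustrative, not from the study.

```python
import random
random.seed(0)

def final_distance(g, target=2.0, k=0.5, dt=0.01, steps=200):
    """Toy 1D particle whose velocity is proportional to the applied
    gradient g; returns the end-of-run distance from the target."""
    x = 0.0
    for _ in range(steps):
        x += k * g * dt
    return abs(x - target)

# (1 + lambda) evolution strategy with step-size adaptation: a crude
# stand-in for the CMA-ES loop that feeds field values to the flow solver
best_g, best_f, sigma = 0.0, final_distance(0.0), 1.0
for _ in range(100):
    candidates = [best_g + sigma * random.gauss(0.0, 1.0) for _ in range(8)]
    g = min(candidates, key=final_distance)
    if final_distance(g) < best_f:
        best_g, best_f = g, final_distance(g)
        sigma *= 1.2     # expand the search on success
    else:
        sigma *= 0.8     # contract it on failure
```

Each candidate evaluation plays the role of one OpenFOAM run, and the distance from the desired trajectory is the fitness being minimized.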

Keywords: artery, drug, nanoparticles, navigation

Procedia PDF Downloads 103
43592 Methodology to Achieve Non-Cooperative Target Identification Using High Resolution Range Profiles

Authors: Olga Hernán-Vega, Patricia López-Rodríguez, David Escot-Bocanegra, Raúl Fernández-Recio, Ignacio Bravo

Abstract:

Non-Cooperative Target Identification has become a key research domain in the defense industry, since it provides the ability to recognize targets at long distance and under any weather condition. High Resolution Range Profiles, one-dimensional radar images in which the reflectivity of a target is projected onto the radar line of sight, are widely used for the identification of flying targets. To face this problem, an approach to Non-Cooperative Target Identification based on applying Singular Value Decomposition to a matrix of range profiles is presented. Target identification based on one-dimensional radar images compares a collection of profiles of a given target, the test set, with the profiles included in a pre-loaded database, the training set. The classification is improved by using Singular Value Decomposition, since it allows each aircraft to be modelled as a subspace and recognition to be accomplished in a transformed domain where the main features are easier to extract, hence reducing unwanted information such as noise. Singular Value Decomposition permits the definition of a signal subspace, which contains the highest percentage of the energy, and a noise subspace, which is discarded. This way, only the valuable information of each target is used in the recognition process. The identification algorithm is based on finding the target that minimizes the angle between subspaces and takes place in a transformed domain. Two metrics based on Singular Value Decomposition, F1 and F2, are computed in the identification process. In the case of F2 the angle is weighted, since the top vectors set the importance of the contribution to the formation of a target signal, whereas F1 simply shows the evolution of the unweighted angle.
In order to have a wide database of radar signatures and to evaluate the performance, range profiles are obtained through numerical simulation of seven civil aircraft at defined trajectories taken from an actual measurement. Given the nature of the datasets, the main drawback of using simulated profiles instead of actual measured profiles is that the former implies an ideal identification scenario: measured profiles suffer from noise, clutter and other unwanted information, while simulated profiles do not. In this case, the test and training samples have a similar nature and usually a similar high signal-to-noise ratio, so, to assess the feasibility of the approach, the addition of noise to the test set has been considered. The identification results obtained with the unweighted and weighted metrics are analysed to determine which algorithm provides the best robustness against noise in a realistic scenario. To confirm the validity of the methodology, identification experiments with profiles coming from electromagnetic simulations are conducted, revealing promising results. Considering the dissimilarities between the test and training sets when noise is added, the recognition performance improved when weighting was applied. Future experiments with larger sets are expected, with the aim of finally using actual profiles as test sets in a real hostile situation.
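
The subspace idea can be sketched generically: the leading left singular vectors of each class's training matrix span its signal subspace, and a test profile is assigned to the class whose subspace makes the smallest angle with it. This is a hedged sketch with synthetic data, not the authors' exact F1/F2 metrics.

```python
import numpy as np
rng = np.random.default_rng(1)

def signal_subspace(profiles, r):
    """Orthonormal basis of the rank-r signal subspace of a
    (samples x features) matrix of range profiles."""
    u, _, _ = np.linalg.svd(profiles.T, full_matrices=False)
    return u[:, :r]

def angle_to(subspace, x):
    """Angle between a test profile and a signal subspace."""
    x = x / np.linalg.norm(x)
    proj = subspace @ (subspace.T @ x)       # projection onto the subspace
    return np.arccos(np.clip(np.linalg.norm(proj), 0.0, 1.0))

# two synthetic 'aircraft' classes with different dominant directions
n, dim, r = 40, 16, 3
basis_a = np.linalg.qr(rng.standard_normal((dim, r)))[0]
basis_b = np.linalg.qr(rng.standard_normal((dim, r)))[0]
train_a = (basis_a @ rng.standard_normal((r, n))).T + 0.01 * rng.standard_normal((n, dim))
train_b = (basis_b @ rng.standard_normal((r, n))).T + 0.01 * rng.standard_normal((n, dim))

sub_a, sub_b = signal_subspace(train_a, r), signal_subspace(train_b, r)
test_profile = basis_a @ rng.standard_normal(r)   # unseen class-A profile
label = "A" if angle_to(sub_a, test_profile) < angle_to(sub_b, test_profile) else "B"
```

Truncating to the top r singular vectors is what discards the noise subspace before the angle comparison.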

Keywords: HRRP, NCTI, simulated/synthetic database, SVD

Procedia PDF Downloads 350
43591 Developing a Knowledge-Based Lean Six Sigma Model to Improve Healthcare Leadership Performance

Authors: Yousuf N. Al Khamisi, Eduardo M. Hernandez, Khurshid M. Khan

Abstract:

Purpose: This paper presents a Knowledge-Based (KB) model using Lean Six Sigma (L6σ) principles to enhance the performance of healthcare leadership. Design/methodology/approach: Using L6σ principles to enhance healthcare leaders’ performance requires a pre-assessment of the healthcare organisation’s capabilities. The model is developed using a rule-based KB system approach: the KB system embeds Gauging Absence of Pre-requisites (GAP) for benchmarking and the Analytical Hierarchy Process (AHP) for prioritization. A comprehensive literature review covers the main contents of the model, with typical outputs of the GAP analysis and AHP. Findings: The proposed KB system benchmarks the current position of healthcare leadership against an ideal benchmark (resulting from extensive evaluation by the KB/GAP/AHP system of international leadership concepts in healthcare environments). Research limitations/implications: Future work includes validating the implementation model in healthcare environments around the world. Originality/value: This paper presents a novel application of a hybrid KB system combining GAP and AHP methodologies. It implements L6σ principles to enhance healthcare performance, assisting healthcare leaders’ decision making to reach performance improvement against a best-practice benchmark.
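
The AHP prioritization step reduces to extracting the principal eigenvector of a pairwise-comparison matrix; a minimal power-iteration sketch with a hypothetical 3x3 matrix (not one from the paper):

```python
def ahp_priorities(m, iters=100):
    """AHP priority weights: principal eigenvector of a pairwise-comparison
    matrix, computed by power iteration and normalized to sum to 1."""
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

# hypothetical Saaty-scale comparisons of three leadership improvement areas:
# m[i][j] states how much more important criterion i is than criterion j
matrix = [
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 0.5, 1.0],
]
weights = ahp_priorities(matrix)   # descending priorities for the three areas
```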

Keywords: Lean Six Sigma (L6σ), Knowledge-Based System (KBS), healthcare leadership, Gauge Absence Prerequisites (GAP), Analytical Hierarchy Process (AHP)

Procedia PDF Downloads 163
43590 Numerical Regularization of Ill-Posed Problems via Hybrid Feedback Controls

Authors: Eugene Stepanov, Arkadi Ponossov

Abstract:

Many mathematical models used in biological and other applications are ill-posed. The reason is the nature of the differential equations, where the nonlinearities are assumed to be step functions, a simplification made to ease the analysis. Prominent examples are switched systems arising from gene regulatory networks and neural field equations. This simplification leads, however, to theoretical and numerical complications. In the presentation, it is proposed to apply the theory of hybrid feedback controls to regularize the problem. Roughly speaking, one attaches a finite-state control (‘automaton’), which follows the trajectories of the original system and governs its dynamics at the points of ill-posedness. The construction of the automaton is based on the classification of the attractors of a specially designed adjoint dynamical system. This ‘hybridization’ is shown to regularize the original switched system and gives rise to efficient hybrid numerical schemes. Several examples that support the suggested analysis are provided in the presentation. The method can be of interest in other applied fields where differential equations contain step-like nonlinearities.
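
A minimal illustration of the kind of step-nonlinearity in question (a steep-sigmoid regularization sketch, not the authors' hybrid-automaton construction): the scalar switched equation x' = H(x − θ) − x is discontinuous at x = θ when H is a unit step, but becomes well-posed once H is replaced by a steep sigmoid.

```python
import math

def sigmoid(z, eps):
    """Smooth surrogate for the unit step; steepness grows as eps -> 0."""
    return 1.0 / (1.0 + math.exp(-z / eps))

def integrate(x0, theta=0.5, eps=0.01, dt=0.01, steps=2000):
    """Euler integration of x' = H(x - theta) - x with H a steep sigmoid.
    With a true step, the right-hand side jumps at x = theta."""
    x = x0
    for _ in range(steps):
        x += dt * (sigmoid(x - theta, eps) - x)
    return x

high = integrate(0.8)   # settles at the 'on' steady state near 1
low = integrate(0.2)    # settles at the 'off' steady state near 0
```

The hybrid-automaton approach of the paper governs the dynamics exactly at the switching threshold instead of smoothing it, but the two regularizations address the same ill-posedness.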

Keywords: hybrid feedback control, ill-posed problems, singular perturbation analysis, step-like nonlinearities

Procedia PDF Downloads 239
43589 Research on Strategies of Building a Child Friendly City in Wuhan

Authors: Tianyue Wan

Abstract:

Building a child-friendly city (CFC) contributes to improving the quality of urbanization and forms a local system committed to fulfilling children's rights and development. Yet work related to CFCs is still at an initial stage in China. Therefore, taking Wuhan, the most populous city in central China, as the pilot city offers a reference for other cities. Based on an analysis of theories and practical examples, this study puts forward the challenges of building a child-friendly city under the particular national conditions of China. To handle these challenges, the study uses four methods to collect status data: literature research, site observation, research inquiry and the semantic differential (SD) method. It adopts three data analysis methods: case analysis, geographic information system (GIS) analysis and the analytic hierarchy process (AHP) method. Through data analysis, this study establishes an evaluation system and appraises the current situation of Wuhan. Based on the status of Wuhan as a child-friendly city, this study proposes three strategies: 1) construct the evaluation system; 2) establish a child-friendly space system integrating 'point-line-surface'; 3) build a digitalized service platform. At the same time, the study suggests building a long-term mechanism for children's participation and multi-subject supervision covering laws, medical treatment, education, safety protection, social welfare and other aspects. Finally, conclusions on CFC strategies are drawn to promote the highest quality of life for all citizens in Wuhan.

Keywords: action plan, child friendly city, construction strategy, urban space

Procedia PDF Downloads 89
43588 Analysis of a Damage-Control Target Displacement of Reinforced Concrete Bridge Pier for Seismic Design

Authors: Mohd Ritzman Abdul Karim, Zhaohui Huang

Abstract:

A current focus in seismic engineering practice is the development of seismic design approaches centered on performance-based design. Performance-based design aims to design structures to achieve a specified performance based on damage limit states. This damage limit is more restrictive than the life-safety limit and needs to be carefully estimated to avoid damage in piers due to failure of the transverse reinforcement. In this paper, a different perspective on damage limit states is explored by integrating two damage-control material limit states, for concrete and reinforcement, through parameters such as the expected yield stress of the transverse reinforcement and the peak tension strain prior to bar buckling introduced in a recent study. This perspective, together with a modified yield displacement and a modified plastic-hinge length, is used to predict the damage-control target displacement of a reinforced concrete (RC) bridge pier. A three-dimensional (3D) finite element (FE) model was developed for estimating the damage target displacement in order to validate the proposed damage limit states. The results of the 3D FE analysis were validated against an experimental study found in the literature. The validated model was then applied to predict the damage target displacement of the RC bridge pier and to validate the proposed study. The tensile strain in the reinforcement and the compressive strain in the concrete were used to determine the predicted damage target displacement, which was compared with the proposed study. The results show that the proposed damage limit states are effective in predicting damage-control target displacements consistent with the FE simulations.

Keywords: damage-control target displacement, damage limit states, reinforced concrete bridge pier, yield displacement

Procedia PDF Downloads 153
43587 A Quadratic Approach for Generating Pythagorean Triples

Authors: P. K. Rahul Krishna, S. Sandeep Kumar, Jayanthi Sunder Raj

Abstract:

The article explores one of the important relations between numbers, the Pythagorean triples (triplets), which finds application in distance measurement, the construction of roads, towers and buildings, and wherever the Pythagoras theorem applies. Pythagorean triples are sets of three natural numbers in which the sum of the squares of two of the numbers is equal to the square of the third. There are numerous methods and equations to obtain the triplets, each with its own merits and demerits. Here, a quadratic approach for generating triples uses the hypotenuse-leg difference method. The advantage is that the variables are few, and finally only three independent variables are present.
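
The simplest instance of the hypotenuse-leg difference method, with c − b = 1, can be sketched directly: since a² = c² − b² = (c − b)(c + b) = 2b + 1, every odd a ≥ 3 gives the triple (a, (a² − 1)/2, (a² + 1)/2). A minimal sketch of this one family, not the paper's full quadratic formulation:

```python
def triples_hyp_minus_leg_one(n):
    """First n Pythagorean triples with hypotenuse - longer leg = 1,
    from a^2 = (c - b)(c + b) = 2b + 1 when c - b = 1."""
    out = []
    a = 3
    while len(out) < n:
        b = (a * a - 1) // 2
        out.append((a, b, b + 1))
        a += 2              # a must be odd for b to be a natural number
    return out

triples = triples_hyp_minus_leg_one(4)
# (3, 4, 5), (5, 12, 13), (7, 24, 25), (9, 40, 41)
```

Other choices of the hypotenuse-leg difference d yield further families by solving the same quadratic relation a² = d(2b + d).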

Keywords: arithmetic progression, hypotenuse leg difference method, natural numbers, Pythagorean triplets, quadratic equation

Procedia PDF Downloads 201
43586 Integrating Machine Learning and Rule-Based Decision Models for Enhanced B2B Sales Forecasting and Customer Prioritization

Authors: Wenqi Liu, Reginald Bailey

Abstract:

This study explores an advanced approach to enhancing B2B sales forecasting by integrating machine learning models with a rule-based decision framework. The methodology begins with the development of a machine learning classification model to predict conversion likelihood, aiming to improve accuracy over traditional methods like logistic regression. The classification model's effectiveness is measured using metrics such as accuracy, precision, recall, and F1 score, alongside a feature importance analysis to identify key predictors. Following this, a machine learning regression model is used to forecast sales value, with the objective of reducing mean absolute error (MAE) compared to linear regression techniques. The regression model's performance is assessed using MAE, root mean square error (RMSE), and R-squared metrics, emphasizing feature contribution to the prediction. To bridge the gap between predictive analytics and decision-making, a rule-based decision model is introduced that prioritizes customers based on predefined thresholds for conversion probability and predicted sales value. This approach significantly enhances customer prioritization and improves overall sales performance by increasing conversion rates and optimizing revenue generation. The findings suggest that this combined framework offers a practical, data-driven solution for sales teams, facilitating more strategic decision-making in B2B environments.
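
The rule-based layer can be sketched as threshold rules over the two model outputs, ranking leads by expected revenue; the threshold values, field names and lead data below are hypothetical illustrations.

```python
def prioritize(leads, p_min=0.6, value_min=50_000):
    """Keep leads above both thresholds, then rank by expected revenue
    (conversion probability x predicted sales value)."""
    qualified = [l for l in leads
                 if l["p_convert"] >= p_min and l["pred_value"] >= value_min]
    return sorted(qualified,
                  key=lambda l: l["p_convert"] * l["pred_value"],
                  reverse=True)

leads = [  # hypothetical model outputs for four prospects
    {"name": "acme",    "p_convert": 0.82, "pred_value": 120_000},
    {"name": "globex",  "p_convert": 0.45, "pred_value": 300_000},
    {"name": "initech", "p_convert": 0.70, "pred_value": 40_000},
    {"name": "hooli",   "p_convert": 0.65, "pred_value": 90_000},
]
ranked = prioritize(leads)   # globex (low probability) and initech
                             # (low value) are filtered out by the rules
```

In practice the two inputs would come from the classification and regression models described above, with thresholds tuned to the sales team's capacity.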

Keywords: sales forecasting, machine learning, rule-based decision model, customer prioritization, predictive analytics

Procedia PDF Downloads 4
43585 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models and algorithms for the reliability assessment of complex systems, on the basis of which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research and the evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of its structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it forms the basis of our application. The reliability assessment process included the following stages, which are reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition by a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using an orthogonalization algorithm; 5) replacement of the logical elements in the ODNF with probabilistic elements, obtaining a reliability estimation polynomial and quantifying the reliability; 6) calculation of the “weights” of the elements of the system. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created that produces a quantitative assessment of the reliability of systems with a complex structure. As a result, structural analysis of systems, research and the design of optimally structured systems are carried out.
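
Stages 3-5 can be illustrated on a tiny hypothetical system with minimal path sets {1,2} and {1,3}: orthogonalizing the DNF (x1∧x2) ∨ (x1∧x3) gives (x1∧x2) ∨ (x1∧¬x2∧x3), and substituting probabilities yields the reliability polynomial R = p1·p2 + p1·(1−p2)·p3. The brute-force check below enumerates component states; the structure and probabilities are illustrative examples, not from the paper.

```python
from itertools import product

# minimal path sets of a hypothetical 3-component system:
# the system works if components {1, 2} work or components {1, 3} work
paths = [{1, 2}, {1, 3}]
p = {1: 0.9, 2: 0.8, 3: 0.7}   # component reliabilities (hypothetical)

def system_reliability(paths, p):
    """Exact reliability by enumerating all component up/down states."""
    comps = sorted(p)
    total = 0.0
    for states in product([0, 1], repeat=len(comps)):
        up = {c for c, s in zip(comps, states) if s}
        if any(path <= up for path in paths):      # some path fully up
            prob = 1.0
            for c, s in zip(comps, states):
                prob *= p[c] if s else 1 - p[c]
            total += prob
    return total

r_exact = system_reliability(paths, p)
# closed form from the orthogonalized DNF: p1*p2 + p1*(1-p2)*p3
r_poly = p[1] * p[2] + p[1] * (1 - p[2]) * p[3]
```

The orthogonalization matters because the terms of an ODNF are mutually exclusive, so probabilities can simply be summed, which brute-force enumeration confirms here.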

Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements

Procedia PDF Downloads 63