Search results for: critical features
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8670

7020 Art Street as a Way for Reflective Thinking in the Field of Adult and Primary Education: Examples of Educational Techniques

Authors: Georgia H. Mega

Abstract:

Art street, a category of artwork displayed in public spaces, has been recognized as a potential tool for promoting reflective thinking in both adult and primary education. Educational techniques that encourage critical and creative thinking, as well as deeper reflection, have been developed and applied in educational curricula. This paper explores the potential of art street to cultivate learners' reflective awareness of multiculturalism; its main objective is to investigate the possibilities that art street offers for developing learners' critical reflection, regardless of their age. The study compares two art street works, one from Greece and one from Norway, selected for their common theme of multiculturalism. It adopts a qualitative methodology, specifically a case study approach, which allows an in-depth analysis of the two selected works and their impact on learners' reflective thinking. Data were collected through observation and analysis of the two works and were analyzed using qualitative techniques: the researchers examined the content and messages conveyed by the works and explored their impact on learners' reflective thinking. The central question is whether art street can develop learners' critical reflection toward multiculturalism, regardless of age. The findings support this notion: the selected works, despite being created by different artists and displayed in different cities, share similar content and convey messages that facilitate reflective dialogue on cultural osmosis, and both adult and primary education approaches utilize the same works to achieve reflective awareness. The paper contributes to the literature on reflective learning processes by highlighting the potential of art street as a means of encouraging reflective thinking, building on the theoretical frameworks of adult education theorists such as Freire and Mezirow and of primary education theorists such as Perkins and Project Zero.

Keywords: art street, educational techniques, multiculturalism, observation of artworks, reflective awareness

Procedia PDF Downloads 60
7019 A Mathematical Framework for Expanding a Railway’s Theoretical Capacity

Authors: Robert L. Burdett, Bayan Bevrani

Abstract:

This article considers analytical techniques for measuring and planning railway capacity expansion activities. A preliminary mathematical framework involving track duplication and section subdivision is proposed for this task. These features were chosen because they have a great effect on network performance and because prior models have not included them.
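As a toy illustration of how duplication and subdivision enter such a framework (this is not the authors' actual model, and all figures are hypothetical), theoretical corridor capacity can be sketched as the bottleneck over sections, with duplication multiplying track count and subdivision shortening the effective headway:

```python
# Illustrative sketch (not the authors' model): theoretical capacity of a
# rail corridor as the bottleneck over its sections, and the effect of
# track duplication and section subdivision. All numbers are hypothetical.

def section_capacity(traversal_time_min, tracks=1, subdivisions=1, window_min=1440):
    """Trains per day a section can pass if trains follow one
    sub-section traversal time apart (a common simplification)."""
    headway = traversal_time_min / subdivisions  # subdividing shortens headway
    return tracks * window_min / headway

def corridor_capacity(sections):
    # corridor throughput is limited by its tightest section
    return min(section_capacity(**s) for s in sections)

base = [{"traversal_time_min": 10}, {"traversal_time_min": 20}]
upgraded = [{"traversal_time_min": 10}, {"traversal_time_min": 20, "tracks": 2}]
print(corridor_capacity(base))      # bottleneck: the 20-minute section
print(corridor_capacity(upgraded))  # duplicating it doubles corridor capacity
```

Under these toy numbers, duplicating only the bottleneck section doubles corridor capacity from 72 to 144 trains per day, which is why such features dominate network performance.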

Keywords: capacity analysis, capacity expansion, railways, track subdivision, track duplication

Procedia PDF Downloads 344
7018 Geometric Model to Study the Mechanism of Machining and Predict the Damage Occurring During Milling of Unidirectional CFRP

Authors: Faisal Islam, J. Ramkumar

Abstract:

The applications of composite materials in the aerospace, sporting, and automotive industries demand high-quality machined surfaces and dimensional accuracy. Some studies have examined the fiber failure mechanisms encountered during milling of CFRP composites, but none explain the exact nature of the orientation-based fiber failure mechanisms involved. The objective of this work is to gain a better understanding of the orientation-based fiber failure mechanisms occurring on slot edges during CFRP milling. The occurrence of damage is predicted schematically from the mechanisms of material removal, which in turn depend upon the fiber cutting angle. A geometric model based on the fiber cutting angle and the fiber orientation angle is proposed that defines the critical and safe zones during machining and predicts the occurrence of delamination. Milling experiments were performed on composite samples of varying fiber orientations to verify the proposed theory. The mean pulled-out fiber length was measured from microscopic images of the damaged area to quantify the amount of damage produced. By observing the damage occurring for different fiber orientation angles and fiber cutting angles on the up-milling and down-milling edges, and correlating it with the material removal mechanisms described earlier, it is concluded that damage/delamination depends mainly on the portion of the fiber cutting angles that lies within the critical cutting-angle zone.
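The geometric test at the heart of such a model can be sketched in a few lines; note that the definition of the cutting angle and the 90-180 degree critical zone below are illustrative assumptions, not the paper's fitted values:

```python
# Hedged sketch: compute the instantaneous fiber cutting angle during milling
# and flag whether it lies in an assumed "critical" damage zone. The zone
# bounds are illustrative placeholders, not the paper's values.

def fiber_cutting_angle(fiber_orientation_deg, tool_rotation_deg):
    """Angle between the cutting-velocity direction and the fiber axis,
    folded into [0, 180)."""
    return (tool_rotation_deg - fiber_orientation_deg) % 180

def in_critical_zone(angle_deg, lo=90.0, hi=180.0):
    # assumed critical band where fiber pull-out/delamination dominates
    return lo <= angle_deg < hi

# sweep one tool revolution (30-degree steps) for a 45-degree ply
angles = [fiber_cutting_angle(45, r) for r in range(0, 360, 30)]
critical = [a for a in angles if in_critical_zone(a)]
```

Sweeping the tool rotation like this shows how a single ply orientation spends only part of each revolution inside the critical band, which is the intuition behind relating damage to that portion of the cutting angles.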

Keywords: unidirectional composites, milling, machining damage, delamination, carbon fiber reinforced plastics (CFRPs)

Procedia PDF Downloads 510
7017 Media Representation of Romanian Migrants in the Italian Media: A Comparative Study

Authors: Paula-Catalina Meirosu

Abstract:

Economic (intra-EU) migration is a topic of debate in the public space in both countries of origin and countries of destination. Since the 1990s, after the collapse of the communist regimes and the subsequent accession of some former communist countries to the EU, migratory flows (including of Romanian migrants) to EU countries have increased constantly. Italy is one of the main destination countries for Romanians, currently hosting more than one million Romanian migrants. Based on an interdisciplinary analytical framework drawing on transnationalism, media and migration studies, and critical media analysis, this paper investigates the media construction of intra-EU economic migration in the Italian press from two main perspectives. The first is the media representation of Romanian migrants in the Italian press in a specific context: the 2014 EU elections. The second explores the way in which Romanian journalists use the media of destination countries (such as Italy) as a source when addressing the issue of migration. The paper focuses on online articles related to the representation of Romanian migrants, published during January-May 2014, before and during the EU elections, in two newspapers (La Repubblica from Italy and Adevarul from Romania). The methodology is social-constructivist and predominantly discursive, and includes elements of critical discourse analysis (CDA) to identify the patterns of representation of Romanian migrants in the Italian press as well as the strategies for constructing migrants' categories, identities, and roles. The aim of such an approach is to trace the dynamics of the media discourse on migration in a destination country in the light of a European electoral context and, based on the results, to propose scenarios for the elections to be held this year.

Keywords: migration, media discourse, Romanian migrants, transnationalism

Procedia PDF Downloads 114
7016 Primary Level Teachers’ Response to Gender Representation in Textbook Contents

Authors: Pragya Paneru

Abstract:

This paper explores ten primary teachers’ views on gender representation in primary-level textbooks. Data were collected from teachers who taught in private schools in the Kailali and Kathmandu districts. The research uses a semi-structured interview method to obtain information regarding teachers’ attitudes toward gender representations in textbook content. The interview data were analysed using the qualitative analysis methods suggested by Saldana and Omasta (2018). The findings revealed that most of the teachers were unaware of gender issues and regarded them as insignificant to discuss in primary-level classes. Most of them responded to the questions personally and claimed that there were no gender issues in their classrooms. Some of the teachers connected gender issues with contexts other than textbook representations, such as discrimination in the distribution of salary between male and female teachers, school practices of awarding girls rather than boys as the most disciplined students, following a girls-first rule in assembly marching, encouraging only girls in stage shows, and involving students in gender-specific activities such as decoration work for girls and physical tasks for boys. The interviews also revealed teachers’ covert gendered attitudes in their remarks. Nevertheless, most of the teachers accepted that gender-biased content has an impact on learners and that this problem can be addressed with more gender-centred research in education, discussions, and training to increase awareness of gender issues. Agreeing with the teachers' suggestion, this paper recommends proper training and awareness regarding how to confront gender issues in textbooks.

Keywords: content analysis, gender equality, school education, critical awareness

Procedia PDF Downloads 78
7015 The Healing Theatre: Beyond Alienation and Fixation Discourse of Three Theatrical Personalities in Bode Ojoniyi’s Dramaturgy

Authors: Oluwafemi Akinlawon Atoyebi

Abstract:

This paper examines alienation and fixation as critical issues of mental health (crisis, sickness, and healing) through ‘Bode Ojoniyi’s dramaturgy. Two of his dramatic memoirs, arguably written to address a life-threatening crisis between him and his employer and to externalize his psychological crisis, are critically analysed. This is done by reading the three theatrical phenomena of the actor, the character, and the audience against how he plays with the concepts of alienation and fixation within the totality of his dramaturgy, beyond a mere academic exercise. The paper situates his apt understanding of their representations as a reflective force of a consciousness that defies psychosomatic existential conflicts. It adopts a qualitative method of analysis through a critical reading of the two dramatic memoirs, a survey of the audiences that experienced the performances of the memoirs, and an interview with Ojoniyi. Using Jean-Paul Sartre’s theory of existential consciousness, the study finds that the three phenomena of the actor, the character, and the audience find expression in Ojoniyi as an existential, omniscient playwright-actor-character-audience able to transcend the parochialism of an alienated and fixated self; that beyond its limiting artistic purview, the theatre as a stage is capable of capturing the totality of a man's experiences in his world; and that, oftentimes, the depressed are victims of a myopic syndrome, unable to see or reflect on their realities beyond the self and the play of a casual order. The study concludes that the therapeutic effect of Ojoniyi’s dramatic memoirs, in their reading or performance, is needed by all and should be explored in proffering treatment for psychosomatic patients, for it promises to be useful beyond its confines in the arts.

Keywords: alienation, fixation, the healing theatre, theatrical personalities

Procedia PDF Downloads 122
7014 A Trends Analysis of Yacht Simulator

Authors: Jae-Neung Lee, Keun-Chang Kwak

Abstract:

This paper describes an analysis of international trends in yacht simulators and provides background on yachts. Examples of image processing used in yacht simulators include counting the total number of vehicles, edge/target detection, detection and evasion algorithms, matching using SIFT (scale-invariant feature transform), and the application of median filtering and thresholding.
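Two of the listed image-processing steps, median filtering and thresholding, admit a minimal NumPy sketch (a naive implementation for illustration; a real simulator would use an optimized library):

```python
import numpy as np

# Minimal sketch of two image-processing steps mentioned above:
# a 3x3 median filter followed by global thresholding.

def median_filter3(img):
    """Naive 3x3 median filter with edge padding."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out

def threshold(img, t):
    return (img >= t).astype(np.uint8)  # 1 = foreground (e.g., a target)

noisy = np.array([[10, 10, 200],
                  [10, 10, 10],
                  [10, 10, 10]])
smoothed = median_filter3(noisy)  # the lone bright pixel is suppressed
mask = threshold(smoothed, 128)   # nothing left above the threshold
```

Here the median filter removes the isolated bright pixel before thresholding, which is exactly why the two steps are usually applied in this order for target detection.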

Keywords: yacht simulator, simulator, trends analysis, SIFT

Procedia PDF Downloads 417
7013 Contractors Perspective on Causes of Delays in Power Transmission Projects

Authors: Goutom K. Pall

Abstract:

At the very heart of the power system, power transmission (PT) acts as an essential link between power generation and distribution. Timely completion of PT infrastructure is therefore crucial to support the development of the power system as a whole. Yet despite this importance, studies on PT infrastructure development projects are embryonic, and PT projects undergo widespread delays worldwide. These delay factors are idiosyncratic, and identifying the critical ones is essential if PT industry professionals are to complete their projects efficiently and within the expected timeframes. This study identifies and categorizes 46 causes of PT project delay under ten major groups, drawing on six sector experts’ recommendations gathered through a preliminary questionnaire survey. Based on the experts’ strong recommendations, two new groups were introduced in the final questionnaire survey: sector-specific factors (SSF) and general factors (GF). SSF pertain to delay factors applicable only to PT projects, while GF represent less biased samples with responsibilities shared by all parties involved in a project. The study then uses 112 data samples from contractors to rank the delay factors using the relative importance index (RII). The results reveal that SSF, GF, and external factors are the most critical groups, while the highest-ranked delay factors include right-of-way (RoW) problems of transmission lines (TL), delays in payments, frequent changes in TL routes, poor communication and coordination among the project parties, and accessibility of TL tower locations. Finally, recommendations are made to minimize the identified delays. The findings are expected to be of substantial benefit to professionals in minimizing time overruns in PT project implementation, as well as in power generation, power distribution, and non-power linear construction projects worldwide.
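The relative importance index used for the ranking is commonly computed as RII = ΣW / (A·N), where W are the respondents' Likert-scale ratings, A is the highest possible rating, and N is the number of respondents; the ratings below are invented for illustration:

```python
# Relative importance index as commonly defined in construction-delay
# studies: RII = sum(W) / (A * N). Ratings below are made-up examples.

def rii(ratings, highest=5):
    """RII on a 1..highest Likert scale; closer to 1 = more important."""
    return sum(ratings) / (highest * len(ratings))

responses = {
    "right-of-way problems": [5, 5, 4, 5, 4],
    "delay in payments":     [4, 5, 4, 4, 4],
    "route changes":         [3, 4, 4, 3, 4],
}
ranked = sorted(responses, key=lambda f: rii(responses[f]), reverse=True)
print(ranked[0])  # the factor with the highest RII
```

Sorting factors by RII in this way is how the 46 delay causes are ordered from 112 contractor responses.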

Keywords: delay, project delay, power transmission projects, time-overruns

Procedia PDF Downloads 170
7012 Effects of Nitrogen Addition on Litter Decomposition and Nutrient Release in a Temperate Grassland in Northern China

Authors: Lili Yang, Jirui Gong, Qinpu Luo, Min Liu, Bo Yang, Zihe Zhang

Abstract:

Anthropogenic activities have increased nitrogen (N) inputs to grassland ecosystems. Knowledge of the impact of N addition on litter decomposition is critical to understanding ecosystem carbon cycling and its responses to global climate change. The aim of this study was to investigate the effects of N addition and litter type on litter decomposition in a semi-arid temperate grassland during the growing and non-growing seasons in Inner Mongolia, northern China, and to identify the relation between litter decomposition and C:N:P stoichiometry in the litter-soil continuum. Six levels of N addition were applied: CK, N1 (0 g N m⁻² yr⁻¹), N2 (2 g N m⁻² yr⁻¹), N3 (5 g N m⁻² yr⁻¹), N4 (10 g N m⁻² yr⁻¹), and N5 (25 g N m⁻² yr⁻¹). Litter decomposition rates and nutrient release differed greatly among N addition gradients and litter types. N addition promoted litter decomposition of S. grandis but had no significant influence on L. chinensis litter, indicating that S. grandis litter decomposition was more sensitive to N addition than that of L. chinensis. The critical threshold of N addition for promoting mixed-litter decomposition was 10-25 g N m⁻² yr⁻¹. N addition altered the balance of C:N:P stoichiometry between litter, soil, and microbial biomass. During decomposition, the L. chinensis litter N:P was higher in the N2-N4 plots than in CK, while the S. grandis litter C:N was lower in the N3 and N4 plots, indicating that litter N or P content does not satisfy microbial decomposers as N addition increases. As a result, S. grandis litter exhibited net N immobilization, while L. chinensis litter exhibited net P immobilization. Mixed-litter C:N:P stoichiometry satisfied the demand of microbial decomposers and showed net mineralization during decomposition. With increasing N deposition in the future, mixed litter would potentially promote C and nutrient cycling in the grassland ecosystem by increasing litter decomposition and nutrient release.
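Although the abstract does not state the fitting procedure, litter-bag decomposition rates of this kind are commonly summarized with Olson's single-exponential model, X(t) = X₀e⁻ᵏᵗ; a minimal sketch with invented numbers:

```python
import math

# Olson's single-exponential decay model, the standard summary statistic in
# litter-bag studies (not necessarily the one used here). Numbers invented.

def decay_constant(mass_fraction_remaining, years):
    """Solve X/X0 = exp(-k * t) for the decay constant k (per year)."""
    return -math.log(mass_fraction_remaining) / years

k = decay_constant(0.60, 1.0)   # e.g., 60% of litter mass left after 1 year
half_life = math.log(2) / k     # years for half the litter to decompose
```

A higher k under N addition would correspond to the faster S. grandis decomposition reported above.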

Keywords: C:N:P stoichiometry, litter decomposition, nitrogen addition, nutrient release

Procedia PDF Downloads 471
7011 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals

Authors: Christine F. Boos, Fernando M. Azevedo

Abstract:

Electroencephalogram (EEG) is a record of the electrical activity of the brain with many applications, such as monitoring alertness, coma, and brain death; locating damaged areas of the brain after head injury, stroke, or tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or a group of diseases of high prevalence, still poorly explained by science and whose diagnosis is still predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of the diagnosis. This analysis can also be used to help define the type of epileptic syndrome, determine the epileptogenic zone, assist in the planning of drug treatment, and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis uses long-term EEG recordings, at least 24 hours long and acquired by a minimum of 24 electrodes, in which neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Since an EEG screen usually displays 10 seconds of the recording, a neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long-term recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time-consuming, complex, and exhaustive task. Because of this, over the years several studies have proposed automated methodologies to facilitate the neurophysiologists’ task of identifying epileptiform discharges, and a large number of them used neural networks for the pattern classification.
One of the differences among these methodologies is the type of input stimuli presented to the networks, i.e., how the EEG signal is introduced into the network. Five types of input stimuli are commonly found in the literature: the raw EEG signal, morphological descriptors (i.e., parameters related to the signal's morphology), the Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms, and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks implemented with each of them. Efficiency with the raw signal varied between 43 and 84%. The results for the FFT spectrum and STFT spectrograms were quite similar, with average efficiencies of 73 and 77%, respectively. The efficiency of Wavelet Transform features varied between 57 and 81%, while the morphological descriptors yielded efficiency values between 62 and 93%. After the simulations, we observed that the best results were achieved when either morphological descriptors or Wavelet features were used as input stimuli.
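Two of the five input-stimulus types can be sketched in a few lines of NumPy; the sampling rate and window content below are assumptions for illustration, not the study's actual acquisition settings:

```python
import numpy as np

# Sketch of two input-stimulus types compared above: the raw window itself,
# and its FFT magnitude spectrum. The signal is synthetic.

fs = 200                          # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)     # one 1-second EEG window
window = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

raw_stimulus = window                                  # type 1: raw signal
spectrum = np.abs(np.fft.rfft(window)) / len(window)   # type 3: FFT spectrum
freqs = np.fft.rfftfreq(len(window), 1 / fs)
dominant = freqs[np.argmax(spectrum)]                  # strongest component, Hz
```

Feeding the network `spectrum` rather than `window` trades temporal detail for an explicit frequency description, which is exactly the kind of representational difference the study measures.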

Keywords: artificial neural network, electroencephalogram signal, pattern recognition, signal processing

Procedia PDF Downloads 513
7010 Real Estate Trend Prediction with Artificial Intelligence Techniques

Authors: Sophia Liang Zhou

Abstract:

For investors, businesses, consumers, and governments, an accurate assessment of future housing prices is crucial to critical decisions in resource allocation, policy formation, and investment strategies. Previous studies are contradictory about the macroeconomic determinants of housing prices and largely focused on one or two areas using point prediction. This study aims to develop data-driven models to accurately predict future housing market trends in different markets. It examines five metropolitan areas representing different market trends and compares three time-lag settings: no lag, a 6-month lag, and a 12-month lag. Linear regression (LR), random forest (RF), and artificial neural networks (ANN) were employed to model real estate prices using datasets comprising the S&P/Case-Shiller home price index and 12 demographic and macroeconomic features, such as gross domestic product (GDP), resident population, and personal income, in five metropolitan areas: Boston, Dallas, New York, Chicago, and San Francisco. The data, from March 2005 to December 2018, were collected from the Federal Reserve Bank, the FBI, and Freddie Mac. In the original data, some factors are monthly, some quarterly, and some yearly; thus, two methods of compensating for missing values, backfill and interpolation, were compared. The models were evaluated by accuracy, mean absolute error, and root mean square error. The LR and ANN models outperformed the RF model due to RF's inherent limitations. Both the ANN and LR methods generated predictive models with high accuracy (>95%). Personal income, GDP, population, and measures of debt consistently appeared as the most important factors. The technique used to compensate for missing values and the implementation of a time lag can significantly influence model performance and require further investigation.
The best-performing models varied by area, but the backfilled 12-month-lag LR models and the interpolated no-lag ANN models showed the most stable performance overall, with accuracies above 95% for each city. This study reveals the influence of input variables in different markets and provides evidence to support future studies identifying the optimal time lag and data-imputation methods for establishing accurate predictive models.
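The two missing-value strategies compared in the study can be sketched with pandas on a made-up quarterly indicator expanded to a monthly index:

```python
import pandas as pd

# A quarterly macroeconomic indicator sampled into a monthly index leaves
# gaps; the study compares two fills. Numbers are invented.

monthly = pd.Series(
    [100.0, None, None, 106.0, None, None, 112.0],
    index=pd.period_range("2018-01", periods=7, freq="M"),
)

backfilled = monthly.bfill()          # carry the next known value backwards
interpolated = monthly.interpolate()  # linear fill between known values
```

Backfill makes each month jump to the next reported quarter, while interpolation assumes a smooth monthly trend; as the abstract notes, that choice measurably affects model performance.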

Keywords: linear regression, random forest, artificial neural network, real estate price prediction

Procedia PDF Downloads 91
7009 Reduced Lung Volume: A Possible Cause of Stuttering

Authors: Shantanu Arya, Sachin Sakhuja, Gunjan Mehta, Sanjay Munjal

Abstract:

Stuttering may be defined as a speech disorder affecting the fluency domain of speech, characterized by covert features such as word substitution, omission, and circumlocution, and overt features such as prolongation of sounds and syllables, blocks, etc. Many etiologies have been postulated to explain stuttering based on various experiments and research. Breathlessness has also been reported by many individuals who stutter, for which breathing exercises are generally advised; however, no studies have objectively evaluated pulmonary capacity or the efficacy of these exercises. The pulmonary function test (PFT), which evaluates parameters such as forced vital capacity, peak expiratory flow rate, and forced expiratory flow rate, can be used to study the pulmonary behavior of individuals who stutter. The study aimed: a) to identify speech motor and physiologic behaviours associated with stuttering by administering PFTs, and b) to recognize possible reasons for an association between speech motor behaviour and stuttering severity. PFTs were administered to individuals who reported signs and symptoms of stuttering and showed abnormal scores on the Stuttering Severity Index. Parameters including forced vital capacity, forced expiratory volume, peak expiratory flow rate (L/min), and forced expiratory flow rate (L/min) were evaluated and correlated with Stuttering Severity Index scores. Results showed a significant decrease in these parameters (lower-than-normal scores) in individuals with established stuttering. A strong correlation was also found between the degree of stuttering and the degree of decrease in pulmonary volumes. It is thus evident that fluent speech requires strong support from lung pressure and requisite volumes. Further research demonstrating the efficacy of abdominal breathing exercises in this regard is needed.
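The reported severity-volume relationship amounts to a correlation analysis; the sketch below uses invented numbers solely to illustrate the computation, not the study's data:

```python
import numpy as np

# Illustrative check of the reported association: Pearson correlation
# between stuttering-severity scores and forced vital capacity. The
# numbers are invented to mimic a negative relationship.

severity = np.array([12, 18, 24, 30, 36])   # severity scores
fvc = np.array([4.8, 4.4, 4.1, 3.6, 3.2])   # forced vital capacity, litres

r = np.corrcoef(severity, fvc)[0, 1]        # strongly negative here
```

A strongly negative r of this kind is what "strong correlation between degree of stuttering and degree of decrease in pulmonary volumes" corresponds to numerically.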

Keywords: forced expiratory flow rate, forced expiratory volume, forced vital capacity, peak expiratory flow rate, stuttering

Procedia PDF Downloads 254
7008 Bio-Hub Ecosystems: Investment Risk Analysis Using Monte Carlo Techno-Economic Analysis

Authors: Kimberly Samaha

Abstract:

To attract new types of investors into the emerging bio-economy, new methodologies for analyzing investment risk are needed. The Bio-Hub Ecosystem model was developed to address a critical concern within the global energy market regarding the use of biomass as a feedstock for power plants. This study looked at repurposing existing biomass-energy plants into circular, zero-waste Bio-Hub Ecosystems. The Bio-Hub model first targets a ‘whole-tree’ approach and then examines the circular economics of co-hosting diverse industries (wood processing, aquaculture, agriculture) in the vicinity of biomass power plant facilities. The study modeled the economics and risk strategies of cradle-to-cradle linkages to incorporate value-chain effects on capital and operational expenditures and on investment-risk reduction, using a proprietary techno-economic model that incorporates investment-risk scenarios via the Monte Carlo methodology. The study calculated the sequential increases in profitability for each additional co-host at an operating forestry-based biomass energy plant in West Enfield, Maine. Phase I starts with a baseline of forestry biomass to electricity only and is built up in stages to include a greenhouse and a land-based shrimp farm as co-hosts; it incorporates the CO2 and waste-heat streams from the operating power plant in an analysis of lowering and stabilizing the operating costs of the agriculture and aquaculture co-hosts. The Phase II analysis incorporates a jet-fuel biorefinery and its secondary slip-stream of biochar, which would be developed into two additional bio-products: 1) a soil-amendment compost for agriculture and 2) a biochar effluent filter for the aquaculture. The second part of the study applied the Monte Carlo risk methodology to illustrate how co-location de-risks investment in an integrated Bio-Hub versus individual investments in stand-alone energy, agriculture, or aquaculture projects.
The analyzed scenarios compared reductions in both capital and operating expenditures, which stabilize profits and reduce the investment risk associated with projects in energy, agriculture, and aquaculture. This techno-economic modeling using the Monte Carlo technique resulted in the masterplan for the first Bio-Hub, to be built in West Enfield, Maine. In 2018, the site was designated an economic opportunity zone under a federal program, which allows capital-gains tax benefits for investments on the site. Bioenergy facilities are currently at a critical juncture: they can either be repurposed into efficient, profitable, and socially responsible investments, or be idled and scrapped. The Bio-Hub Ecosystems techno-economic analysis model can expedite new standards for investment in circular, zero-waste projects. Profitable projects will expedite adoption and advance the critical transition from the current ‘take-make-dispose’ paradigm inherent in the energy, forestry, and food industries to a more sustainable bio-economy paradigm that supports local and rural communities.
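The de-risking argument can be shown with a toy Monte Carlo sketch (the proprietary techno-economic model is not reproduced here, and all figures are invented): independent co-host revenue streams lower the combined coefficient of variation relative to any stand-alone stream:

```python
import numpy as np

# Toy Monte Carlo sketch of portfolio de-risking through co-location.
# Three independent profit streams (energy, greenhouse, shrimp farm) with
# invented means/spreads; the combined hub has a lower relative spread.

rng = np.random.default_rng(42)
n = 50_000
energy = rng.normal(10.0, 3.0, n)      # annual profit, hypothetical units
greenhouse = rng.normal(4.0, 1.5, n)
shrimp = rng.normal(6.0, 2.5, n)

def cv(x):
    """Coefficient of variation: relative risk per unit of expected profit."""
    return x.std() / x.mean()

hub = energy + greenhouse + shrimp     # co-located Bio-Hub portfolio
stand_alone = min(cv(energy), cv(greenhouse), cv(shrimp))
print(cv(hub) < stand_alone)           # diversification lowers relative risk
```

Because the streams are imperfectly correlated, their sum has a smaller coefficient of variation than any single project, which is the statistical core of the co-location de-risking claim.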

Keywords: bio-economy, investment risk, circular design, economic modelling

Procedia PDF Downloads 93
7007 Influence of Confinement on Phase Behavior in Unconventional Gas Condensate Reservoirs

Authors: Szymon Kuczynski

Abstract:

Poland is characterized by numerous sedimentary basins and hydrocarbon provinces. Since 2006, hydrocarbon exploration in Poland has gradually become more focused on new unconventional targets, particularly the shale gas potential of the Upper Ordovician and Lower Silurian in the Baltic-Podlasie-Lublin Basin. The first forecast, prepared by the US Energy Information Administration in 2011, indicated 5.3 Tcm of natural gas. In 2012, the Polish Geological Institute presented its own forecast, which estimated maximum reserves at 1.92 Tcm. The difference between the estimates was caused by problems in calculating the initial amount of adsorbed, as well as free, gas trapped in shale rocks (GIIP, gas initially in place). This value depends on sorption capacity, gas saturation, and the mutual interactions between gas, water, and rock. Determining the reservoir type in the initial exploration phase brings essential knowledge that bears on production decisions, and studying the impact of porosity on the phase-envelope shift eliminates errors and improves production profitability. Confinement affects flow characteristics, fluid properties, and phase equilibrium. The thermodynamic behavior of confined fluids in porous media is a basic consideration for industrial applications such as hydrocarbon production; in particular, knowledge of the phase equilibrium and the critical properties of the contained fluid is essential for the design and optimization of such processes. In pores with a small diameter (nanopores), such as occur in shale formations, the interaction of the wall with the fluid particles becomes significant: the nanopore size is similar to the diameter of the fluid particles, so the region in which particles flow without interacting with the pore wall is almost equal in extent to the region where this interaction occurs. Molecular simulation studies have shown an effect of confinement on the pseudo-critical properties.
The critical pressure and temperature and the flow characteristics of hydrocarbons at the nanoscale are therefore strongly influenced by the interaction of fluid particles with the pore wall. It can be concluded that the size of individual pores is crucial at the nanoscale, where this effect arises. Nanoporosity makes it difficult to predict the flow of reservoir fluid, and research is being conducted to explain the mechanisms of fluid flow in nanopores and of gas extraction from porous media by desorption.
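The critical-property shift described above is often estimated with a Zarragoicoechea-Kuz-style correlation; the coefficients below are the ones commonly quoted in shale-confinement studies and should be verified against the original source before use:

```python
# Hedged illustration of confinement-shifted critical temperature using a
# Zarragoicoechea-Kuz style correlation: dTc/Tc = 0.9409*x - 0.2415*x**2,
# where x = sigma / r_pore (Lennard-Jones diameter over pore radius).
# Coefficients as commonly quoted; verify against the original paper.

def confined_tc(tc_bulk_K, sigma_nm, pore_radius_nm):
    x = sigma_nm / pore_radius_nm
    shift = 0.9409 * x - 0.2415 * x ** 2   # relative depression of Tc
    return tc_bulk_K * (1.0 - shift)

tc_bulk = 190.6                              # methane bulk Tc, K
tc_5nm = confined_tc(tc_bulk, 0.38, 5.0)     # 5 nm pore: clear depression
tc_50nm = confined_tc(tc_bulk, 0.38, 50.0)   # 50 nm pore: nearly bulk
```

The shrinking shift with growing pore radius reflects the point made above: the wall interaction matters only when the pore size approaches the molecular diameter.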

Keywords: adsorption, capillary condensation, phase envelope, nanopores, unconventional natural gas

Procedia PDF Downloads 324
7006 Technology Road Mapping in the Fourth Industrial Revolution: A Comprehensive Analysis and Strategic Framework

Authors: Abdul Rahman Hamdan

Abstract:

The Fourth Industrial Revolution (4IR) has brought unprecedented technological advancements that have disrupted many industries worldwide. In keeping up with the technological advances and rapid disruption by the introduction of many technological advancements brought forth by the 4IR, the use of technology road mapping has emerged as one of the critical tools for organizations to leverage. Technology road mapping can be used by many companies to guide them to become more adaptable and anticipate future transformation and innovation, and avoid being redundant or irrelevant due to the rapid changes in technological advancement. This research paper provides a comprehensive analysis of technology road mapping within the context of the 4IR. The objectives of the paper are to provide companies with practical insights and a strategic framework of technology road mapping for them to navigate the fast-changing nature of the 4IR. This study also contributes to the understanding and practice of technology road mapping in the 4IR and, at the same time, provides organizations with the necessary tools and critical insight to navigate the 4IR transformation by leveraging technology road mapping. Based on the literature review and case studies, the study analyses key principles, methodologies, and best practices in technology road mapping and integrates them with the unique characteristics and challenges of the 4IR. The research paper gives the background of the fourth industrial revolution. It explores the disruptive potential of technologies in the 4IR and the critical need for technology road mapping that consists of strategic planning and foresight to remain competitive and relevant in the 4IR era. It also highlights the importance of technology road mapping as an organisation’s proactive approach to align the organisation’s objectives and resources to their technology and product development in meeting the fast-evolving technological 4IR landscape. 
The paper also covers the theoretical foundations of technology road mapping, examines various methodological approaches, and identifies the participants in the process, such as external experts, stakeholders, collaborative platforms, and cross-functional teams, that ensure an integrated and robust technology roadmap for the organisation. Moreover, this study presents a comprehensive framework for technology road mapping in the 4IR by incorporating key elements and processes such as technology assessment, competitive intelligence, risk analysis, and resource allocation. It provides a framework for implementing technology road mapping from strategic planning, goal setting, and technology scanning to roadmap visualisation, implementation planning, monitoring, and evaluation. In addition, the study addresses the challenges and limitations of technology road mapping in the 4IR, including gap analysis. In conclusion, the study proposes a set of practical recommendations for organizations that intend to leverage technology road mapping as a strategic tool in the 4IR to drive innovation and remain competitive in the current and future ecosystem.

Keywords: technology management, technology road mapping, technology transfer, technology planning

Procedia PDF Downloads 54
7005 Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education

Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue

Abstract:

In higher education institutions, the most pressing priority is to improve student performance and retention. Educational Data Mining techniques use large volumes of student data to uncover hidden information about students' learning behavior, particularly early symptoms of at-risk students. On the other hand, data with noise, outliers, and irrelevant information may lead to incorrect conclusions. This paper aims to develop a reliable student performance prediction model for higher education institutions by identifying the features of student data with the potential to improve prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters. Data was gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian state university. The cases of 4413 students were used in this study. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. Hyperparameters of the ensemble learning systems are fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of student behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
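The majority-vote ensemble described above can be sketched with scikit-learn's VotingClassifier. The synthetic dataset, base learners, and hyperparameters below are placeholders for illustration, not the study's actual student data or tuned settings.

```python
# Hedged sketch of a hard majority-vote ensemble; the data and settings
# are illustrative stand-ins, not the study's student records.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Placeholder data standing in for preprocessed student features.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Hard voting: each base learner casts one vote per sample and the
# majority class wins, as in the paper's Majority Vote ensemble.
ensemble = VotingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(max_depth=5, random_state=0)),
        ("svm", SVC(kernel="rbf", random_state=0)),
        ("ann", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                              random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ],
    voting="hard",
)
ensemble.fit(X_tr, y_tr)
print(f"Held-out accuracy: {ensemble.score(X_te, y_te):.3f}")
```

In practice, each base learner's hyperparameters would be tuned (e.g., via grid search) before being combined, as the abstract notes.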

Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education

Procedia PDF Downloads 93
7004 Improving Security by Using Secure Servers Communicating via Internet with Standalone Secure Software

Authors: Carlos Gonzalez

Abstract:

This paper describes the use of the Internet to enhance the security of software that is to be distributed or sold to users potentially all over the world. By placing some of the features of the protected software on a secure server, we increase the security of that software. The communication between the protected software and the secure server is done by a double lock algorithm. The paper also includes an analysis of intruders and describes possible responses to detect threats.
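The abstract does not specify the double lock algorithm itself. As a hedged illustration only, the sketch below shows one generic way two parties holding a shared secret can each verify the other (an HMAC-based mutual challenge-response) before a protected feature is unlocked; the names and flow are invented, not taken from the paper.

```python
# Illustrative mutual challenge-response; NOT the paper's actual
# "double lock algorithm", whose details are unpublished here.
import hashlib
import hmac
import secrets

SHARED_KEY = secrets.token_bytes(32)  # provisioned to both sides out of band

def respond(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Prove knowledge of the shared key without revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Lock 1: the secure server challenges the client software.
server_challenge = secrets.token_bytes(16)
client_proof = respond(server_challenge)
assert hmac.compare_digest(client_proof, respond(server_challenge))

# Lock 2: the client challenges the server, so an impostor server
# cannot unlock the protected features either.
client_challenge = secrets.token_bytes(16)
server_proof = respond(client_challenge)
assert hmac.compare_digest(server_proof, respond(client_challenge))

print("both locks verified; protected feature may be enabled")
```

Requiring both directions to verify is what gives the scheme its "double" character in this sketch; a real deployment would additionally need replay protection and transport security.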

Keywords: internet, secure software, threats, cryptography process

Procedia PDF Downloads 310
7003 Characterization and Degradation of 3D Printed Polycaprolactone-Freeze Dried Bone Matrix Constructs for Use in Critical Sized Bone Defects

Authors: Samantha Meyr, Eman Mirdamadi, Martha Wang, Tao Lowe, Ryan Smith, Quinn Burke

Abstract:

Treatment of critical-sized bone defects (CSDs) remains a major clinical orthopedic challenge. CSDs are uniquely contoured regions of diseased or damaged bone and can be defined as defects that will not heal spontaneously and require surgical intervention. Autografts are the current gold standard CSD treatment: they are histocompatible and provoke a minimal immunogenic response, but they can cause donor site morbidity and often cannot supply the size required for replacement. As an alternative to traditional surgical methods, this work applies bone tissue engineering via 3D printing. Freeze-dried bone matrix (FDBM) is an available graft material, but it functions as desired only in the presence of bone growth factors. Polycaprolactone (PCL) is a known biodegradable material with good biocompatibility that has proven manageable in 3D printing as a medical device. A 3D-extrusion printing strategy is introduced to print these materials into scaffolds for bone grafting, which could be more accessible and rapid than the current standard. Mechanical, thermal, cytotoxic, and physical properties were investigated throughout a degradation period of 6 months using fibroblasts and dental pulp stem cells. PCL-FDBM scaffolds were successfully printed with high print fidelity in their respective pore sizes and allograft content. Additionally, we have created a method for evaluating PCL using differential scanning calorimetry (DSC) and have evaluated PCL degradation over roughly 6 months.

Keywords: 3D printing, bone tissue engineering, cytotoxicity, degradation, scaffolds

Procedia PDF Downloads 86
7002 Urban Sprawl Analysis in the City of Thiruvananthapuram and a Framework Formulation to Combat it

Authors: Sandeep J. Kumar

Abstract:

Urbanisation is considered the primary driver of land use and land cover change and is directly linked to population and economic growth. In India, as in other developing countries, cities are urbanising at an alarming rate. This unprecedented and uncontrolled urbanisation can result in urban sprawl, which is widely recognised as a product of poor planning, inadequate policies, and poor governance. Urban sprawl may be seen as a threat to the development of sustainable cities, so it is essential to manage it. Planning for predicted future growth is critical to avoid the negative effects of urban growth at the local and regional levels. Thiruvananthapuram, the capital city of Kerala, is a city of economic success, challenges, and opportunities, and its urbanisation trends have paved the way for urban sprawl. This study aims to formulate a framework to combat the emerging urban sprawl in the city of Thiruvananthapuram. The first step was to quantify trends of urban growth in Thiruvananthapuram city using Geographical Information System (GIS) and remote sensing techniques; the technique and results obtained are extremely valuable in analysing land use changes. Secondly, these changes in trend were analysed through critical factors that helped the study understand the underlying issues of the existing city structure that have resulted in urban sprawl. Anticipating development trends can modify the current order, and this can be productively resolved using regional and municipal planning and management strategies. Hence, efficient strategies to curb sprawl in Thiruvananthapuram city have been formulated in this study and can be considered recommendations for future planning.

Keywords: urbanisation, urban sprawl, geographical information system (GIS), Thiruvananthapuram

Procedia PDF Downloads 99
7001 Modeling Visual Memorability Assessment with Autoencoders Reveals Characteristics of Memorable Images

Authors: Elham Bagheri, Yalda Mohsenzadeh

Abstract:

Image memorability refers to the phenomenon where certain images are more likely to be remembered by humans than others. It is a quantifiable and intrinsic attribute of an image. Understanding how visual perception and memory interact is important in both cognitive science and artificial intelligence. It reveals the complex processes that support human cognition and helps to improve machine learning algorithms by mimicking the brain's efficient data processing and storage mechanisms. To explore the computational underpinnings of image memorability, this study examines the relationship between an image's reconstruction error, distinctiveness in latent space, and its memorability score. A trained autoencoder is used to replicate human-like memorability assessment inspired by the visual memory game employed in memorability estimations. This study leverages a VGG-based autoencoder that is pre-trained on the vast ImageNet dataset, enabling it to recognize patterns and features that are common to a wide and diverse range of images. An empirical analysis is conducted using the MemCat dataset, which includes 10,000 images from five broad categories: animals, sports, food, landscapes, and vehicles, along with their corresponding memorability scores. The memorability score assigned to each image represents the probability of that image being remembered by participants after a single exposure. The autoencoder is finetuned for one epoch with a batch size of one, attempting to create a scenario similar to human memorability experiments where memorability is quantified by the likelihood of an image being remembered after being seen only once. The reconstruction error, which is quantified as the difference between the original and reconstructed images, serves as a measure of how well the autoencoder has learned to represent the data. 
The reconstruction error of each image, the error reduction, and the image's distinctiveness in latent space are calculated and correlated with the memorability score. Distinctiveness is measured as the Euclidean distance between each image's latent representation and its nearest neighbor within the autoencoder's latent space. Different structural and perceptual loss functions are considered to quantify the reconstruction error. The results indicate a strong correlation between both the reconstruction error and the distinctiveness of images and their memorability scores. This suggests that images with more distinctive features, which challenge the autoencoder's compressive capacities, are inherently more memorable. There is also a negative correlation between memorability and the reduction in reconstruction error achieved relative to the autoencoder pre-trained on ImageNet, which suggests that highly memorable images are harder to reconstruct, probably because they have features that are more difficult for the autoencoder to learn. These insights suggest a new pathway for evaluating image memorability, which could potentially impact industries reliant on visual content and mark a step forward in merging the fields of artificial intelligence and cognitive science. The current research opens avenues for utilizing neural representations as instruments for understanding and predicting visual memory.
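The two image-level measures the study correlates with memorability can be sketched as follows. Random arrays stand in for the real images, autoencoder reconstructions, latent codes, and MemCat memorability scores; only the computations (per-image reconstruction error, nearest-neighbor distinctiveness, and their correlation with memorability) follow the abstract.

```python
# Sketch of reconstruction error and latent-space distinctiveness;
# all arrays are random placeholders, not MemCat data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
images = rng.random((n, 64, 64, 3))
recons = images + 0.1 * rng.standard_normal(images.shape)  # fake AE output
latents = rng.standard_normal((n, 128))                    # fake latent codes
memorability = rng.random(n)                               # fake scores

# Reconstruction error: mean squared difference between input and output.
recon_error = ((images - recons) ** 2).mean(axis=(1, 2, 3))

# Distinctiveness: Euclidean distance to the nearest other latent code.
dists = np.linalg.norm(latents[:, None, :] - latents[None, :, :], axis=-1)
np.fill_diagonal(dists, np.inf)
distinctiveness = dists.min(axis=1)

# Pearson correlation of each measure with the memorability scores.
r_err = np.corrcoef(recon_error, memorability)[0, 1]
r_dist = np.corrcoef(distinctiveness, memorability)[0, 1]
print(f"r(error, mem) = {r_err:.3f}, r(distinct, mem) = {r_dist:.3f}")
```

With real data the study reports positive correlations for both measures; with the random placeholders here the correlations are of course near zero.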

Keywords: autoencoder, computational vision, image memorability, image reconstruction, memory retention, reconstruction error, visual perception

Procedia PDF Downloads 61
7000 Criticality of Adiabatic Length for a Single Branch Pulsating Heat Pipe

Authors: Utsav Bhardwaj, Shyama Prasad Das

Abstract:

To meet the extensive thermal management requirements of circuit card assemblies (CCAs), satellites, PCBs, microprocessors, and other electronic circuitry, pulsating heat pipes (PHPs) have emerged in the recent past as one of the best technical solutions. However, the industrial application of PHPs is still largely unexplored due to their poor reliability. Several system and operational parameters not only affect the performance of an operating PHP but also decide whether the PHP can operate sustainably at all; functioning may be completely halted for particular combinations of their values. Among the system parameters, adiabatic length is one of the most important. In the present work, the simplest single-branch PHP system with an adiabatic section has been considered, assumed to contain only one vapour bubble and one liquid plug. First, the system has been mathematically modeled using a film evaporation/condensation model, followed by recognition of the equilibrium zone, non-dimensionalization, and linearization. Then, proceeding with a periodic solution of the linearized and reduced differential equations, a stability analysis has been performed. Slow and fast variables have been identified, and an averaging approach has been used for the slow ones. Ultimately, the temporal evolution of the PHP is predicted by numerically solving the averaged equations to determine whether the oscillations sustain or decay over time. The stability threshold has also been determined in terms of non-dimensional numbers formed by different groupings of system and operational parameters. A combined analytical and numerical approach has been used, and it has been found that for each combination of all other parameters, there exists a maximum length of the adiabatic section beyond which the PHP cannot function at all. This length has been called the "Critical Adiabatic Length (L_ac)".
For adiabatic lengths greater than L_ac, oscillations are always found to decay sooner or later. The dependence of L_ac on other parameters has also been examined and correlated at certain evaporator and condenser section temperatures. L_ac has been found to increase linearly with the evaporator section length (L_e), whereas the condenser section length (L_c) has almost no effect on it up to a certain limit. At considerably large condenser section lengths, however, L_ac is expected to decrease with increasing L_c due to increased wall friction. A rise in the static pressure (p_r) exerted by the working fluid reservoir makes L_ac rise exponentially, whereas L_ac increases cubically with the inner diameter (d) of the PHP. The physics underlying each of these variations is also discussed. Thus, a methodology for quantifying the critical adiabatic length for any possible set of the other PHP parameters has been established.
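The averaged equations themselves are not given in the abstract. Purely as an illustration of the sustain/decay criterion, the sketch below integrates a generic linearized oscillator whose effective damping grows with the adiabatic length and changes sign at a critical value; every coefficient and the damping law are invented for illustration and are not the paper's model.

```python
# Toy stand-in for the stability behavior described above: oscillations
# of the liquid plug grow below a critical adiabatic length L_ac and
# decay above it. All numbers here are invented, not the paper's.
def plug_oscillation(L_a, L_ac=0.10, omega=30.0, t_end=5.0, dt=1e-4):
    """Integrate x'' + c(L_a) x' + omega^2 x = 0 and return final |x|."""
    c = 2.0 * omega * (L_a - L_ac) / L_ac  # damping sign flips at L_ac
    x, v = 1.0, 0.0
    for _ in range(int(t_end / dt)):       # semi-implicit Euler steps
        v += (-c * v - omega**2 * x) * dt
        x += v * dt
    return abs(x)

# Below the critical length the amplitude persists or grows;
# beyond it the oscillation dies out.
print(plug_oscillation(L_a=0.05) > plug_oscillation(L_a=0.20))
```

The point of the toy model is only the qualitative threshold: crossing L_ac flips the sign of the effective damping, which is the kind of boundary the paper's averaged equations locate for real PHP parameters.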

Keywords: critical adiabatic length, evaporation/condensation, pulsating heat pipe (PHP), thermal management

Procedia PDF Downloads 213
6999 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features

Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh

Abstract:

In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Because the electrocardiogram (ECG) signal is relatively simple to record, it is a good tool for showing the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for researchers seeking the best method for distinguishing normal signals from abnormal ones. The data cover both genders, recording times vary from several seconds to several minutes, and all records are labeled normal or abnormal. Due to the limited accuracy and duration of the ECG signal, and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate types of heart failure from one another is of great interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. A new idea was then presented: in addition to using the statistical characteristics of the signal, a return map was created and nonlinear characteristics of the HRV signal were extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features were used to classify normal signals from abnormal ones.
To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUC of the MLP neural network and the SVM was 0.893 and 0.947, respectively. The results of the proposed algorithm also indicated that greater use of nonlinear characteristics in classifying normal and patient signals yielded better performance. Today, research aims to quantitatively analyze the linear and nonlinear, or deterministic and random, nature of the heart rate variability signal, because the magnitude of these properties has been shown to indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system over the short and long term provides new information on how the cardiovascular system functions and has driven further research in this field. The ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but given its limited temporal accuracy and the fact that some of its information is hidden from the viewpoint of physicians, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy and can serve as a complementary system in treatment centers.
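The HRV feature pipeline described above (R-R intervals feeding both statistical and return-map-based nonlinear descriptors) can be sketched as follows. The R-peak times here are synthetic, standing in for Pan-Tompkins output, and the specific feature names (SDNN, RMSSD, Poincaré SD1/SD2) follow common HRV practice rather than the paper's exact feature set.

```python
# Hedged sketch: linear and nonlinear HRV features from R-R intervals.
# Synthetic R-peak times replace real Pan-Tompkins detections.
import numpy as np

rng = np.random.default_rng(0)
r_peaks = np.cumsum(0.8 + 0.05 * rng.standard_normal(300))  # seconds

rr = np.diff(r_peaks) * 1000.0  # R-R intervals in milliseconds

# Linear (statistical) features.
sdnn = rr.std(ddof=1)                       # overall variability
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # beat-to-beat variability

# Nonlinear features from the return (Poincare) map of RR[n] vs RR[n+1].
x, y = rr[:-1], rr[1:]
sd1 = np.sqrt(np.var((y - x) / np.sqrt(2), ddof=1))  # short-term spread
sd2 = np.sqrt(np.var((y + x) / np.sqrt(2), ddof=1))  # long-term spread

features = np.array([sdnn, rmssd, sd1, sd2])
print(features.round(1))
```

A feature vector like this would then be fed to the MLP or SVM classifiers whose AUCs the abstract reports.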

Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve

Procedia PDF Downloads 245
6998 Lithuanian Sign Language Literature: Metaphors at the Phonological Level

Authors: Anželika Teresė

Abstract:

In order to address issues in sign language linguistics, help maintain high-quality sign language (SL) translation, contribute to dispelling misconceptions about SL and deaf people, and raise awareness and understanding of the deaf community's heritage, this presentation discusses literature in Lithuanian Sign Language (LSL) and the metaphors created within it through the phonological parameters: handshape, location, movement, palm orientation, and nonmanual features. The study covered in this presentation is twofold, involving both a micro-level analysis of metaphors in terms of phonological parameters as sub-lexical features and a macro-level analysis of the poetic context. Cognitive theories underlie research on metaphors in sign language literature across a range of SLs, and this study follows that practice. The presentation covers a qualitative analysis of 34 pieces of LSL literature. The analysis employs the ELAN software widely used in SL research. The target is to examine how specific types of each phonological parameter are used to create metaphors in LSL literature and what metaphors are created. The results show that LSL literature employs a range of metaphors created by using classifier signs and by modifying established signs. The study also reveals that LSL literature tends to create reference metaphors indicating status and power. As the study shows, LSL poets metaphorically encode status by encoding another meaning in the same sign, which results in double metaphors. A metaphor of identity has also been determined; notably, the poetic context reveals that this metaphor can also be read as a metaphor for life. The study further notes that deaf poets create metaphors related to the significance of various phenomena to the lyrical subject. Notably, the study has detected locations, nonmanual features, and other parameters never mentioned in previous SL research as being used for the creation of metaphors.

Keywords: Lithuanian sign language, sign language literature, sign language metaphor, metaphor at the phonological level, cognitive linguistics

Procedia PDF Downloads 121
6997 Dido: An Automatic Code Generation and Optimization Framework for Stencil Computations on Distributed Memory Architectures

Authors: Mariem Saied, Jens Gustedt, Gilles Muller

Abstract:

We present Dido, a source-to-source auto-generation and optimization framework for multi-dimensional stencil computations. It enables a large programmer community to easily and safely implement stencil codes on distributed-memory parallel architectures with Ordered Read-Write Locks (ORWL) as an execution and communication back-end. ORWL provides inter-task synchronization for data-oriented parallel and distributed computations. It has been proven to guarantee equity, liveness, and efficiency for a wide range of applications, particularly for iterative computations. Dido consists mainly of an implicitly parallel domain-specific language (DSL) implemented as a source-level transformer. It captures domain semantics at a high level of abstraction and generates parallel stencil code that leverages all ORWL features. The generated code is well-structured and lends itself to different possible optimizations. In this paper, we enhance Dido to handle both Jacobi and Gauss-Seidel grid traversals. We integrate temporal blocking into the Dido code generator in order to reduce the communication overhead and minimize data transfers. To increase data locality and improve intra-node data reuse, we couple the code generation technique with the polyhedral parallelizer Pluto. The accuracy and portability of the generated code are guaranteed thanks to a parametrized solution. The combination of ORWL features, the code generation pattern, and the suggested optimizations makes Dido a powerful code generation framework for stencil computations in general, and for distributed-memory architectures in particular. We present a wide range of experiments over a number of stencil benchmarks.
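For readers unfamiliar with the computation pattern Dido targets, a minimal 2D Jacobi stencil is shown below in plain NumPy. This illustrates only the stencil itself (each cell updated from its neighbors' previous-iteration values), not Dido's DSL, ORWL back-end, or distributed execution.

```python
# Minimal 2D Jacobi stencil, the class of computation Dido generates
# distributed code for; shared-memory NumPy version for illustration.
import numpy as np

def jacobi_step(grid):
    """One Jacobi sweep: each interior cell becomes the mean of its
    four neighbors, read entirely from the previous iteration."""
    new = grid.copy()
    new[1:-1, 1:-1] = 0.25 * (grid[:-2, 1:-1] + grid[2:, 1:-1] +
                              grid[1:-1, :-2] + grid[1:-1, 2:])
    return new

grid = np.zeros((64, 64))
grid[0, :] = 100.0              # fixed hot boundary row
for _ in range(500):
    grid = jacobi_step(grid)
print(f"interior mean after 500 sweeps: {grid[1:-1, 1:-1].mean():.2f}")
```

A Gauss-Seidel traversal would instead update cells in place, reusing values already computed in the current sweep; supporting both orders is exactly the enhancement the abstract describes.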

Keywords: stencil computations, ordered read-write locks, domain-specific language, polyhedral model, experiments

Procedia PDF Downloads 114
6996 Toward the Destigmatizing the Autism Label: Conceptualizing Celebratory Technologies

Authors: LouAnne Boyd

Abstract:

From the perspective of self-advocates, the biggest unaddressed problem is not the symptoms of an autism spectrum diagnosis but the social stigma that accompanies autism. This societal perspective contrasts with the focus of the majority of interventions. Autism interventions, and consequently most innovative technologies for autism, aim to improve deficits that occur within the person. For example, the most common Human-Computer Interaction (HCI) research projects in assistive technology for autism target social skills from a normative perspective. The premise of these autism technologies is that difficulties occur inside the body; hence, the medical model focuses on ways to improve the ailment within the person. However, other technological approaches to supporting people with autism do exist. Within HCI, there are other modes of research that critique the medical model. For example, critical design, whose intended audience is industry or other HCI researchers, produces artifacts that are the opposite of interventionist work in order to highlight the misalignment between the lived experience and the societal perception of autism. Parodies of interventionist work exist to provoke change, such as a recent project called Facesavr, a face covering that helps allistic adults be more independent in their emotional processing. Additionally, from a critical disability studies perspective, assistive technologies perpetuate harmful normalizing behaviors. These critical approaches, however, can feel far from the front line in terms of taking direct action to positively impact end users.
From a critical yet more pragmatic perspective, a project such as Counterventions lists ways to reduce the likelihood of perpetuating ableism in interventionist work by reflectively analyzing a series of evolving assistive technology projects through a societal lens, thus leveraging the momentum of the evolving ecology of technologies for autism. Still, all current paradigms fall short of addressing the largest need: the negative impact of social stigma. The current work introduces a new paradigm for technologies for autism, borrowing from a paradigm introduced two decades ago around changing the narrative related to eating disorders: the shift from reprimanding poor habits to celebrating positive aspects of eating. This work repurposes Celebratory Technology for neurodiversity and intends to reduce social stigma by targeting the public at large. This presentation will review how requirements were derived from current research on autism social stigma as well as from design sessions with autistic adults. Congruence between these two sources revealed three key design implications for technology: provide awareness of the autistic experience; generate acceptance of neurodivergence; and cultivate an appreciation for the talents and accomplishments of neurodivergent people. The current pilot work in Celebratory Technology offers a new paradigm for supporting autism by shifting the burden of change from the person with autism to society's biases at large. Shifting the focus of research outside of the autistic body creates a new space for design that extends beyond the bodies of a few and calls on all to embrace humanity as a whole.

Keywords: neurodiversity, social stigma, accessibility, inclusion, celebratory technology

Procedia PDF Downloads 54
6995 A Systematic Analysis of Knowledge Development Trends in Industrial Maintenance Projects

Authors: Lilian Ogechi Iheukwumere-Esotu, Akilu Yunusa-Kaltungo, Paul Chan

Abstract:

Industrial assets are prone to degradation and eventual failure due to the repetitive loads and harsh environments in which they operate. These failures often lead to costly downtimes, which may involve the loss of critical assets and/or human lives. Rising pressure from stakeholders for optimized system outputs has placed further strain on business organizations. Traditional means of combating such failures are strategies capable of predicting, controlling, and/or reducing the likelihood of system failures. Turnarounds, shutdowns, and outages (TSOs) are popular maintenance management projects conducted over a defined period of time. However, despite the critical and significant cost implications of TSOs, the management of the interface of knowledge between academia and industry has, to the best of our knowledge, not been fully explored in comparison to other aspects of industrial operations. This is perhaps one of the reasons for the limited knowledge transfer between academia and industry, which has affected the outcomes of most TSOs. Until now, the study of knowledge development trends as a failure analysis tool in the management of TSO projects has not received the required level of attention; hence, this review provides useful references and their implications for future studies in this field. This study aims to harmonize the existing research trends of TSOs through a systematic review of more than 3,000 research articles published over seven decades (1940 to date), which were extracted using very specific research criteria and later streamlined using nominated inclusion and exclusion parameters. The information obtained from the analysis was then synthesized and coded into 8 parameters, allowing a transformation into actionable outputs.
The study revealed a variety of information, but the most critical findings can be classified into four groups: (1) empirical validation of available conceptual frameworks and models remains rare in practice; (2) traditional project management views for managing uncertainties are still dominant; (3) approaches towards the adoption and promotion of knowledge management systems that support the creation, transfer, and application of knowledge within and outside the project organization are inconsistent; and (4) the exploration of social practices in industrial maintenance project environments is under-represented within the existing body of knowledge. Thus, the intention of this study is to demonstrate the usefulness of a framework that incorporates findings from careful analysis and illustrations of evidence-based results as a suitable approach to tackling recurring failures in industrial maintenance projects.

Keywords: industrial maintenance, knowledge management, maintenance projects, systematic review, TSOs

Procedia PDF Downloads 106
6994 A Theoretical Study on Pain Assessment through Human Facial Expression

Authors: Mrinal Kanti Bhowmik, Debanjana Debnath Jr., Debotosh Bhattacharjee

Abstract:

Facial expressions are an unmistakable part of human behavior. They are a significant channel for human communication and can be used to extract emotional features accurately. People in pain often show variations in facial expression that are readily observable to others: a core set of facial actions is likely to occur, or to increase in intensity, when people are in pain. To describe such changes in facial appearance, the Facial Action Coding System (FACS) was pioneered by Ekman and Friesen for human observers. According to Prkachin and Solomon, a small set of these actions carries the bulk of the information about pain, and on this basis the Prkachin and Solomon Pain Intensity (PSPI) metric is defined. It is therefore important to note that facial expressions, as a behavioral source in communication media, provide an important opening into the issues of non-verbal communication of pain. People express their pain in many ways, and this pain behavior is the basis on which most inferences about pain are drawn in clinical and research settings. Hence, to understand the roles of different pain behaviors, it is essential to study their properties. For the past several years, studies have concentrated on the properties of one specific form of pain behavior, i.e., facial expression. This paper presents a comprehensive study of pain assessment systems that can model and estimate the intensity of the pain a patient is suffering. It also reviews the historical background of different pain assessment techniques in the context of painful expressions. Different approaches incorporate FACS from a psychological viewpoint together with a pain intensity score using the PSPI metric. The paper provides an in-depth analysis of the different approaches used in pain estimation and presents the observations found with each technique. It also offers a brief study of the features distinguishing real from fake pain. The necessity of the study lies in the emerging field of painful-face assessment in clinical settings.
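The PSPI metric mentioned above combines a small set of pain-related FACS action-unit (AU) intensities. The sketch below follows the standard Prkachin-Solomon definition: AU intensities range from 0 to 5, AU43 (eye closure) is binary, and the score ranges from 0 to 16. The example frame's values are hypothetical.

```python
# Prkachin & Solomon Pain Intensity from FACS action-unit intensities.
def pspi(au4, au6, au7, au9, au10, au43):
    """PSPI = brow lowering (AU4) + orbital tightening (max of AU6, AU7)
    + levator contraction (max of AU9, AU10) + eye closure (AU43)."""
    return au4 + max(au6, au7) + max(au9, au10) + au43

# A hypothetical frame: strong brow lowering and cheek raise, eyes open.
print(pspi(au4=3, au6=4, au7=2, au9=1, au10=0, au43=0))  # -> 8
```

In the automated systems this paper surveys, the AU intensities would be estimated per video frame by a vision model and then aggregated by this formula.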

Keywords: facial action coding system (FACS), pain, pain behavior, Prkachin and Solomon pain intensity (PSPI)

Procedia PDF Downloads 328
6993 Critical Discourse Analysis Approach to the Post-Feminist Representations in Ecommerce Livestreamings of Lipsticks

Authors: Haiyan Huang, Jan Blommaert, Ellen Van Praet

Abstract:

The embrace of a neoliberal economic system in China has opened the domestic Chinese market to global commodity capitalism and ushered in a post-feminism closely associated with Western consumerist culture. Chinese women are instilled with, and thus hold, the belief that they can empower themselves and express their individualism through consumption. To unravel the consumerist ideologies embedded in Li’s discursive practices, we rely on critical discourse analysis (CDA) as our research framework. The data analyses suggest that cosmopolitanism and class are two recurring themes when Li seeks to persuade his female audience into consumerist behavior. Through hints and cues such as “going on business trips”, “traveling abroad”, and “international brands”, Li offers his audience access to, and the possibility of imagining, a cosmopolitan, middle-class identity. Such yearning for Western culture and a global-citizen identity also implies an aspiration to a well-off socioeconomic status, showing that post-feminism in China not only embodies Western consumerism but also reflects the struggle for class mobility. These defining elements of choice and freedom are well situated in contemporary Chinese society, where women enjoy more educational and economic independence than before. However, a closer examination reveals conflicts between the hegemonic discourse of post-feminism and the status quo. First, propagating women’s power through consumption obscures the entrenched gender inequality in China: struggles over employment discrimination, equal pay, and the right to education, the cornerstones of feminism, have not been fought out in China, leaving historical gender issues unresolved. Second, the lengthy broadcasts (which normally last more than two hours) featuring deep discounts on products beg the question of who the real audience of ecommerce livestreaming is.
While seemingly addressed to young, well-off Chinese women, Li’s discursive practice may in fact target young but not wealthy girls who aspire to mimic the lifestyle of middle-class women. By selling the idea of empowerment and identity construction through consuming beauty products (e.g., lipsticks), capitalists endeavor to create a post-feminist illusion and stoke anxieties among Chinese women. Through in-depth analyses of hegemonic discourse in ecommerce livestreamings of lipsticks, the paper contributes to a better understanding of post-feminism in contemporary China and illustrates the problems Chinese women face in securing power and equality.

Keywords: Chinese women, critical discourse analysis, ecommerce livestreaming, post-feminism

Procedia PDF Downloads 115
6992 Phytomining for Rare Earth Elements: A Comparative Life Cycle Assessment

Authors: Mohsen Rabbani, Trista McLaughlin, Ehsan Vahidi

Abstract:

The remediation of sites polluted with heavy metals, including rare earth elements (REEs), has been a primary concern of researchers seeking to decontaminate soil. Among the methods developed to address this concern, phytoremediation has been established as an efficient, cost-effective, easy-to-use, and environmentally friendly approach that provides a long-term solution to this global problem. The technology also has great potential in the metals-production sector, since metals buried in soil can be recovered via metal cropping. Given the significant metal concentrations found in hyperaccumulators, extracting metals from harvested plant matter has been proposed as a sub-economic area called phytomining. As a more recent, advanced technology for removing such pollutants from soil while producing critical metals, bioharvesting (phytomining/agromining) is considered another promising way to produce metals and meet global demand for critical/target metals. The bio-ore obtained from phytomining can be safely disposed of or fed into metal-production pathways to recover in-demand metals such as REEs. It is well known that some hyperaccumulators, e.g., the fern Dicranopteris linearis, can absorb REEs from polluted soils and accumulate them in plant organs such as leaves and stems. After soil remediation, the plants can be harvested and introduced to downstream steps, namely crushing/grinding, leaching, and purification, to extract REEs from the plant matter. This novel interdisciplinary field can bridge the gap between agriculture, mining, metallurgy, and the environment. Despite the advantages of agromining for the REE production industry, key issues related to the environmental sustainability of the entire life cycle of this new concept have not yet been assessed.
Hence, a comparative life cycle assessment (LCA) study was conducted to quantify the environmental footprint of REE phytomining. The study estimates the environmental effects associated with phytomining across impact categories such as climate change, land use, and ozone depletion. The results reveal that phytomining is an easy-to-use and environmentally sustainable approach either to eliminate REEs from polluted sites or to produce REEs, offering a new source for the production of such metals. This LCA research provides guidelines for researchers working to build a reliable relationship between agriculture, mining, metallurgy, and the environment in order to counter soil pollution and keep the earth green and clean.

Keywords: phytoremediation, phytomining, life cycle assessment, environmental impacts, rare earth elements, hyperaccumulator

Procedia PDF Downloads 54
6991 Patient Engagement in Healthcare and Health Literacy in China: A Survey in China

Authors: Qing Wu, Xuchun Ye, Qiuchen Wang, Kirsten Corazzini

Abstract:

Objective: It is increasingly acknowledged that patient engagement in healthcare and health literacy both have a positive impact on patient outcomes. Health literacy emphasizes the ability of individuals to understand and apply health information and to manage their health. Patients' health literacy affects their willingness to participate in decision-making, but its impact on the behavior and willingness of patient engagement in healthcare is not clear, especially in China. Therefore, this study aimed to explore the correlation between the behavior and willingness of patient engagement and health literacy. Methods: A cross-sectional survey was conducted using a behavior-and-willingness-of-patient-engagement-in-healthcare questionnaire and the Chinese version of the All Aspects of Health Literacy Scale (AAHLS). A convenience sample of 443 patients was recruited from 8 general hospitals in Shanghai, Jiangsu Province, and Zhejiang Province from September 2016 to January 2017. Results: The mean score for willingness was 4.41 ± 0.45, and the mean score for patient engagement behavior was 4.17 ± 0.49; the mean score for health literacy was 2.36 ± 0.29, and the mean scores of its three dimensions, functional literacy, communicative/interactive literacy, and critical literacy, were 2.26 ± 0.38, 2.28 ± 0.42, and 2.61 ± 0.43, respectively. Patients' health literacy was positively correlated with their willingness to engage (r = 0.367, P < 0.01) and with their engagement behavior (r = 0.357, P < 0.01). All dimensions of health literacy were positively correlated with the behavior and willingness of patient engagement in healthcare; the communicative/interactive literacy dimension (r = 0.312, P < 0.01; r = 0.357, P < 0.01) and the critical literacy dimension (r = 0.357, P < 0.01; r = 0.357, P < 0.01) were more strongly related to behavior and willingness than the basic/functional literacy dimension (r = 0.150, P < 0.01; r = 0.150, P < 0.01).
Conclusions: The behavior and willingness of patient engagement in healthcare are positively correlated with health literacy and its dimensions. In clinical work, medical staff should pay attention to patients' health literacy, particularly the risk that low literacy leads to low participation, and should provide health information through health education or communication to improve patients' health literacy and to guide them to participate actively and rationally in their own healthcare.
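The r values reported in the results above are sample Pearson correlation coefficients. A minimal sketch of how such a coefficient is computed is shown below; the data are illustrative only and are not taken from the study.

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two equal-length
    score lists, the statistic reported as r in the abstract."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Perfectly linearly related scores give r = 1.0; reversed scores give -1.0.
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 3))  # 1.0
print(round(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]), 3))  # -1.0
```

Values such as r = 0.357 thus indicate a moderate positive linear relationship; in practice the significance levels (P < 0.01) quoted in the results would be obtained from the coefficient together with the sample size (n = 443).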

Keywords: patient engagement, health literacy, healthcare, correlation

Procedia PDF Downloads 150