Search results for: formal approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14293


11293 Analyzing the Impact of Local and International Artists in Creating Cultural Identity through Public Art: Case Study of Chicago Public Policies

Authors: Kaesha M. Freyaldenhoven

Abstract:

Chicago is a city in the United States whose cultural identity is largely shaped by public art. Quintessential public works created by internationally renowned artists, such as Anish Kapoor's Cloud Gate in Millennium Park and 'The Picasso' in Daley Plaza, have historically contributed to developing a shared sense of community. In 2017, the city implemented a policy titled the 50x50 Neighborhood Arts Project under the Chicago Public Art Plan. The policy promotes investment in contemporary public art to elevate neighborhood cultural assets and create a sense of place. Exclusively community-based artists were commissioned to accomplish the mission of the policy; administrators felt only local artists would be capable of capturing the true essence of a neighborhood through art. This paper discusses the relationship between public art and the culture of its respective neighborhood through close examination of formal aesthetic properties and social significance. The research compares the role of international artists with that of local artists in cultivating the identity of a city through site-specific artworks in Chicago. The methodology unites theoretical research on understanding art and its function in public space with empirical research on Chicago-based works. Theoretical frameworks provide an art-historical foundation for exploring how physical properties convey meaning through the work itself and its placement in an urban setting. Empirical research examining policy documentation and press announcements released by the Department of Cultural Affairs and Special Events investigates the selection processes for artists and neighborhoods. Ethnographies and interviews with individuals from diverse social segments of contemporary Chicago society measure the impacts of the works on their respective populations.
Findings demonstrate that works created by local artists activate neighborhoods and instill a sense of pride among community residents. Works created by international artists garner widespread media attention that frames the city's cultural identity across temporal and geographic boundaries. This research can inform future cultural policies pertaining to the commissioning of public art.

Keywords: Chicago, cultural policy, public art, urban art

Procedia PDF Downloads 122
11292 Investigation on Ultrahigh Heat Flux of Nanoporous Membrane Evaporation Using Dimensionless Lattice Boltzmann Method

Authors: W. H. Zheng, J. Li, F. J. Hong

Abstract:

Thin-liquid-film evaporation in ultrathin nanoporous membranes, which reduce viscous resistance while maintaining high capillary pressure and efficient liquid delivery, is a promising thermal-management approach for cooling high-power electronic devices. Given the challenges and technical limitations of experimental studies, including accurate interface temperature sensing, complex manufacturing processes, and the short lifetime of the membranes, a dimensionless lattice Boltzmann method capable of reproducing the thermophysical properties of the working fluid is derived. The evaporation of R134a into its pure-vapour ambient is numerically simulated in nanoporous membranes with a pore diameter of 80 nm, a thickness of 472 nm, and three porosities of 0.25, 0.33, and 0.5. The numerical results indicate that the highest heat transfer coefficient is about 1740 kW/m²·K and the highest heat flux is about 1.49 kW/cm², with a wall superheat of only 8.59 K in the case of porosity 0.5. The dissipated heat flux scales with porosity because of the increasing effective evaporative area. The self-regulation of the shape and curvature of the meniscus under different operating conditions is also observed. This work offers a promising approach to forecasting membrane performance for different geometries and working fluids.
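As a quick sanity check, the three figures quoted above should satisfy h = q''/ΔT. A minimal Python check (values taken directly from the abstract; rounding in the reported figures explains the small residual):

```python
# Sanity check on the reported figures for the porosity = 0.5 case:
# the heat transfer coefficient should equal heat flux / wall superheat.
q_flux_kw_per_cm2 = 1.49            # reported peak heat flux, kW/cm^2
superheat_k = 8.59                  # reported wall superheat, K

q_flux_kw_per_m2 = q_flux_kw_per_cm2 * 1e4    # 1 m^2 = 10^4 cm^2
h_kw_per_m2k = q_flux_kw_per_m2 / superheat_k

# About 1735 kW/m^2-K, matching the reported ~1740 kW/m^2-K to rounding.
print(f"h = {h_kw_per_m2k:.0f} kW/m^2-K")
```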

Keywords: high heat flux, ultrathin nanoporous membrane, thin film evaporation, lattice Boltzmann method

Procedia PDF Downloads 157
11291 Safer Staff: A Survey of Staff Experiences of Violence and Aggression at Work in Coventry and Warwickshire Partnership National Health Service Trust

Authors: Rupinder Kaler, Faith Ndebele, Nadia Saleem, Hafsa Sheikh

Abstract:

Background: Workplace violence and aggression often seem to be treated as an acceptable occupational hazard for staff in mental health services. The literature indicates that healthcare workers in mental health settings are at higher risk of aggression from patients. Aggressive behaviours pose a physical and psychological threat to psychiatric staff and can result in stress, burnout, sickness, and exhaustion. Further evidence shows that health professionals are among the most exposed to psychological disorders such as anxiety, depression, and post-traumatic stress disorder. The fear that results from working in a dangerous environment, together with exhaustion, can damage patient care and the healthcare relationship. Aim: The aim of this study is to investigate the prevalence and impact of aggressive behaviour on staff working at Coventry and Warwickshire Partnership Trust (CWPT). Methodology: The study was a manual, anonymised, multi-disciplinary cross-sectional survey questionnaire administered to all clinical and non-clinical staff at CWPT, covering both inpatient and community settings. Findings: Unsurprisingly, aggressive behaviours were more prevalent among inpatient staff than among community staff. Conclusion: There is a high rate of verbal and physical aggression at work, and this has a negative impact on staff emotional and physical well-being. Staff also rely more on informal support from colleagues than on formal organisational support systems. Recommendations: A workforce that is well and functioning is an organisation's biggest resource. Staff safety during working hours is everyone's responsibility, sitting with both individual staff members and the organisation. The authors recommend the development of preventative, practical protocols for aggression, with patient and carer involvement. Post-incident organisational support needs to be consolidated, and hands-on, timely support offered to help maintain emotionally well staff at CWPT.

Keywords: safer staff, survey of staff experiences, violence and aggression, mental health

Procedia PDF Downloads 200
11290 Going beyond Stakeholder Participation

Authors: Florian Engel

Abstract:

Only through a radical shift to an intrinsically motivated project team, by giving employees the freedom of autonomy, mastery, and purpose, does it become possible to develop excellent products. With these changes, combined with a rapid application development approach, the group of users serves as an important indicator for testing market needs, rather than merely as stakeholders for requirements.

Keywords: intrinsic motivation, requirements elicitation, self-directed work, stakeholder participation

Procedia PDF Downloads 334
11289 Statistical Pattern Recognition for Biotechnological Process Characterization Based on High Resolution Mass Spectrometry

Authors: S. Fröhlich, M. Herold, M. Allmer

Abstract:

Early-stage quantitative analysis of host cell protein (HCP) variation is challenging yet necessary for comprehensive bioprocess development. High-resolution mass spectrometry (HRMS) provides a high-end technology for accurate identification alongside quantitative information. Here we describe a flexible HRMS assay platform to quantify HCPs relevant in microbial expression systems such as E. coli, in both upstream and downstream development, by means of multivariate data analysis (MVDA) tools. Cell pellets were lysed and proteins extracted; purified samples were not further treated before applying the SMART tryptic digest kit. Peptide separation was optimized on an RP-UHPLC platform. HRMS-MS/MS analysis was conducted on an Orbitrap Velos Elite applying CID. Quantification was performed label-free, taking into account ionization properties and physicochemical peptide similarities. Results were analyzed using SIEVE 2.0 (Thermo Fisher Scientific) and SIMCA (Umetrics AG). The developed HRMS platform was applied to an E. coli expression set with varying productivity and the corresponding downstream process. Selected HCPs were successfully quantified within the fmol range. Analyzing HCP networks based on pattern analysis facilitated low-level quantification and enhanced validity. This approach is highly relevant for high-throughput screening experiments during upstream development, e.g., for titer determination, dynamic HCP network analysis, or product characterization. For the downstream purification process, physicochemical clustering of identified HCPs is relevant for adjusting buffer conditions accordingly. Overall, the technology provides an innovative approach to label-free MS-based quantification relying on statistical pattern analysis and comparison.
Absolute quantification based on physicochemical properties and peptide similarity scores requires no sophisticated sample preparation strategies and proved straightforward, sensitive, and highly reproducible for product characterization.

Keywords: process analytical technology, mass spectrometry, process characterization, MVDA, pattern recognition

Procedia PDF Downloads 244
11288 Other End of the Leash: The Volunteer Handlers Perspective of Animal-Assisted Interventions

Authors: Julie A. Carberry, Victor Maddalena

Abstract:

Animal-Assisted Interventions (AAIs) have existed in various forms for centuries, and in the past 30 years there has been a dramatic increase in their popularity. AAIs are now part of the lives of persons of all ages in many types of institutions. Anecdotal evidence of the benefits of AAIs has led to widespread adoption, yet a solid research base for support remains lacking. The research question was: what are the lived experiences of AAI volunteer handlers? An interpretive phenomenological methodology was used for this qualitative study. Data were collected from one- to two-hour semi-structured interviews and one observational field visit. All interviews were conducted, transcribed, and coded for themes by the principal investigator. Participants must have been active St. John Ambulance Therapy Dog Program volunteers for at least one year. In total, 14 volunteer handlers, along with some of their dogs, were included. St. John Ambulance is a not-for-profit organization that provides training and community services to Canadians. The Therapy Dog Program is one of its four nationally recognized core community service programs. The program incorporates dogs into the otherwise traditional therapeutic intervention of friendly visitation with clients. The lack of formal objectives and goals, and of a trained therapist, defines the program as an Animal-Assisted Activity (AAA), which is a type of AAI. Since the animals incorporated are dogs, the program is specifically a Canine-Assisted Activity (CAA), a type of Canine-Assisted Intervention (CAI).
Six themes emerged from the analysis of the data: (a) a win-win-win situation for all parties involved (volunteer handlers, clients, and the dogs); (b) being on the other end of the leash: functions of the volunteer handler role; (c) the importance of socialization: from spreading smiles to creating meaningful connections; (d) the role of the dog: initiating interaction and providing comfort; (e) an opportunity to feel good and destress; and (f) altruism versus personal rewards. Other insights were found regarding the program, clients, and staff. Possible implications of this research include increased organizational recruitment and retention of volunteer handlers, as well as increased support for CAAs and other CAIs that incorporate teams of volunteer handlers and their dogs. This support could, in turn, strengthen the acceptance and broad implementation of AAIs as an alternative or complementary non-pharmaceutical therapeutic intervention.

Keywords: animal-assisted activity, animal-assisted intervention, canine-assisted activity, canine-assisted intervention, perspective, qualitative, volunteer handler

Procedia PDF Downloads 138
11287 Comparison of Number of Waves Surfed and Duration Using Global Positioning System and Inertial Sensors

Authors: João Madureira, Ricardo Lagido, Inês Sousa, Fraunhofer Portugal

Abstract:

Surfing is an increasingly popular sport, and its performance evaluation is often qualitative. This work aims to use a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for detecting wave rides and computing the number of waves ridden in a surfing session, the starting time of each wave, and its duration. The first approach computes velocity from the Global Positioning System (GPS) signal and finds the velocity thresholds that identify the start and end of each wave ride. The second approach adds information from the smartphone's Inertial Measurement Unit (IMU) to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated against similar metrics extracted from video recorded from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times, and durations. This paper shows that it is feasible to use smartphones to quantify performance metrics during surfing. In particular, the waves ridden and their durations can be accurately determined using the smartphone GPS and IMU.
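The velocity-threshold idea in the first approach can be sketched as a simple hysteresis detector over the GPS speed trace. This is an illustrative reconstruction, not the authors' implementation: the threshold values, the minimum-duration filter, and the synthetic speed trace are assumptions.

```python
import numpy as np

def detect_wave_rides(speed, t, start_thresh=2.0, end_thresh=1.0, min_duration=3.0):
    """Detect wave rides from a speed trace (m/s) sampled at times t (s).

    Hysteresis: a ride starts when speed rises above `start_thresh` and ends
    when it falls below `end_thresh`; rides shorter than `min_duration` are
    discarded. Threshold values here are illustrative, not the calibrated
    ones from the study.
    """
    rides, riding, start = [], False, 0.0
    for v, ti in zip(speed, t):
        if not riding and v > start_thresh:
            riding, start = True, float(ti)
        elif riding and v < end_thresh:
            riding = False
            duration = float(ti) - start
            if duration >= min_duration:
                rides.append((start, duration))  # (start time, duration)
    return rides

# Synthetic session sampled at 1 Hz: slow paddling with one 8 s ride at t = 20 s.
t = np.arange(0.0, 60.0, 1.0)
speed = np.where((t >= 20) & (t < 28), 5.0, 0.5)
print(detect_wave_rides(speed, t))  # one ride starting at t = 20 s
```

Fusing the IMU, as in the second approach, would add a confirmation signal (e.g. acceleration variance) to the same state machine rather than changing its structure.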

Keywords: inertial measurement unit (IMU), global positioning system (GPS), smartphone, surfing performance

Procedia PDF Downloads 397
11286 Phytoremediation of Coffee Processing Waste at Various Organic Material Concentrations Using the Kiambang Plant

Authors: Siti Aminatu Zuhria

Abstract:

Wet coffee processing can improve the quality of coffee, but it produces liquid waste that can pollute the environment. Much of this liquid waste results from the stripping (pulping) and washing of the coffee. This research treats the liquid waste from coffee stripping by phytoremediation using kiambang plants. The purpose of this study was to characterize the coffee liquid waste, to evaluate kiambang as a phytoremediation agent at various concentrations of the waste, and to determine the concentration at which the treated water most closely approaches water quality standards. The research was conducted in two stages: a preliminary study and a main study. The preliminary study aimed to determine the ability of kiambang to survive as a phytoremediation agent in well water, distilled water, and coffee liquid waste. In the main study, the wastewater was diluted to obtain a range of COD concentrations. The expected outcome is to establish the ability of kiambang plants to act as a phytoremediation agent in wastewater treatment at various waste concentrations, and to identify the most effective concentration for improving the wastewater toward quality standards.

Keywords: wet coffee processing, phytoremediation, Kiambang plant, variation concentration liquid waste

Procedia PDF Downloads 301
11285 The Reasons for Failure in Writing Essays: Teaching Writing as a Project-Based Enterprise

Authors: Ewa Toloczko

Abstract:

Studies show that developing writing skills throughout years of formal foreign language instruction does not necessarily result in rewarding accomplishments among learners, nor in an affirmative attitude toward written assignments. What causes this apparently widespread bias against writing might be the diminished relevance students attach to it, as opposed to the other productive skill, speaking; insufficient resources available for them to succeed; or the way writing is approached by instructors, that is, inapt teaching techniques that discourage rather than kindle learners' engagement. The assumption underlying this presentation is that psychological and psycholinguistic factors constitute a key dimension of every writing process and hence should be seriously considered in both material design and lesson planning. The author presents research in which writing tasks were conceived of as attitudinal rather than technical operations and consequently turned into meaningful, socially oriented events that students could relate to and take an active hand in. The instrument employed for this purpose, and to make writing more interactive, was the project format: a carefully devised series of tasks that involved students as human beings, not only as language learners. The projects rested on the premise that the presence of peers and the teacher in class could be drawn on in a supportive rather than evaluative mode. In fact, the research showed that collaborative work and constant meaning negotiation reinforced not only the bonds between learners but also the language form and structure of their output. Accordingly, the role of the teacher shifted from assessor to 'problem barometer', always ready to acknowledge the slightest improvements in students' language performance.
In this way, written verbal communication, which usually aims merely to demonstrate accuracy and coherent content for assessment, became part of an enterprise that emphasises its social aspect: the writer in a real-life setting. The sample projects show the spectrum of possibilities teachers have when exploring the domain of writing within the school curriculum. The ideas are easy to modify and adjust to all proficiency levels and ages; initially, however, they were designed for teenage and young adult learners of English as a foreign language in both European and Asian contexts.

Keywords: projects, psycholinguistic/ psychological dimension of writing, writing as a social enterprise, writing skills, written assignments

Procedia PDF Downloads 228
11284 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn

Abstract:

Feature selection and attribute reduction are crucial problems and widely used techniques in machine learning, data mining, and pattern recognition for overcoming the well-known Curse of Dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we employ the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) method, owing to its effectiveness in feature selection. For the search strategy, a modified binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers were employed to compare the proposed approach with several existing methods over twenty-two datasets from the UCI repository, including nine high-dimensional and large ones. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and classification accuracy.
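The overall shape of a binary SFLA search over feature subsets can be sketched as follows. This is a simplified, illustrative reconstruction, not the authors' algorithm: the memeplex "leap" is reduced to a random bit-copy step toward the memeplex best, and a toy separable score stands in for the fuzzy-rough dependency degree (FRDD), which would require the full fuzzy-rough machinery.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 10
RELEVANT = {0, 3, 7}  # hypothetical informative features for the toy score

def fitness(mask):
    """Toy stand-in for the FRDD-based subset score: rewards covering the
    relevant features while penalising larger subsets."""
    hits = sum(int(mask[i]) for i in RELEVANT)
    return hits / len(RELEVANT) - 0.05 * int(mask.sum()) / N_FEATURES

def binary_sfla(n_frogs=20, n_memeplexes=4, iters=30):
    frogs = rng.integers(0, 2, size=(n_frogs, N_FEATURES))  # binary subsets
    for _ in range(iters):
        # Rank the whole population, best first, then shuffle into memeplexes.
        frogs = frogs[np.argsort([-fitness(f) for f in frogs])]
        for m in range(n_memeplexes):
            idx = np.arange(m, n_frogs, n_memeplexes)   # round-robin grouping
            best, worst = idx[0], idx[-1]
            # Move the worst frog toward the memeplex best by copying each
            # bit from the best frog with probability 0.5 (binary "leap").
            copy = rng.random(N_FEATURES) < 0.5
            candidate = np.where(copy, frogs[best], frogs[worst])
            if fitness(candidate) > fitness(frogs[worst]):
                frogs[worst] = candidate
            else:
                # Censor step: replace the stuck frog with a random one.
                frogs[worst] = rng.integers(0, 2, size=N_FEATURES)
    return max(frogs, key=fitness)

best = binary_sfla()
print("selected features:", sorted(np.flatnonzero(best).tolist()))
```

In the paper's hybrid, `fitness` would evaluate the FRDD of the candidate subset on the training data, and the selected reduct would then be passed to the nine downstream classifiers.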

Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct

Procedia PDF Downloads 221
11283 Aspiring to Achieve a Fairer Society

Authors: Bintou Jobe

Abstract:

Background: The research is focused on the concept of equality, diversity, and inclusion (EDI) and the need to achieve equity by treating individuals according to their circumstances and needs. It is rooted in the UK Equality Act 2010, which emphasises equal opportunities for all individuals regardless of their background and social life. However, inequality persists in society, particularly for those from minority backgrounds who face discrimination. Research Aim: The aim of this research is to promote equality, diversity, and inclusion by encouraging the regeneration of minds and the eradication of stereotypes. The focus is on promoting good EDI practices in various settings, including schools, colleges, universities, and workplaces, to create environments where every individual feels a sense of belonging. Methodology: The research utilises a literature review approach to gather information on promoting inclusivity, diversity, and inclusion. NVivo software was used to analyse and synthesise the findings, identifying themes that support the research aim and objectives. Findings: The research highlights the significance of promoting EDI practices so that individuals receive the respect and dignity they deserve. It emphasises treating individuals based on their unique circumstances and needs rather than relying on stereotypes or generalisations, and it highlights the benefits of diversity and inclusion in enhancing innovation, creativity, and productivity across different settings. The theoretical importance of this research lies in raising awareness of the need to regenerate minds, challenge stereotypes, and promote equality, diversity, and inclusion; in doing so, it contributes to the subject area, although the methodology could be strengthened by incorporating primary research to complement the literature review. Question Addressed: This research addresses the question of how to promote inclusivity, diversity, and inclusion and reduce the prevalence of stereotypes and prejudice. Recommendations: Encourage individuals to adopt a more inclusive approach; provide managers with responsibility and training that helps them understand the importance of their roles in shaping workplace culture; and appoint an equality, diversity, and inclusion manager from a majority background at the senior level who can speak up for underrepresented groups and flag any issues that need addressing. Conclusion: The research emphasises the importance of promoting equality, diversity, and inclusion practices to create a fairer society. It highlights the need to challenge stereotypes, treat individuals according to their circumstances and needs, and promote a culture of respect and dignity.

Keywords: equality, fairer society, inclusion, diversity

Procedia PDF Downloads 44
11282 Automated Evaluation Approach for Time-Dependent Question Answering Pairs on Web Crawler Based Question Answering System

Authors: Shraddha Chaudhary, Raksha Agarwal, Niladri Chatterjee

Abstract:

This work demonstrates a web crawler-based, generalized, end-to-end open-domain Question Answering (QA) system. An efficient QA system requires a significant amount of domain knowledge to find an exact and correct answer, in the form of a number, a noun, a short phrase, or a brief piece of text, to the user's question. Analysis of the question, searching the relevant documents, and choosing an answer are the three important steps in a QA system. This work uses a web scraper (Beautiful Soup) to extract K documents from the web, where K can be calibrated as a trade-off between time and accuracy. This is followed by a passage-ranking process, using a model trained on the 500K queries of the MS MARCO dataset, to extract the most relevant text passage and shorten the lengthy documents. A QA system then extracts the answers from the shortened documents based on the query and returns the top three answers. For evaluation of such systems, accuracy is judged by the exact match between predicted answers and gold answers. However, automatic evaluation methods fail due to the linguistic ambiguities inherent in questions. Moreover, reference answers are often not exhaustive, or are out of date; hence, correct answers predicted by the system are often judged incorrect by the automated metrics. One such scenario arises with the Google Natural Questions (GNQ) dataset, collected and made available in 2016. Any such dataset proves inefficient with respect to questions that have time-varying answers. For illustration, consider the query 'Where will the next Olympics be held?'. The gold answer given in the GNQ dataset is 'Tokyo'. Since the dataset was collected in 2016, and the next Olympics after 2016 were the 2020 games held in Tokyo, that answer was correct at the time; but if the same question is asked in 2022, the answer is 'Paris, 2024'.
Consequently, any evaluation based on the GNQ dataset will be incorrect for such questions. Such erroneous predictions are usually sent to human evaluators for further validation, which is expensive and time-consuming. To address this, the present work proposes an automated approach for evaluating time-dependent question-answer pairs. In particular, it proposes a metric using the current timestamp along with the top-n predicted answers from a given QA system. To test the proposed approach, the GNQ dataset was used, and the system achieved an accuracy of 78% on a test set of 100 QA pairs, automatically extracted using an analysis-based approach from 10K QA pairs of the GNQ dataset. The results obtained are encouraging. The proposed technique may develop into a useful scheme for gathering precise, reliable, and specific information in real time. Our subsequent experiments will be directed toward establishing the efficacy of the system for a larger set of time-dependent QA pairs.
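The idea of a timestamp-aware metric can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' metric: the dated gold-answer timeline, its layout, and the example dates are hypothetical (the real GNQ dataset stores a single static answer, which is exactly the problem being addressed).

```python
from datetime import date

# Hypothetical gold-answer timeline for one time-dependent question.
# Each entry is (date the answer becomes valid, answer).
GOLD = {
    "where will the next olympics be held?": [
        (date(2016, 1, 1), "tokyo"),   # correct from collection time onward
        (date(2021, 8, 9), "paris"),   # correct once the Tokyo games closed
    ],
}

def time_aware_match(question, predicted, today):
    """Exact match against the gold answer that is current at `today`."""
    current = None
    for valid_from, answer in sorted(GOLD[question.lower()]):
        if valid_from <= today:
            current = answer   # keep the latest answer already valid
    return predicted.lower().strip() == current

q = "Where will the next Olympics be held?"
print(time_aware_match(q, "Tokyo", date(2017, 6, 1)))   # True
print(time_aware_match(q, "Paris", date(2022, 6, 1)))   # True
print(time_aware_match(q, "Tokyo", date(2022, 6, 1)))   # False
```

Scoring the top-n predictions, as the abstract describes, would simply apply this check to each candidate answer in turn.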

Keywords: web-based information retrieval, open domain question answering system, time-varying QA, QA evaluation

Procedia PDF Downloads 99
11281 Consent and the Construction of Unlawfulness

Authors: Susanna Menis

Abstract:

The context of this study is the theme of consent and the construction of unlawfulness in judicial decisions. It explores the formation of societal perceptions of unlawfulness in the context of consensual sexual acts that lead to harmful consequences. The study investigates how judges create legal rules that they believe reflect social solidarity and protect against violence; specifically, it asks what justifies criminalising consensual sexual activity when it is categorised under different offences. The study employs a historical genealogy approach as its methodology. This approach allows the original formation of societal perspectives on unlawfulness to be traced back, highlighting the socially constructed nature of present understanding. The data will be collected through an extensive literature review examining the historical legal cases and documents that shaped the understanding of unlawfulness, providing a comprehensive view of how social attitudes toward private sexual relations influenced the creation of legal rules. The theoretical importance of this research lies in its contribution to socio-legal scholarship: it adds to existing knowledge by exploring questions of unconscious bias and its origins, shedding light on how and why individuals, particularly within the judicial system, hold unconscious biases.
In conclusion, by employing a historical genealogy approach, the study sheds light on how judges create legal rules that reflect social solidarity and aim to protect against violence, and provides insights into the formation of social attitudes toward private sexual relations and their impact on legal rulings.

Keywords: consent, sexual offences, offences against the person, legal genealogy, social construct

Procedia PDF Downloads 57
11280 Decision Framework for Cross-Border Railway Infrastructure Projects

Authors: Dimitrios J. Dimitriou, Maria F. Sartzetaki

Abstract:

Transport infrastructure assets are key components of the national asset portfolio. The decision to invest in new transport infrastructure can take from a few years to some decades, mainly because of the need to commit substantial capital, the long payback period, the number of stakeholders involved in the decision process, and the frequently high investment and business risks. The decision assessment framework is therefore an essential challenge: it must link the key decision factors to stakeholder expectations, highlighting project trade-offs, financial risks, business uncertainties, and market limitations. This paper examines the decision process for new transport infrastructure projects in cross-border regions, where a wide range of stakeholders with different expectations is involved. Using a consequence-analysis systemic approach, the relationships among transport infrastructure development, economic system development, and stakeholder expectations are analyzed. Adopting a system-of-systems methodological approach, the decision-making framework, its variables, inputs, and outputs are defined, highlighting the key stakeholders' roles and expectations. The application presents the proposed decision framework for a strategic railway project in northern Greece dealing with the upgrade of the existing railway corridor connecting Greece, Turkey, and Bulgaria.

Keywords: decision making, system of system, cross-border, infrastructure project

Procedia PDF Downloads 311
11279 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis

Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen

Abstract:

Hepatitis is one of the most common and dangerous diseases affecting humankind, exposing millions of people to serious health risks every year, and its diagnosis has always been a challenge for physicians. This paper presents an effective method for the diagnosis of hepatitis based on interval Type-II fuzzy logic. The proposed system comprises three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an 'indirect approach' is used for fuzzy system modeling, applying the exponential compactness and separation index to determine the number of rules in the fuzzy clustering approach. We first built a Type-I fuzzy system, which achieved an accuracy of approximately 90.9%. Because the process of diagnosis faces vagueness and uncertainty in the final decision, the imprecise knowledge was then managed using interval Type-II fuzzy logic. The results show that interval Type-II fuzzy logic can diagnose hepatitis with an average accuracy of 93.94%, the highest classification accuracy reached thus far. This rate demonstrates that the Type-II fuzzy system performs better than the Type-I system and indicates the higher capability of Type-II fuzzy systems for modeling uncertainty.
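The step from Type-I to interval Type-II membership can be illustrated with a Gaussian fuzzy set whose standard deviation is uncertain: the set then yields a membership interval rather than a single value. The parameters below are illustrative only and are not taken from the paper's hepatitis model.

```python
import math

def gaussian(x, mean, sigma):
    """Type-I Gaussian membership: one crisp membership value per input."""
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2)

def it2_membership(x, mean, sigma_lo, sigma_hi):
    """Interval Type-II membership: the uncertain standard deviation spans a
    'footprint of uncertainty', so each input maps to a [lower, upper] band
    rather than a single membership value."""
    a = gaussian(x, mean, sigma_lo)
    b = gaussian(x, mean, sigma_hi)
    return min(a, b), max(a, b)

# Hypothetical fuzzy set for an input feature (e.g. a lab value near 40).
x = 45.0
type1 = gaussian(x, 40.0, 10.0)                      # single crisp membership
lower, upper = it2_membership(x, 40.0, 8.0, 12.0)    # membership interval
print(f"Type-I: {type1:.3f}  Type-II interval: [{lower:.3f}, {upper:.3f}]")
```

The width of the interval is what lets a Type-II rule base carry the diagnostic uncertainty through to the final decision instead of collapsing it early.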

Keywords: hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection

Procedia PDF Downloads 302
11278 Machine Learning-Assisted Selective Emitter Design for Solar Thermophotovoltaic System

Authors: Ambali Alade Odebowale, Andargachew Mekonnen Berhe, Haroldo T. Hattori, Andrey E. Miroshnichenko

Abstract:

Solar thermophotovoltaic systems (STPV) have emerged as a promising solution to overcome the Shockley-Queisser limit, a significant impediment in the direct conversion of solar radiation into electricity using conventional solar cells. The STPV system comprises essential components such as an optical concentrator, selective emitter, and a thermophotovoltaic (TPV) cell. The pivotal element in achieving high efficiency in an STPV system lies in the design of a spectrally selective emitter or absorber. Traditional methods for designing and optimizing selective emitters are often time-consuming and may not yield highly selective emitters, posing a challenge to the overall system performance. In recent years, the application of machine learning techniques in various scientific disciplines has demonstrated significant advantages. This paper proposes a novel nanostructure composed of four-layered materials (SiC/W/SiO2/W) to function as a selective emitter in the energy conversion process of an STPV system. Unlike conventional approaches widely adopted by researchers, this study employs a machine learning-based approach for the design and optimization of the selective emitter. Specifically, a random forest algorithm (RFA) is employed for the design of the selective emitter, while the optimization process is executed using genetic algorithms. This innovative methodology holds promise in addressing the challenges posed by traditional methods, offering a more efficient and streamlined approach to selective emitter design. The utilization of a machine learning approach brings several advantages to the design and optimization of a selective emitter within the STPV system. Machine learning algorithms, such as the random forest algorithm, have the capability to analyze complex datasets and identify intricate patterns that may not be apparent through traditional methods. 
This allows for a more comprehensive exploration of the design space, potentially leading to highly efficient emitter configurations. Moreover, the application of genetic algorithms in the optimization process enhances the adaptability and efficiency of the overall system. Genetic algorithms mimic the principles of natural selection, enabling the exploration of a diverse range of emitter configurations and facilitating the identification of optimal solutions. This not only accelerates the design and optimization process but also increases the likelihood of discovering configurations that exhibit superior performance compared to traditional methods. In conclusion, the integration of machine learning techniques in the design and optimization of a selective emitter for solar thermophotovoltaic systems represents a groundbreaking approach. This innovative methodology not only addresses the limitations of traditional methods but also holds the potential to significantly improve the overall performance of STPV systems, paving the way for enhanced solar energy conversion efficiency.
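The genetic-algorithm stage described above can be sketched as follows. The fitness function here is an invented placeholder penalizing distance from an assumed optimum; in the actual workflow, the objective (spectral selectivity of the SiC/W/SiO2/W stack) would be evaluated by an electromagnetic solver or the trained random-forest surrogate.

```python
import random

# Hedged sketch of genetic-algorithm optimization of four layer
# thicknesses (nm). The fitness function is a stand-in; the target
# vector and all hyperparameters are invented for illustration.

random.seed(0)

def fitness(thicknesses):
    # Placeholder objective: negative squared distance from an
    # assumed optimal stack (NOT a real emissivity calculation).
    target = [120.0, 10.0, 80.0, 200.0]
    return -sum((t - g) ** 2 for t, g in zip(thicknesses, target))

def evolve(pop_size=30, generations=60, bounds=(5.0, 300.0)):
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 4)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:           # Gaussian mutation, clamped
                i = random.randrange(4)
                child[i] = min(hi, max(lo, child[i] + random.gauss(0, 10)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the parents are carried over each generation, the best individual never degrades, mirroring the elitist selection commonly paired with such surrogate-driven design loops.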

Keywords: emitter, genetic algorithm, radiation, random forest, thermophotovoltaic

Procedia PDF Downloads 58
11277 Analysis of Crisis Management Systems of United Kingdom and Turkey

Authors: Recep Sait Arpat, Hakan Güreşci

Abstract:

Emergency, disaster, and crisis management are generally perceived as the same processes. This confusion affects the approach and delegation policy of the political order. Crisis management begins in the aftermath of mismanaged disaster and emergency response. In this light, this article analyzes the crisis management systems of Turkey and the United Kingdom (UK). Its main aim is to clarify the main points of the UK's emergency management system and Turkey's disaster management system by comparing them. To do this, a prototype model of each country's political decision-making process is drawn, and the decision-making mechanisms and planning functions are compared. As a result, it is found that emergency management policy in Turkey is reactive, whereas it is proactive in the UK; Turkey's delegation policy is similar to the UK's; the levels of emergency situations are similar but not identical, with the differences stemming from the civil order and the effectiveness of nongovernmental organizations; the UK has a detailed government engagement model for emergencies, which shapes its doctrine and succeeds in gathering and coordinating the whole state's efforts; crisis management is a sub-phase of UK emergency management, whereas in Turkey it is regarded as an outmoded management perception; and the focal point of crisis management in the UK is security crises and natural disasters, while in Turkey it is natural disasters. For each analysis, proposals are offered for Turkey.

Keywords: crisis management, disaster management, emergency management, turkey, united kingdom

Procedia PDF Downloads 368
11276 Computational Approach for GRP78–NF-κB Binding Interactions in the Context of Neuroprotective Pathway in Brain Injuries

Authors: Janneth Gonzalez, Marco Avila, George Barreto

Abstract:

GRP78 participates in multiple functions in the cell during normal and pathological conditions, controlling calcium homeostasis, protein folding, and the unfolded protein response. GRP78 is located in the endoplasmic reticulum, but it can change its location under stress, hypoxic, and apoptotic conditions. NF-κB represents the keystone of the inflammatory process and regulates the transcription of several genes related to apoptosis, differentiation, and cell growth. A possible relationship between GRP78 and NF-κB could support and explain several mechanisms that may regulate a variety of cell functions, especially following brain injuries. Although several reports show interactions between NF-κB and members of the heat shock protein family, there is a lack of information on how GRP78 may interact with NF-κB and possibly regulate its downstream activation. Therefore, we assessed computational predictions of the protein-protein interactions between GRP78 (chain A) and the NF-κB complex (IκB alpha and p65). The interaction interface of the docking model showed that amino acids ASN 47, GLU 215, and GLY 403 of GRP78 and THR 54, ASN 182, and HIS 184 of NF-κB are key residues involved in the docking. The electrostatic field between the GRP78 and NF-κB interfaces and molecular dynamics simulations support the possible interaction between the proteins. In conclusion, this work sheds some light on the possible GRP78-NF-κB complex, indicating key residues in this crosstalk, which may be used as input for better drug design strategies targeting NF-κB downstream signaling as a new therapeutic approach following brain injuries.
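As an illustration of how interface residues are typically flagged in a docked complex, the sketch below marks cross-chain residue pairs whose representative atoms fall within a distance cutoff. The coordinates and residue selection are invented; real input would be the docking model's structure file.

```python
import math

# Hypothetical sketch: flag interface residues of a docked complex as
# any cross-chain pair within a cutoff. Coordinates are invented
# placeholders, not taken from the GRP78–NF-κB model.

def close_pairs(chain_a, chain_b, cutoff=5.0):
    """Return (residue_a, residue_b, distance) for pairs within cutoff (Å)."""
    pairs = []
    for name_a, xyz_a in chain_a.items():
        for name_b, xyz_b in chain_b.items():
            d = math.dist(xyz_a, xyz_b)
            if d <= cutoff:
                pairs.append((name_a, name_b, round(d, 2)))
    return pairs

# Invented representative-atom coordinates for two residues per chain.
grp78 = {"ASN47": (0.0, 0.0, 0.0), "GLU215": (12.0, 0.0, 0.0)}
nfkb = {"THR54": (3.0, 0.0, 0.0), "HIS184": (30.0, 0.0, 0.0)}

contacts = close_pairs(grp78, nfkb)
```

A production analysis would iterate over all atoms per residue and read the structure with a PDB parser, but the cutoff logic is the same.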

Keywords: computational biology, protein interactions, Grp78, bioinformatics, molecular dynamics

Procedia PDF Downloads 341
11275 Pinch Technology for Minimization of Water Consumption at a Refinery

Authors: W. Mughees, M. Alahmad

Abstract:

Water is the most significant entity that controls local and global development. For the Gulf region, and especially Saudi Arabia with its limited potable water resources, the fresh water problem is highly significant. This research involves the design and analysis of pinch-based water/wastewater networks. Multiple water/wastewater networks were developed using pinch analysis with the direct recycle/material recycle method, adopting the property-integration technique to carry out direct recycling. A petroleum refinery was considered as the case study. In the direct recycle methodology, targets for minimum water discharge and minimum fresh water consumption were estimated, and the allocation of water in the networks was re-designed (retrofitted). Chemical Oxygen Demand (COD) and hardness were taken as the pollutant properties. Using the single-contaminant approach for COD and hardness, the fresh water demand was reduced from 340.0 m3/h to 149.0 m3/h (43.8% of the original demand) and 208.0 m3/h (61.18%), respectively, while the double-contaminant approach reduced the demand to 132.0 m3/h (38.8%). The required analysis was also carried out using a mathematical programming technique; software such as LINGO was used for these studies, which verified the graphical-method results in a valuable and accurate way. Among the multiple water networks, one possible water allocation network was developed based on mass exchange.
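For a single contaminant, the direct-recycle allocation at each sink reduces to a blending mass balance: recycle as much wastewater as the sink's concentration limit allows, and top up with fresh water. The flows and COD limits below are illustrative numbers, not the refinery's data.

```python
# Hedged single-contaminant sketch of direct recycle: a wastewater
# source is blended with fresh water so that a sink's flow demand and
# maximum COD limit are both met. All numbers are invented.

def blend_for_sink(sink_flow, sink_max_ppm, source_ppm, fresh_ppm=0.0):
    """Return (recycle_flow, fresh_flow) in m3/h meeting flow and load limits.

    Load balance: recycle*source_ppm + fresh*fresh_ppm
                  <= sink_flow*sink_max_ppm
    """
    if source_ppm <= sink_max_ppm:
        return sink_flow, 0.0          # recycled water alone is clean enough
    recycle = sink_flow * (sink_max_ppm - fresh_ppm) / (source_ppm - fresh_ppm)
    return recycle, sink_flow - recycle

# A sink needing 100 m3/h at <= 50 ppm COD, fed from a 200 ppm source.
recycle, fresh = blend_for_sink(sink_flow=100.0, sink_max_ppm=50.0,
                                source_ppm=200.0)
```

Summing the fresh-water terms over all sinks gives the minimum fresh water target that the graphical pinch construction identifies.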

Keywords: minimization, water pinch, water management, pollution prevention

Procedia PDF Downloads 469
11274 The Impact of Mergers and Acquisitions on Financial Deepening in the Nigerian Banking Sector

Authors: Onyinyechi Joy Kingdom

Abstract:

Mergers and Acquisitions (M&A) have been proposed as a mechanism through which problems associated with inefficiency or poor performance in financial institutions could be addressed. The aim of this study is to examine the proposition that the recapitalization of banks, which encouraged mergers and acquisitions in the Nigerian banking system, would strengthen the domestic banks and improve financial deepening and the confidence of depositors. Hence, this study examines the impact of the 2005 M&A in the Nigerian banking sector on financial deepening using a mixed method (quantitative and qualitative approach). The quantitative part of this study utilised an annual time series of a financial deepening indicator for the period 1997 to 2012, while the qualitative aspect adopted semi-structured interviews to collect data from three merged banks and three stand-alone banks in order to explore, understand, and complement the quantitative results. A framework thematic analysis was employed to analyse the themes developed using NVivo 11 software. Under the quantitative approach, findings from the equality of means test (EMT) suggest that M&A had a significant impact on financial deepening. However, this method is not robust enough given its weak validity, as it does not control for other potential factors that may determine financial deepening. Thus, to control for such factors, a Multiple Regression Model (MRM) and Interrupted Time Series Analysis (ITSA) were applied. The coefficient of the M&A dummy turned negative and insignificant under the MRM. In addition, the estimated linear trend of the post-intervention period under ITSA suggests that after the M&A the level of financial deepening decreased annually, although this decrease was statistically insignificant. Similarly, the results from the interviews under the qualitative approach supported the quantitative results from the ITSA and MRM.
The results suggest that interest rates should fall when the capital base is increased in order to improve financial deepening. Hence, this study contributes to the existing literature on the importance of other factors that may affect financial deepening and the economy when policies to enhance bank performance are made. In addition, this study provides valuable policy instruments relevant to monetary authorities when formulating policies to strengthen the Nigerian banking sector and the economy.
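The ITSA referred to above is conventionally specified as a segmented regression with a level-change dummy and a post-intervention slope term, y_t = b0 + b1·t + b2·D_t + b3·(t − t0)·D_t. The sketch below fits that model by ordinary least squares on invented data (the study's own series runs from 1997 to 2012).

```python
# Sketch of interrupted-time-series (segmented) regression fitted by
# OLS via the normal equations. The series below is synthetic, built
# with a known level change (+3) and slope change (-0.4) at t0.

def ols(X, y):
    """Solve X'X b = X'y by Gaussian elimination with partial pivoting."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * c for a, c in zip(A[r], A[i])]
            b[r] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, k))) / A[i][i]
    return beta

t0 = 8                                       # intervention time index
ts = list(range(16))
y = [10 + 0.5 * t + (3 - 0.4 * (t - t0) if t >= t0 else 0) for t in ts]
X = [[1.0, t, 1.0 if t >= t0 else 0.0, (t - t0) if t >= t0 else 0.0]
     for t in ts]
b0, b1, level_change, slope_change = ols(X, y)
```

A negative, insignificant `slope_change` is exactly the pattern the abstract reports for the post-M&A trend in financial deepening.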

Keywords: mergers and acquisitions, recapitalization, financial deepening, efficiency, financial crisis

Procedia PDF Downloads 391
11273 Balance Control Mechanisms in Individuals With Multiple Sclerosis in Virtual Reality Environment

Authors: Badriah Alayidi, Emad Alyahya

Abstract:

Background: Most people with Multiple Sclerosis (MS) report worsening balance as the condition progresses. Poor balance control is also well known to be a significant risk factor for both falling and fear of falling. The increased risk of falls with disease progression thus makes balance control an essential target of gait rehabilitation among people with MS. Intervention programs have developed various methods to improve balance control, and accumulating evidence suggests that exercise programs may help people with MS improve their balance. Among these methods, virtual reality (VR) is growing in popularity as a balance-training technique owing to its potential benefits, including better compliance and greater user satisfaction. However, it is not clear whether a VR environment will induce different balance control mechanisms in MS compared with healthy individuals or with traditional environments. Therefore, this study aims to examine how individuals with MS control their balance in a VR setting. Methodology: The proposed study takes an empirical approach to estimating and determining balance responses in persons with MS using a VR environment. It will use primary data collected through patient observations, physiological and biomechanical evaluation of balance, and data analysis. Results: A preliminary systematic review and meta-analysis indicated variability in the outcomes used to assess balance responses in people with MS. The preliminary results of these assessments have the potential to provide essential indicators of the progression of MS and to contribute to the individualization of treatment and the evaluation of interventions' effectiveness. The literature describes patients who have had the opportunity to experiment in VR settings and then used what they learned in the real world, suggesting that VR settings could be more appealing than conventional settings.
The findings of the proposed study will be beneficial in estimating and determining the effect of VR on balance control in persons with MS. In previous studies, VR was shown to be an interesting approach to neurological rehabilitation, but more data are needed to support this approach in MS. Conclusions: The proposed study enables an assessment of balance and the evaluation of a variety of physiological implications related to neural activity, as well as biomechanical implications related to movement analysis.

Keywords: multiple sclerosis, virtual reality, postural control, balance

Procedia PDF Downloads 70
11272 Sea of Light: A Game-Based Approach for Evidence-Centered Assessment of Collaborative Problem Solving

Authors: Svenja Pieritz, Jakab Pilaszanovich

Abstract:

Collaborative Problem Solving (CPS) is recognized as one of the most important skills of the 21st century, with a potential impact on education, job selection, and collaborative systems design. Accordingly, CPS has been adopted in several standardized tests, including the Programme for International Student Assessment (PISA) in 2015. A significant challenge in evaluating CPS is the underlying interplay of cognitive and social skills, which requires a more holistic assessment. However, the majority of existing tests use a questionnaire-based assessment, which oversimplifies this interplay and undermines ecological validity. Two major difficulties were identified: firstly, the creation of a controllable, real-time environment allowing natural behaviors and communication between at least two people; secondly, the development of an appropriate method to collect and synthesize both cognitive and social metrics of collaboration. This paper proposes a more holistic and automated approach to the assessment of CPS. To address these two difficulties, a multiplayer problem-solving game called Sea of Light was developed: an environment allowing students to deploy a variety of measurable collaborative strategies. This controlled environment enables researchers to monitor behavior through the analysis of game actions and chat. The corresponding statistical model is a combined approach of Natural Language Processing (NLP) and Bayesian network analysis. Social exchanges via the in-game chat are analyzed through NLP and fed into the Bayesian network along with other game actions. This Bayesian network synthesizes the evidence to track and update different subdimensions of CPS. Major findings focus on the correlations between the evidence collected through in-game actions, the participants' chat features, and the CPS self-evaluation metrics. These results give an indication of which game mechanics can best describe CPS evaluation.
Overall, Sea of Light gives test administrators control over different problem-solving scenarios and difficulties while keeping the student engaged. It enables a more complete assessment based on complex, socio-cognitive information on actions and communication. This tool permits further investigations of the effects of group constellations and personality in collaborative problem-solving.
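The evidence-synthesis step can be illustrated with a single binary subskill node updated by two conditionally independent evidence leaves. All probabilities below are invented, and the real network tracks several CPS subdimensions rather than one.

```python
# Toy sketch of the Bayesian-network update: one binary CPS subskill
# node receives evidence from a chat feature (via NLP) and a game
# action. The likelihoods and prior are invented placeholders.

def update(prior, likelihoods):
    """Posterior P(skill=high | evidence), assuming conditionally
    independent evidence leaves (naive Bayes over the network)."""
    p_high = prior
    p_low = 1.0 - prior
    for p_e_given_high, p_e_given_low in likelihoods:
        p_high *= p_e_given_high
        p_low *= p_e_given_low
    return p_high / (p_high + p_low)

# Observed: "asks clarifying question" in chat, "shares resource" in game.
evidence = [(0.7, 0.3),   # (P(evidence | high), P(evidence | low))
            (0.6, 0.2)]
posterior = update(prior=0.5, likelihoods=evidence)
```

Each new chat message or game action adds a likelihood pair, so the subskill estimate is refreshed continuously as the session unfolds.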

Keywords: bayesian network, collaborative problem solving, game-based assessment, natural language processing

Procedia PDF Downloads 127
11271 Abandoning 'One-Time' Optional Information Literacy Workshops for Year 1 Medical Students and Gearing towards an 'Embedded Librarianship' Approach

Authors: R. L. David, E. C. P. Tan, M. A. Ferenczi

Abstract:

This study aimed to investigate the effect of a ‘one-time’ optional Information Literacy (IL) workshop on Year 1 medical students' literature search, writing, and citation management skills, as directed by a customized five-year IL framework developed for LKC Medicine students. At the end of the IL workshop, students overall rated finding, citing, and using information from sources as ‘somewhat difficult’. The study method is experimental, using a standardized IL test to study the cohort effect of a ‘one-time’ optional IL workshop on Year 1 students (experimental group) in comparison with Year 2 students (control group). Test scores from both groups were compared and analyzed using mean scores and one-way analysis of variance (ANOVA). Unexpectedly, there were no statistically significant differences between group means as determined by one-way ANOVA (F₁,₁₉₃ = 3.37, p = 0.068, ηp² = 0.017). The challenges and shortfalls posed by ‘one-time’ interventions prompted a rich discussion on adopting an ‘embedded librarianship’ approach, which shifts the medical librarians' role into the curriculum and uses Team-Based Learning to teach IL skills to medical students. The customized five-year IL framework developed for LKC Medicine students thus becomes a useful librarian-faculty model for embedding and bringing IL into the classroom.
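The one-way ANOVA reported above follows the standard between-groups/within-groups decomposition, sketched here on invented scores rather than the study's data (which yielded F₁,₁₉₃ = 3.37).

```python
# Sketch of a one-way ANOVA F statistic, computed from scratch on
# invented IL test scores for two cohorts.

def one_way_anova(groups):
    """Return (F, (df_between, df_within)) for a list of score lists."""
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    f = (ss_between / (k - 1)) / (ss_within / (n - k))
    return f, (k - 1, n - k)

year1 = [62, 70, 68, 75, 66]   # invented experimental-group scores
year2 = [64, 72, 69, 74, 71]   # invented control-group scores
f_stat, dfs = one_way_anova([year1, year2])
```

With two groups, the F statistic is the square of the two-sample t statistic, which is why the EMT-style comparison and the ANOVA lead to the same conclusion here.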

Keywords: information literacy, 'one-time' interventions, medical students, standardized tests, embedded librarianship, curriculum, medical librarians

Procedia PDF Downloads 113
11270 Metadiscourse in Chinese and Thai Request Emails: Analysis and Pedagogical Application

Authors: Chia-Ling Hsieh, Kankanit Potikit

Abstract:

Metadiscourse refers to linguistic resources employed by writers to organize text and interact with readers. While metadiscourse has received considerable attention within the field of discourse analysis, few studies have explored the use of metadiscourse in email, one of the most popular forms of computer-mediated communication. Furthermore, the diversity of cross-linguistic research required to uncover the influence of cultural factors on metadiscourse use is lacking. The present study compares metadiscourse markers employed in Chinese and Thai-language request emails with the purpose of discovering cross-cultural similarities and differences that are meaningful and applicable to foreign language teaching. The analysis is based on a corpus of 200 request emails: 100 composed in Chinese and 100 in Thai, with half of the emails from each language data set addressed to professors and the other half addressed to classmates. Adopting Hyland’s model as an analytical framework, two primary categories of metadiscourse are identified. Textual metadiscourse helps to create text coherence, while interpersonal metadiscourse functions to convey authorial stance. Results of the study make clear that both Chinese and Thai-language emails use significantly more interpersonal markers than textual markers, indicating that email, as a unique communicative medium, is characterized by high degrees of concision and interactivity. Users of both languages further deploy similar patterns in writing emails to recipients of different social statuses. Compared with emails addressed to classmates, emails addressed to professors are notably longer and include more transition and engagement markers. Nevertheless, cultural factors do play a role. 
Emails composed in Thai, for example, include more textual markers than those in Chinese, as Thai favors formal expressions and detailed explanations, while in contrast, emails composed in Chinese employ more interpersonal markers than those in Thai, since Chinese tends to emphasize recipient involvement and attitudinal warmth. These findings thereby demonstrate the combined effects of email as a communicative medium, social status, and cultural values on metadiscourse usage. The study concludes by applying these findings to pedagogical suggestions for teaching email writing to Chinese and Thai language learners based on similarities and differences in metadiscourse strategy between the two languages.
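The marker-counting step behind such a comparison can be sketched as below. The two marker lists are tiny invented stand-ins for Hyland's full taxonomy, and a real analysis would disambiguate markers in context rather than match strings.

```python
# Illustrative counting of textual vs. interpersonal metadiscourse
# markers in an email, following Hyland's two broad categories.
# Marker lists and the sample email are invented for the sketch.

TEXTUAL = {"therefore", "however", "firstly", "in addition"}
INTERPERSONAL = {"please", "perhaps", "unfortunately", "i think"}

def count_markers(email_text):
    """Count occurrences of each marker category (naive substring match)."""
    text = email_text.lower()
    return {
        "textual": sum(text.count(m) for m in TEXTUAL),
        "interpersonal": sum(text.count(m) for m in INTERPERSONAL),
    }

email = ("I think the deadline is close. Unfortunately I am ill. "
         "Could you please extend it? In addition, I attach my draft.")
counts = count_markers(email)
```

Even this crude tally reproduces the abstract's headline pattern for request emails: interpersonal markers outnumber textual ones.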

Keywords: discourse analysis, email, metadiscourse, writing instruction

Procedia PDF Downloads 125
11269 Cleaning of Polycyclic Aromatic Hydrocarbons (PAH) Obtained from Ferroalloys Plant

Authors: Stefan Andersson, Balram Panjwani, Bernd Wittgens, Jan Erik Olsen

Abstract:

Polycyclic aromatic hydrocarbons (PAH) are organic compounds consisting only of fused aromatic rings of carbon and hydrogen. PAH are neutral, non-polar molecules produced by the incomplete combustion of organic matter. These compounds are carcinogenic and interact with biological nucleophiles to inhibit the normal metabolic functions of cells. In Norway, the most important sources of PAH pollution are considered to be aluminum plants, the metallurgical industry, offshore oil activity, transport, and wood burning. Stricter governmental regulations regarding emissions to the external and internal environment, combined with increased awareness of the potential health effects, have motivated Norwegian metal industries to increase their efforts to reduce emissions considerably. One of the objectives of the ongoing "SCORE" project, supported by industry and the Norwegian Research Council, is to reduce potential PAH emissions from the off-gas stream of a ferroalloy furnace through controlled combustion in a dedicated combustion chamber. The sizing and configuration of the combustion chamber depend on the combined properties of the bulk gas stream and the properties of the PAH themselves. In order to achieve efficient and complete combustion, the residence time and minimum temperature need to be optimized. For this design approach, reliable kinetic data for the individual PAH species and/or groups thereof are necessary. However, kinetic data on the combustion of PAH are difficult to obtain, and there are only a limited number of studies. This paper presents an evaluation of kinetic data for some of the PAH obtained from the literature. In the present study, oxidation is modelled both for pure PAH and for PAH mixed with process gas. Using a perfectly stirred reactor modelling approach, the oxidation is modelled with advanced reaction kinetics to study the influence of residence time and temperature on the conversion of PAH to CO2 and water.
A Chemical Reactor Network (CRN) approach is developed to understand the oxidation of PAH inside the combustion chamber. Chemical reactor network modeling has been found to be a valuable tool in the evaluation of oxidation behavior of PAH under various conditions.
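For a first-order global oxidation step, the steady-state conversion in a perfectly stirred reactor has the closed form X = kτ/(1 + kτ), with an Arrhenius rate k(T). The pre-exponential factor and activation energy below are invented placeholders, not fitted PAH kinetics; a CRN model chains several such reactors with detailed chemistry.

```python
import math

# Hedged sketch of a PSR residence-time/temperature study for a
# first-order global PAH oxidation step. A and Ea are invented
# placeholders, not measured PAH combustion kinetics.

R = 8.314                     # gas constant, J/(mol K)

def conversion(T, tau, A=1.0e10, Ea=2.0e5):
    """PAH conversion in a PSR at temperature T (K), residence time tau (s).

    Steady-state CSTR balance for first order: X = k*tau / (1 + k*tau).
    """
    k = A * math.exp(-Ea / (R * T))    # Arrhenius rate constant, 1/s
    return k * tau / (1.0 + k * tau)

x_low = conversion(T=1000.0, tau=0.5)
x_high = conversion(T=1400.0, tau=0.5)
```

Sweeping `T` and `tau` with such a relation is precisely how a minimum temperature and residence time for near-complete PAH destruction would be bracketed before running the detailed-kinetics model.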

Keywords: PAH, PSR, energy recovery, ferro alloy furnace

Procedia PDF Downloads 268
11268 Barriers and Facilitators to Inclusive Programming for Children with Mental and/or Developmental Challenges: A Participatory Action Research of Perspectives from Families and Professionals

Authors: Minnie Y. Teng, Kathy Xie, Jarus Tal

Abstract:

Rationale: The traditional approach to community programs for children with mental and/or developmental challenges often involves segregation from typically developing peers. However, studies show that inclusive education improves children's quality of life, self-concept, and long-term health outcomes. Investigating the factors that influence inclusion can thus have important implications for the design and facilitation of community programs such that all children - across a spectrum of needs and abilities - may benefit. Objectives: This study explores barriers and facilitators to inclusive community programming for children aged 0 to 12 with developmental/mental challenges. Methods: Using a participatory action research methodology, semi-structured focus groups and interviews will be used to explore the perspectives of students, instructors, and staff. Data will be transcribed and coded thematically. Practice Implications or Results: By having a deeper understanding of the barriers and facilitators to inclusive programming in the community, researchers can work with the broader community to facilitate inclusion in children's community programs. Conclusions: Expanding inclusive practices may improve the health and wellbeing of pediatric populations with disabilities, who consistently report lower levels of participation. These findings may help to identify gaps in existing practices and ways to approach them.

Keywords: aquatic programs, children, disabilities, inclusion, community programs

Procedia PDF Downloads 111
11267 From Shallow Semantic Representation to Deeper One: Verb Decomposition Approach

Authors: Aliaksandr Huminski

Abstract:

Semantic Role Labeling (SRL), a shallow semantic parsing approach, involves recognizing and labeling the arguments of a verb in a sentence. Verb participants are linked with specific semantic roles (Agent, Patient, Instrument, Location, etc.). Thus, SRL can answer key questions such as ‘Who’, ‘When’, ‘What’, and ‘Where’ in a text, and it is widely applied in dialog systems, question answering, named entity recognition, information retrieval, and other fields of NLP. However, SRL has the following flaw: two sentences with identical (or almost identical) meaning can have different semantic role structures. Consider two sentences: (1) John put butter on the bread. (2) John buttered the bread. The SRL for (1) and (2) will be significantly different: for the verb put in (1) it is [Agent + Patient + Goal], but for the verb butter in (2) it is [Agent + Goal]. This happens because of one of the most interesting and intriguing features of a verb: its ability to capture participants, as in the case of the verb butter, or their features, as, say, in the case of the verb drink, where the participant's feature of being liquid is shared with the verb. This capture amounts to a total fusion of meaning and cannot be decomposed in a direct way (in comparison with compound verbs like babysit or breastfeed). From this perspective, SRL is really too shallow to represent semantic structure. If the key point of semantic representation is the opportunity to use it for making inferences and finding hidden reasons, it assumes by default that two different but semantically identical sentences must have the same semantic structure; otherwise we will draw different inferences from the same meaning. To overcome the above-mentioned flaw, the following approach is suggested.
Assume that: P is a participant of a relation; F is a feature of a participant; Vcp is a verb that captures a participant; Vcf is a verb that captures a feature of a participant; and Vpr is a primitive verb, i.e., a verb that does not capture any participant and represents only a relation. In other words, a primitive verb is a verb whose meaning does not include meanings from its surroundings. Then Vcp and Vcf can be decomposed as: Vcp = Vpr + P; Vcf = Vpr + F. If all Vcp and Vcf are represented this way, then the primitive verbs Vpr can be considered a canonical form for SRL. As a result, there will be no hidden participants caught by a verb, since all participants will be explicitly unfolded. An obvious example of a Vpr is the verb go, which represents pure movement; the verb drink can then be represented as the man-made movement of liquid in a specific direction. Extracting and using primitive verbs for SRL creates a canonical representation that is unique for semantically identical sentences. This leads to the unification of semantic representation, resolving the critical flaw of SRL.
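The proposed canonicalization can be sketched as a lexicon that unfolds capture verbs into a primitive verb plus explicit roles, so that the two bread sentences receive identical frames. The two-entry lexicon is illustrative only.

```python
# Sketch of Vcp = Vpr + P canonicalization: a capture verb is rewritten
# as its primitive verb with the captured participant made explicit.
# The tiny lexicon below is an invented illustration.

DECOMPOSITION = {
    "butter": ("put", {"Patient": "butter"}),        # Vcp = Vpr + P
    "breastfeed": ("feed", {"Instrument": "breast"}),
}

def canonical_frame(verb, roles):
    """Unfold a capture verb into (primitive_verb, explicit_roles)."""
    if verb in DECOMPOSITION:
        primitive, captured = DECOMPOSITION[verb]
        merged = dict(roles)
        merged.update(captured)
        return primitive, merged
    return verb, dict(roles)

# (2) "John buttered the bread."  vs  (1) "John put butter on the bread."
frame_a = canonical_frame("butter", {"Agent": "John", "Goal": "bread"})
frame_b = canonical_frame("put",
                          {"Agent": "John", "Patient": "butter",
                           "Goal": "bread"})
```

After canonicalization the hidden Patient of `butter` is explicit, and both sentences map to the same [Agent + Patient + Goal] structure around the primitive verb put.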

Keywords: decomposition, labeling, primitive verbs, semantic roles

Procedia PDF Downloads 360
11266 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis

Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García

Abstract:

Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. 
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automated by programs that propose coding paradigms based on the calculated entities and their relationships. 
At the same time, these programs can be trained through manual coding in a close reading process and refined according to the content-related research questions. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
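The idea of proposing coding paradigms from computed entities and their relationships can be pictured with a toy sketch. Everything below is illustrative, not the D-WISE implementation: the tiny gazetteer stands in for the project's NER/entity-linking pipeline, and the actor/topic labels are hypothetical placeholders.

```python
# Illustrative sketch (NOT the D-WISE tooling): propose coding candidates
# for a discourse corpus from detected entities and their co-occurrence.
from collections import defaultdict

# Hypothetical gazetteer standing in for a real NER / entity-linking step.
GAZETTEER = {
    "health insurer": "ACTOR",
    "patient": "ACTOR",
    "electronic health record": "TOPIC",
    "data protection": "TOPIC",
}

def extract_entities(text):
    """Naive substring lookup standing in for NER and entity linking."""
    lowered = text.lower()
    return [(surface, label) for surface, label in GAZETTEER.items()
            if surface in lowered]

def propose_codes(documents):
    """Propose a coding paradigm: which actors co-occur with which topics."""
    proposals = defaultdict(set)
    for doc in documents:
        entities = extract_entities(doc)
        actors = [e for e, label in entities if label == "ACTOR"]
        topics = [e for e, label in entities if label == "TOPIC"]
        for topic in topics:
            proposals[topic].update(actors)
    # Sorted lists make the proposals stable for a human coder to review.
    return {topic: sorted(actors) for topic, actors in proposals.items()}
```

In the blended reading workflow described above, such machine-generated proposals would then be confirmed, corrected, or discarded by the researcher during close reading, which in turn retrains the proposal step.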

Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis

Procedia PDF Downloads 221
11265 Runoff Estimates of Rapidly Urbanizing Indian Cities: An Integrated Modeling Approach

Authors: Rupesh S. Gundewar, Kanchan C. Khare

Abstract:

Runoff from urban areas is contributed mainly by manmade structures, with a few natural contributors. The manmade structures are buildings, roads, and other paved areas, whereas the natural contributors are groundwater, overland flows, etc. Runoff is attenuated by manmade as well as natural storages. Manmade storages are storage tanks or other structures such as soakaways or soak pits, which are more common in Western European countries. Natural storages include catchment slope, infiltration, catchment length, channel rerouting, drainage density, and depression storage. A literature survey on the manmade and natural storages/inflows gives the percentage contribution of each individually. Sanders et al. reported that a vegetation canopy reduces runoff by 7% to 12%. Nassif et al. reported that catchment slope affects rainfall runoff by 16% on bare standard soil and 24% on grassed soil. Infiltration, which depends on the pervious/impervious ratio, is catchment specific, but the literature reports losses of 15% to 30% of rainfall runoff across various catchment study areas. Catchment length and channel rerouting also play a considerable role in reducing rainfall runoff. Groundwater infiltration adds to the runoff where the water table is very shallow and the soil saturates even in a low-intensity storm; this inflow, together with surface inflow, contributes about 2% of the total runoff volume. Considering these contributing factors, the literature survey indicates that an integrated modelling approach is needed. Traditional storm water network models can predict to a fair/acceptable degree of accuracy provided that no interaction with receiving waters (river, sea, canal, etc.), ground infiltration, or treatment works is assumed.
When such interactions are significant, it becomes difficult to reproduce the actual flood extent using the traditional discrete modelling approach, so the true flooding situation is rarely captured accurately. Since the development of spatially distributed hydrologic models, predictions have become more accurate, at the cost of requiring more accurate spatial information. The integrated approach provides a greater understanding of the performance of the entire catchment. It makes it possible to identify the sources of flow in the system, understand how flow is conveyed, and assess its impact on the receiving body. It also reveals critical points, hydraulic controls, and sources of flooding that could not easily be understood with a discrete modelling approach. This enables decision makers to identify solutions that can be distributed throughout the catchment rather than concentrated at the single point where the problem manifests. It can thus be concluded from the literature survey that the representation of urban detail is a key differentiator in successfully understanding flooding. The intent of this study is to accurately predict the runoff from impermeable areas of a rapidly urbanizing area in India. A representative area for which data were available has been selected, and the predictions have been corroborated against measured data.
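As a rough illustration of how the literature values surveyed above combine, a minimal water-balance sketch might chain the loss and inflow factors. This is not the integrated model described in the abstract; the default values are assumptions taken from the mid-range of the cited figures (canopy 7-12%, infiltration 15-30%, groundwater inflow about 2% of runoff volume).

```python
def runoff_depth(rainfall_mm, canopy_loss=0.10, infiltration_loss=0.20,
                 groundwater_inflow=0.02):
    """Illustrative composite of literature factors, NOT the integrated model.

    rainfall_mm        -- storm rainfall depth over the catchment (mm)
    canopy_loss        -- fraction intercepted by vegetation (mid-range of 7-12%)
    infiltration_loss  -- fraction lost to infiltration (mid-range of 15-30%)
    groundwater_inflow -- inflow added where the water table is shallow (~2%)
    """
    # Apply the losses in sequence, then add the shallow-groundwater inflow.
    effective = rainfall_mm * (1 - canopy_loss) * (1 - infiltration_loss)
    return effective * (1 + groundwater_inflow)
```

For a 100 mm storm with the default mid-range factors, the sketch yields about 73 mm of runoff; a calibrated integrated model would instead resolve these processes spatially rather than as lumped fractions.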

Keywords: runoff, urbanization, impermeable response, flooding

Procedia PDF Downloads 245
11264 D3Advert: Data-Driven Decision Making for Ad Personalization through Personality Analysis Using BiLSTM Network

Authors: Sandesh Achar

Abstract:

Personalized advertising holds greater potential for higher conversion rates compared to generic advertisements. However, its widespread application in the retail industry faces challenges due to complex implementation processes, which impede its swift, large-scale adoption. Personalized advertisement, being a data-driven approach, necessitates consumer-related data, adding to its complexity. This paper introduces an innovative data-driven decision-making framework, D3Advert, which personalizes advertisements by analyzing personalities using a BiLSTM network. The framework utilizes the Myers–Briggs Type Indicator (MBTI) dataset for development. The employed BiLSTM network, specifically designed and optimized for D3Advert, classifies user personalities into one of the sixteen MBTI categories based on their social media posts. The classification accuracy is 86.42%, with precision, recall, and F1-score values of 85.11%, 84.14%, and 83.89%, respectively. The D3Advert framework personalizes advertisements based on these personality classifications. Experimental implementation and performance analysis of D3Advert demonstrate a 40% improvement in impressions. D3Advert’s innovative and straightforward approach has the potential to transform personalized advertising and foster its widespread adoption in marketing.
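The forward pass of a BiLSTM classifier over the sixteen MBTI classes can be sketched in plain NumPy. The hidden size, parameter layout, and gate ordering below are assumptions for illustration only, not the network specifically designed and optimized for D3Advert.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; the four gates are stacked as [i, f, o, g]."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = 1 / (1 + np.exp(-z[:H]))          # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))       # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))     # output gate
    g = np.tanh(z[3*H:])                  # candidate cell state
    c = f * c + i * g
    return o * np.tanh(c), c

def bilstm_classify(seq, params, n_classes=16):
    """Run embedded tokens through forward and backward LSTMs, concatenate
    the two final hidden states, and softmax over the 16 MBTI types."""
    Wf, Uf, bf, Wb, Ub, bb, Wo, bo = params
    H = bf.shape[0] // 4
    hf, cf = np.zeros(H), np.zeros(H)
    for x in seq:                         # forward direction
        hf, cf = lstm_step(x, hf, cf, Wf, Uf, bf)
    hb, cb = np.zeros(H), np.zeros(H)
    for x in reversed(seq):               # backward direction
        hb, cb = lstm_step(x, hb, cb, Wb, Ub, bb)
    logits = Wo @ np.concatenate([hf, hb]) + bo
    e = np.exp(logits - logits.max())     # numerically stable softmax
    return e / e.sum()                    # probabilities over MBTI classes
```

In practice the embeddings would come from the users' social media posts and the parameters from training on the MBTI dataset; a framework implementation would typically use a deep learning library's bidirectional LSTM layer rather than hand-rolled cells.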

Keywords: personalized advertisement, deep learning, MBTI dataset, BiLSTM network, NLP

Procedia PDF Downloads 38