Search results for: noisy forensic speaker verification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1072


262 Real-Time Monitoring of Drinking Water Quality Using Advanced Devices

Authors: Amani Abdallah, Isam Shahrour

Abstract:

The quality of drinking water is a major public health concern. Control of this quality is generally performed in the laboratory, which requires a long time. This type of control is not adapted to accidental pollution from sudden events, which can have serious consequences for population health. It is therefore of major interest to develop innovative real-time solutions for the detection of accidental contamination in drinking water systems. This paper presents research conducted within the SunRise Demonstrator for ‘Smart and Sustainable Cities’, with a particular focus on the supervision of water quality. This work aims at (i) implementing a smart water system in a large water network (Campus of the University Lille1), including innovative equipment for real-time detection of abnormal events such as those related to the contamination of drinking water, and (ii) developing a numerical model of contaminant diffusion in the water distribution system. The first step included verification of the water quality sensors and their effectiveness on a network prototype of 50 m length. This part included the evaluation of the efficiency of these sensors in detecting both bacterial and chemical contamination events in drinking water distribution systems. An on-line optical sensor integrated with a laboratory-scale distribution system (LDS) was shown to respond rapidly to changes in refractive index induced by injected loads of chemical (cadmium, mercury) and biological (Escherichia coli) contaminants. All injected substances were detected by the sensor; the magnitude of the response depends on the type of contaminant introduced and is proportional to the injected substance concentration.
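The kind of real-time event detection such a sensor enables can be sketched as a simple baseline-deviation check on a refractive-index time series. All values below (baseline index, noise level, step size, window, threshold) are invented for illustration; the paper does not describe its actual detection logic.

```python
import random

random.seed(0)

# Synthetic refractive-index readings: a stable baseline with noise,
# then a step change when a contaminant is injected at sample 100.
baseline = 1.3330
readings = [baseline + random.gauss(0, 0.0001) for _ in range(100)]
readings += [baseline + 0.002 + random.gauss(0, 0.0001) for _ in range(20)]

def detect_event(series, window=50, k=8):
    """Flag the first sample deviating more than k standard deviations
    from the mean of an initial calibration window; None if no event."""
    ref = series[:window]
    mu = sum(ref) / window
    sd = (sum((x - mu) ** 2 for x in ref) / window) ** 0.5
    for i, x in enumerate(series[window:], start=window):
        if abs(x - mu) > k * sd:
            return i
    return None

alarm = detect_event(readings)
print(f"contamination alarm at sample {alarm}")
```

Because the sensor response is proportional to concentration, the same deviation magnitude could, in principle, be mapped back to an injected-load estimate once calibrated per contaminant.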

Keywords: distribution system, drinking water, refraction index, sensor, real-time

Procedia PDF Downloads 327
261 Development of an Integrated Criminogenic Intervention Programme for High Risk Offenders

Authors: Yunfan Jiang

Abstract:

In response to an identified gap in available treatment programmes for high-risk offenders with multiple criminogenic needs and guided by emerging literature in the field of correctional rehabilitation, Singapore Prison Service (SPS) developed the Integrated Criminogenic Programme (ICP) in 2012. This evidence-informed psychological programme was designed to address all seven dynamic criminogenic needs (from the Central 8) of high-risk offenders by applying concepts from rehabilitation and psychological theories such as Risk-Need-Responsivity, Good Lives Model, narrative identity, and motivational interviewing. This programme also encompasses a 6-month community maintenance component for the purpose of providing structured step-down support in the aftercare setting. These sessions provide participants the opportunity for knowledge reinforcement and application of skills attained in-care. A quantitative evaluation of the ICP showed that the intervention group had statistically significant improvements across time in most self-report measures of criminal attitudes, substance use attitudes, and psychosocial functioning. This was congruent with qualitative data from participants saying that the ICP had the most impact on their criminal thinking patterns and management of behaviours in high-risk situations. Results from the comparison group showed no difference in their criminal attitudes, even though they reported statistically significant improvements across time in their substance use attitudes and some self-report measures of psychosocial functioning. The programme’s efficacy was also apparent in the lower rates of recidivism and relapse within 12 months for the intervention group. The management of staff issues arising from the development and implementation of an innovative high-intensity psychological programme such as the ICP will also be discussed.

Keywords: evaluation, forensic psychology, intervention programme, offender rehabilitation

Procedia PDF Downloads 561
260 Affects Associations Analysis in Emergency Situations

Authors: Joanna Grzybowska, Magdalena Igras, Mariusz Ziółko

Abstract:

Association rule learning is an approach for discovering interesting relationships in large databases. The analysis of relations invisible at first glance is a source of new knowledge which can subsequently be used for prediction. We used this data mining technique, an automatic and objective method, to learn about interesting affect associations in a corpus of emergency phone calls. We also made an attempt to match the revealed rules with their possible situational context. The corpus was collected and subjectively annotated by two researchers. Each of the 3306 recordings contains information on emotion: (1) type (sadness, weariness, anxiety, surprise, stress, anger, frustration, calm, relief, compassion, contentment, amusement, joy), (2) valence (negative, neutral, or positive), (3) intensity (low, typical, alternating, high). Additional information that provides clues to the speaker’s emotional state was also annotated: speech rate (slow, normal, fast), characteristic vocabulary (filled pauses, repeated words), and conversation style (normal, chaotic). Exponentially many rules can be extracted from a set of items (an item is a single piece of previously annotated information). To generate the rules in the form of an implication X → Y (where X and Y are frequent k-itemsets), the Apriori algorithm was used, as it avoids performing needless computations. Then, two basic measures (Support and Confidence) and several additional symmetric and asymmetric objective measures (e.g., Laplace, Conviction, Interest Factor, Cosine, correlation coefficient) were calculated for each rule. Each applied interestingness measure revealed different rules; we selected some top rules for each measure. Owing to the specificity of the corpus (emergency situations), most of the strong rules contain only negative emotions. There are, though, strong rules that include neutral or even positive emotions. Three examples of the strongest rules are: {sadness} → {anxiety}; {sadness, weariness, stress, frustration} → {anger}; {compassion} → {sadness}. Association rule learning revealed the strongest configurations of affects (as well as configurations of affects with affect-related information) in our emergency phone calls corpus. The acquired knowledge can be used for prediction to fill in the emotional profile of a new caller. Furthermore, a rule-related possible context analysis may be a clue to the situation a caller is in.
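The rule-mining step described above can be sketched as follows. This is a naive Apriori-style extraction over a tiny synthetic set of annotated calls; the affect labels are taken from the abstract, but the call data, support and confidence thresholds, and itemset-size cap are illustrative assumptions, not the authors' settings.

```python
from itertools import combinations

# Hypothetical mini-corpus: each call is a set of annotated affect labels
# (synthetic; the real corpus has 3306 annotated recordings).
calls = [
    {"sadness", "anxiety"},
    {"sadness", "anxiety", "stress"},
    {"sadness", "weariness", "stress", "frustration", "anger"},
    {"compassion", "sadness"},
    {"calm", "relief"},
    {"anger", "frustration"},
]

def support(itemset, transactions):
    """Fraction of transactions that contain every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(transactions, min_support=0.3, min_confidence=0.6):
    """Extract rules X -> Y from frequent itemsets, with
    confidence = support(X ∪ Y) / support(X)."""
    items = sorted(set().union(*transactions))
    out = []
    for k in range(2, 4):  # itemsets of size 2 and 3 only, for brevity
        for combo in combinations(items, k):
            itemset = set(combo)
            s = support(itemset, transactions)
            if s < min_support:
                continue  # Apriori-style pruning of infrequent itemsets
            for r in range(1, k):
                for lhs in combinations(combo, r):
                    X = set(lhs)
                    conf = s / support(X, transactions)
                    if conf >= min_confidence:
                        out.append((X, itemset - X, s, conf))
    return out

for X, Y, s, c in rules(calls):
    print(f"{sorted(X)} -> {sorted(Y)}  support={s:.2f}  confidence={c:.2f}")
```

In practice the additional interestingness measures mentioned in the abstract (Laplace, Conviction, Cosine, and so on) would be computed from the same support counts and used to re-rank the rules.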

Keywords: data mining, emergency phone calls, emotional profiles, rules

Procedia PDF Downloads 392
259 Nonlinear Aerodynamic Parameter Estimation of a Supersonic Air to Air Missile by Using Artificial Neural Networks

Authors: Tugba Bayoglu

Abstract:

Aerodynamic parameter estimation is crucial in the missile design phase, since an accurate, high-fidelity aerodynamic model is required for designing a high-performance and robust control system, developing high-fidelity flight simulations, and verifying computational and wind tunnel test results. However, in the literature, there are few missile aerodynamic parameter identification studies, for three main reasons: (1) most air to air missiles cannot fly at constant speed, (2) missile flight test numbers and flight durations are much smaller than those of fixed wing aircraft, and (3) the variation of missile aerodynamic parameters with respect to Mach number is higher than that of fixed wing aircraft. In addition to these challenges, identification of aerodynamic parameters at high wind angles by classical estimation techniques brings another difficulty to the estimation process. The reason for this is that most estimation techniques require polynomials or splines to model the behavior of the aerodynamics. However, for missiles with a large variation of aerodynamic parameters with respect to flight variables, the order of the proposed model increases, which brings computational burden and complexity. Therefore, this study aims to solve the nonlinear aerodynamic parameter identification problem for a supersonic air to air missile by using Artificial Neural Networks. The proposed method will be tested by using simulated data generated with a six-degree-of-freedom missile model involving a nonlinear aerodynamic database. The data will be corrupted by adding noise to the measurement model. Then, using the flight variables and measurements, the parameters will be estimated. Finally, the prediction accuracy will be investigated.
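The general approach (fitting a neural network to noisy simulated aerodynamic data instead of a polynomial or spline model) can be illustrated with a minimal sketch. The aerodynamic function, network size, and training settings below are invented stand-ins, not the study's actual 6-DOF model or database.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for simulation output: a nonlinear "normal-force
# coefficient" as a function of angle of attack (rad) and Mach number.
def cn_true(alpha, mach):
    return 2.0 * np.sin(2 * alpha) + 0.3 * alpha * mach

alpha = rng.uniform(-0.4, 0.4, 500)
mach = rng.uniform(1.5, 3.0, 500)
y = cn_true(alpha, mach) + rng.normal(0, 0.02, 500)  # measurement noise

# Standardize inputs for better conditioning.
X = np.column_stack([alpha, mach])
X = (X - X.mean(axis=0)) / X.std(axis=0)

# One-hidden-layer tanh network trained with plain batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)
    err = (h @ W2 + b2).ravel() - y
    # Backpropagate the mean-squared-error gradient.
    dh = (err[:, None] @ W2.T) * (1 - h ** 2)
    W2 -= lr * h.T @ err[:, None] / len(y)
    b2 -= lr * err.mean()
    W1 -= lr * X.T @ dh / len(y)
    b1 -= lr * dh.mean(axis=0)

pred = (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(f"training RMSE: {rmse:.3f}")
```

The appeal over polynomial models is visible in the structure: adding capacity means widening the hidden layer rather than raising the polynomial order over every flight variable.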

Keywords: air to air missile, artificial neural networks, open loop simulation, parameter identification

Procedia PDF Downloads 252
258 Unlocking Justice: Exploring the Power and Challenges of DNA Analysis in the Criminal Justice System

Authors: Sandhra M. Pillai

Abstract:

This article examines the relevance, difficulties, and potential applications of DNA analysis in the criminal justice system. A potent tool for connecting suspects to crime scenes, clearing the innocent of wrongdoing, and resolving cold cases, DNA analysis has transformed forensic investigations. The scientific foundations of DNA analysis, including DNA extraction, sequencing, and statistical analysis, are covered in the article. To guarantee accurate and trustworthy findings, it also discusses the significance of quality assurance procedures, chain of custody, and DNA sample storage. DNA analysis has significantly advanced science, but it also raises substantial moral and legal issues. To safeguard individual rights and uphold public confidence, privacy concerns, possible discrimination, and abuse of DNA information must be properly addressed. The paper also emphasises the effects of the criminal justice system on people and communities while highlighting the necessity of equity, openness, and fair access to DNA testing. The essay describes the obstacles and future directions for DNA analysis. It looks at cutting-edge technology like next-generation sequencing, which promises to make DNA analysis quicker and more affordable. To ensure the appropriate and informed use of DNA evidence, it also emphasises the significance of multidisciplinary collaboration among scientists, law enforcement organisations, legal experts, and policymakers. In conclusion, DNA analysis has enormous potential for improving the course of criminal justice. We can exploit the potential of DNA technology while respecting the ideals of justice, fairness, and individual rights by navigating the ethical, legal, and societal issues and encouraging discussion and collaboration.

Keywords: DNA analysis, DNA evidence, reliability, validity, legal frame, admissibility, ethical considerations, impact, future direction, challenges

Procedia PDF Downloads 47
257 Barriers and Opportunities in Apprenticeship Training: How to Complete a Vocational Upper Secondary Qualification with Intermediate Finnish Language Skills

Authors: Inkeri Jaaskelainen

Abstract:

The aim of this study is to shed light on what it is like to study in apprenticeship training using intermediate (or even lower-level) Finnish. The aim is to find out and describe these students' experiences and feelings while acquiring a profession in Finnish, as it is important to understand how adult learners with an immigrant background learn and how their needs could be better taken into account. Many students choose apprenticeships and start vocational training while their Finnish language skills are still very weak. At work, students should be able to simultaneously learn Finnish and pursue vocational studies in a noisy, demanding, and stressful environment. Learning and understanding new things is very challenging under these circumstances, and sometimes students get exhausted and experience a lot of stress, which makes learning even more difficult. Students differ from each other, and so do their ways of learning. Both duties at work and school assignments require reasonably good general language skills, and, especially at work, language skills are also a safety issue. The empirical target of this study is a group of students with an immigrant background who studied in various fields with intensive L2 support in 2016–2018 and who by now have completed a vocational upper secondary qualification. The interview material for this narrative study was collected from those who completed apprenticeship training in 2019–2020. The data collection methods used are a structured thematic interview, a questionnaire, and observational data. The interviewees have varied cultural and educational backgrounds: some completed an academic degree in their country of origin, while others learned to read and write only in Finland. The analysis of the material utilizes thematic analysis, which is used to examine learning and related experiences. Learning a language at work is very different from traditional classroom teaching. With evolving language skills, at an intermediate level at best, rushing and stress make it even more difficult to understand and increase the fear of failure. Constant noise, rapidly changing situations, and uncertainty undermine the learning and well-being of apprentices. According to preliminary results, apprenticeship training is well suited to the needs of an adult immigrant student. In apprenticeship training, students need a lot of support for learning and for understanding a new communication and working culture. Stress can result in, e.g., fatigue, frustration, and difficulties in remembering and understanding. Apprenticeship training can be seen as a good path to working life. However, L2 support is a very important part of apprenticeship training, and it indeed helps students to believe that one day they will graduate and even get employed in their new country.

Keywords: apprenticeship training, vocational basic degree, Finnish learning, well-being

Procedia PDF Downloads 115
256 Trinary Affinity—Mathematic Verification and Application (1): Construction of Formulas for the Composite and Prime Numbers

Authors: Liang Ming Zhong, Yu Zhong, Wen Zhong, Fei Fei Yin

Abstract:

Trinary affinity is a description of existence: every object exists as it is known and spoken of, in a system of 2 differences (denoted dif₁, dif₂) and 1 similarity (Sim), equivalently expressed as dif₁ / Sim / dif₂ and kn / 0 / tkn (kn = the known, tkn = the 'to be known', 0 = the zero point of knowing). They are mathematically verified and illustrated in this paper by the arrangement of all integers onto 3 columns, where each number exists as a difference in relation to another number as another difference, with the 2 difs arbitrated by a third number as the Sim, resulting in a trinary affinity or trinity of 3 numbers, of which one is the known, another the 'to be known', and the third the zero (0) from which both the kn and tkn are measured and specified. Consequently, any number is horizontally specified as 3n, '3n – 1', or '3n + 1', and vertically as 'Cn + c', so that any number occurs at the intersection of its X and Y axes and is represented by its X and Y coordinates, as any point on Earth’s surface is by its latitude and longitude. Technically, i) primes are viewed and treated as progenitors, and composites as descending from them, forming families of composites, each capable of being measured and specified from its own zero, called in this paper the realistic zero (denoted 0r, as contrasted to the mathematic zero, 0m), which corresponds to the constant c and whose nature separates the composite and prime numbers; and ii) any number is considered as having a magnitude as well as a position, so that a number is verified as a prime first by referring to its descriptive formula and then by making sure that no composite number can possibly occur at its position, by dividing it by factors provided by the composite number formulas. The paper consists of 3 parts: 1) a brief explanation of the trinary affinity of things, 2) the 8 formulas that represent ALL the primes, and 3) families of composite numbers, each represented by a formula. A composite number family is described as 3n + f₁‧f₂. Since there are infinitely many composite number families, to verify the primality of a large probable prime, we have to divide it by several or many an f₁ from a range of composite number formulas, a procedure that is as laborious as it is the surest way to verify a large number’s primality. (So, it is possible to substitute planned division for trial division.)
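The three-column arrangement described above can be sketched as follows: every integer falls into the 3n, '3n – 1', or '3n + 1' column, and every prime greater than 3 must sit in one of the latter two, since the 3n column consists of multiples of 3. The trial-division check below is a generic stand-in for the paper's 'planned division' by composite-family factors, which this sketch does not reproduce.

```python
def column(n):
    """Classify n into one of the three columns: '3n', '3n-1', or '3n+1'."""
    return {0: "3n", 1: "3n+1", 2: "3n-1"}[n % 3]

def is_prime(n):
    """Trial division up to sqrt(n); a stand-in for division by the
    factors supplied by the composite number formulas."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

# Primes greater than 3 only ever occupy the 3n-1 and 3n+1 columns.
for p in [5, 7, 11, 13, 10007]:
    print(p, column(p))
```

The paper's claimed advantage is that knowing which column (and which family formula) a candidate belongs to narrows the divisions that need to be attempted, compared with blind trial division.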

Keywords: trinary affinity, difference, similarity, realistic zero

Procedia PDF Downloads 186
255 Qualitative Analysis of Current Child Custody Evaluation Practices

Authors: Carolyn J. Ortega, Stephen E. Berger

Abstract:

The role of the custody evaluator is perhaps one of the most controversial and risky endeavors in clinical practice. Complaints regarding child-custody evaluations constitute the second most common reason for filings with licensing boards. Although the evaluator is expected to answer for the family-law court what is in the “best interest of the child,” there is a lack of clarity on how to establish this in any empirically validated manner. Hence, practitioners must contend with a nebulous framework in formulating their methodological procedures, which inherently places them at risk in an already litigious context. This study sought to qualitatively investigate patterns of practice among doctoral practitioners conducting child custody evaluations in Southern California. Ten psychologists were interviewed who devoted between 25 and 100% of their California private practice to custody work. All held Ph.D. degrees, with a range of eight to 36 years of experience in custody work. Semi-structured interviews were used to investigate assessment practices, adherence to guidelines, risk management, and qualities of evaluators. Forty-three Specific Themes were identified using Interpretive Phenomenological Analysis (IPA). Seven Higher Order Themes clustered around salient factors: use of Ethics, Law, and Guidelines; Parent Variables; Child Variables; Psychologist Variables; Testing; Literature; and Trends. Evaluators were aware of the ever-present reality of a licensure complaint and thus presented idiosyncratic descriptions of risk management considerations. Ambiguity about quantifying and validly tapping parenting abilities was also reviewed. Findings from this study suggested a high reliance on unstructured and observational methods in child custody practices.

Keywords: forensic psychology, psychological testing, assessment methodology, child custody

Procedia PDF Downloads 264
254 Improving Electrical Safety through Enhanced Work Permits

Authors: Nuwan Karunarathna, Hemali Seneviratne

Abstract:

Distribution utilities inherently present electrical hazards to their workers as well as the general public, especially due to bare overhead lines spread out over a large geographical area. Therefore, certain procedures, such as de-energization, verification of de-energization, isolation, lock-out tag-out, and earthing, are carried out to ensure safe working conditions when conducting maintenance work on de-energized overhead lines. However, measures must be taken to coordinate the above procedures and to ensure their successful and accurate execution. Issuing of ‘Work Permits’ is such a measure, used by the Distribution Utility considered in this paper. Unfortunately, the Work Permit method adopted by the Distribution Utility concerned here has not been successful in creating the safe working conditions expected, as evidenced by four (4) fatalities of workers due to electrocution in the Distribution Utility from 2016 to 2018. Therefore, this paper attempts to identify deficiencies in the Work Permit method and related contributing factors through careful analysis of the four (4) fatalities and workplace practices, in order to rectify the shortcomings and prevent future incidents. The analysis shows that the present level of coordination between the ‘Authorized Person’ who issues the work permit and the ‘Competent Person’ who performs the actual work is grossly inadequate to achieve the intended safe working conditions. The paper identifies the need for active participation of a ‘Control Person’ who oversees the whole operation from a bird’s-eye perspective and recommends further measures, derived through the analysis of the fatalities, to address the identified lapses in the current work permit system.

Keywords: authorized person, competent person, control person, de-energization, distribution utility, isolation, lock-out tag-out, overhead lines, work permit

Procedia PDF Downloads 112
253 Space Weather and Earthquakes: A Case Study of Solar Flare X9.3 Class on September 6, 2017

Authors: Viktor Novikov, Yuri Ruzhin

Abstract:

The studies completed to date on the relation between the Earth's seismicity and solar processes provide fuzzy and contradictory results. To test the idea that solar flares can trigger earthquakes, we analyzed a powerful surge of solar flare activity in early September 2017, near the minimum of the 24th solar cycle, which was accompanied by significant disturbances of space weather. On September 6, 2017, a group of sunspots, AR2673, generated a large solar flare of class X9.3, the strongest flare of the past twelve years. Its explosion produced a coronal mass ejection partially directed towards the Earth. We carried out a statistical analysis of the USGS and EMSC earthquake catalogs to determine the effect of solar flares on global seismic activity. New evidence of earthquake triggering due to the Sun-Earth interaction has been demonstrated by a simple comparison of the behavior of the Earth's seismicity before and after the strong solar flare. The global number of earthquakes with magnitudes of 2.5 to 5.5 within 11 days after the solar flare increased by 30 to 100%. The possibility of electric/electromagnetic triggering of earthquakes due to space weather disturbances is supported by results of field and laboratory studies, where earthquakes (both natural and laboratory) were initiated by injection of electrical current into the Earth's crust. For the specific case of artificial electric earthquake triggering, the current density at the depth of earthquake sources is comparable with estimates of the density of telluric currents induced by variations of space weather conditions due to solar flares. Acknowledgment: The work was supported by RFBR grant No. 18-05-00255.
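The before/after counting described above can be sketched as follows. The catalog here is synthetic (random times and magnitudes with a deliberate excess of events planted after the flare), standing in for real USGS/EMSC catalog downloads; the flare timestamp matches the September 6, 2017 event, but the event counts are invented.

```python
import random
from datetime import datetime, timedelta

random.seed(1)
flare = datetime(2017, 9, 6, 12, 2)  # X9.3 flare, September 6, 2017 (UTC)

# Synthetic stand-in catalog of (time, magnitude) tuples, with more
# events deliberately placed after the flare for illustration.
catalog = (
    [(flare - timedelta(days=random.uniform(0, 11)), random.uniform(2.5, 5.5))
     for _ in range(50)]
    + [(flare + timedelta(days=random.uniform(0, 11)), random.uniform(2.5, 5.5))
       for _ in range(70)]
)

def count(events, start, end, mmin=2.5, mmax=5.5):
    """Count events in [start, end) with magnitude in [mmin, mmax]."""
    return sum(start <= t < end and mmin <= m <= mmax for t, m in events)

before = count(catalog, flare - timedelta(days=11), flare)
after = count(catalog, flare, flare + timedelta(days=11))
change = 100 * (after - before) / before
print(f"before: {before}, after: {after}, change: {change:+.0f}%")
```

A real analysis would repeat this window comparison across many flares and quiet periods to separate a triggering signal from the normal fluctuation of global seismicity.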

Keywords: solar flare, earthquake activity, earthquake triggering, solar-terrestrial relations

Procedia PDF Downloads 123
252 Biliteracy and Latinidad: Catholic Youth Group as a Site of Cosmopolitan Identity Building

Authors: Natasha Perez

Abstract:

This autobiographical narrative inquiry explores the relationship between religious practice, identity, language, and literacy in the author’s life experience as a second-generation Cuban-American growing up in the bilingual spaces of South Florida. The author describes how the social practices around language, including the flexibility to communicate in English and Spanish simultaneously, known as translanguaging, were instrumental to developing a biliterate cosmopolitan identity, along with a greater sense of Latinidad, through interactions with diverse Latinx church members. This narrative study involved cycles of writing, reading, and reflection within a three-dimensional narrative inquiry space in order to discover the ways in which language and literacy develop in the relationship between the personal and the social, across time and space, as historically situated phenomena. The findings show that Catholic faith practices have always been a source and expression of Cuban-ness, a means of sustaining Cuban identity, as well as a medium for bilingual language and literacy practice in the author’s life. Despite lacking formal literacy education in Spanish, she benefitted from the Catholic Church’s response to the surge of Spanish-speaking immigrants in South Florida in the 1980s and the subsequent flexibility of language practice in church-sponsored youth groups. The faith-sharing practices of the youth group created a space to use Spanish in more sophisticated ways, which served to build confidence as a bilingual speaker and expand bilingual competence. These experiences also helped the author develop a more salient identity as Cuban-American and a deeper connection to her Cuban-ness in relation to the Nicaraguan, Venezuelan, and first-generation Cuban identities of her peers. The youth group also fostered cosmopolitan identity building through interactions with pan-ethnic Spanish speakers, with Catholicism as a common language and culture that served as a uniting force. Interaction with these peers also fostered cosmopolitan understandings that deepened the author’s knowledge of the geographical boundaries, political realities, and socio-historical differences between these groups of immigrants. This narrative study opens a window onto the micro-processes and socio-cultural dynamics of language and identity development in the second generation, with the potential to deepen our understanding of the impact of religious practice on these processes.

Keywords: literacy, religion, identity, cosmopolitanism, culture, language, translanguaging

Procedia PDF Downloads 72
251 Analysis of Surface Hardness, Surface Roughness and Near-Surface Microstructure of AISI 4140 Steel Worked with Turn-Assisted Deep Cold Rolling Process

Authors: P. R. Prabhu, S. M. Kulkarni, S. S. Sharma, K. Jagannath, Achutha Kini U.

Abstract:

In the present study, response surface methodology has been used to optimize the turn-assisted deep cold rolling process of AISI 4140 steel. A regression model is developed to predict surface hardness and surface roughness using response surface methodology and a central composite design. In developing the predictive model, deep cold rolling force, ball diameter, initial roughness of the workpiece, and number of tool passes are considered as model variables. The rolling force and ball diameter are the significant factors for surface hardness, while ball diameter and number of tool passes are found to be significant for surface roughness. The predicted surface hardness and surface roughness values, together with the subsequent verification experiments under the optimal operating conditions, confirmed the validity of the predictive model. The absolute average error between the experimental and predicted values at the optimal combination of parameter settings is 0.16% for surface hardness and 1.58% for surface roughness. Using the optimal processing parameters, the hardness improved from 225 to 306 HV, an increase in near-surface hardness of about 36%, and the surface roughness improved from 4.84 µm to 0.252 µm, a decrease of about 95%. The depth of compression is found to be more than 300 µm from the microstructure analysis, which is in correlation with the results obtained from the microhardness measurements. A Taylor Hobson Talysurf tester, a micro-Vickers hardness tester, optical microscopy, and an X-ray diffractometer were used to characterize the modified surface layer.
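Fitting a second-order response surface of the kind used here can be sketched with ordinary least squares. The quadratic "hardness" function, the two factors, and their ranges below are invented for illustration; they are not the paper's fitted model, which also includes initial roughness and number of passes.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in response: surface hardness (HV) as a quadratic
# function of rolling force and ball diameter, plus experimental scatter.
def hardness(force, ball):
    return 225 + 8.0 * force + 3.0 * ball - 0.12 * force**2 - 0.5 * force * ball

force = rng.uniform(0.5, 3.0, 30)   # rolling force (illustrative units)
ball = rng.uniform(4.0, 12.0, 30)   # ball diameter (mm)
y = hardness(force, ball) + rng.normal(0, 1.0, 30)

# Second-order response surface model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(force), force, ball,
                     force**2, ball**2, force * ball])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ coef
r2 = float(1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2))
print(f"fitted coefficients: {np.round(coef, 2)}")
print(f"R^2 = {r2:.3f}")
```

Once fitted, the same quadratic surface can be maximized (or minimized, for roughness) over the factor ranges to pick the optimal operating conditions, which is the optimization step the abstract describes.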

Keywords: hardness, response surface methodology, microstructure, central composite design, deep cold rolling, surface roughness

Procedia PDF Downloads 391
250 Advanced Biosensor Characterization of Phage-Mediated Lysis in Real-Time and under Native Conditions

Authors: Radka Obořilová, Hana Šimečková, Matěj Pastucha, Jan Přibyl, Petr Skládal, Ivana Mašlaňová, Zdeněk Farka

Abstract:

Due to the spread of antimicrobial resistance, alternative approaches to combat superinfections are being sought, both in the field of lysing agents and in methods for studying bacterial lysis. A suitable alternative to antibiotics is phage therapy and enzybiotics, for which it is also necessary to study the mechanism of action. Biosensor-based techniques allow rapid detection of pathogens in real time, verification of sensitivity to commonly used antimicrobial agents, and selection of suitable lysis agents. The detection of lysis takes place on the surface of the biosensor with immobilized bacteria, which has the potential to be used to study biofilms. An example of such a biosensor is surface plasmon resonance (SPR), which records the kinetics of bacterial lysis based on a change in the resonance angle. The bacteria are immobilized on the surface of the SPR chip, and the action of the phage is monitored as mass loss after a typical lytic cycle delay. Atomic force microscopy (AFM) is a technique for imaging sample surfaces. In contrast to electron microscopy, it has the advantage of real-time imaging under the native conditions of the nutrient medium. In our case, Staphylococcus aureus was lysed using the enzyme lysostaphin and phage P68 from the family Podoviridae at 37 °C. In addition to visualization, AFM was used to study changes in mechanical properties during lysis, which resulted in a reduction of Young’s modulus (E) after disruption of the bacterial wall. Changes in E reflect the stiffness of the bacterium. These advanced methods provide deeper insight into bacterial lysis and can help in the fight against bacterial diseases.

Keywords: biosensors, atomic force microscopy, surface plasmon resonance, bacterial lysis, Staphylococcus aureus, phage P68

Procedia PDF Downloads 118
249 Identification of the Putative Interactome of Escherichia coli Glutaredoxin 2 by Affinity Chromatography

Authors: Eleni Poulou-Sidiropoulou, Charalampos N. Bompas, Martina Samiotaki, Alexios Vlamis-Gardikas

Abstract:

The glutaredoxin (Grx) and thioredoxin (Trx) systems keep the intracellular environment reduced in almost all organisms. In Escherichia coli (E. coli), the Grx system relies on NADPH to reduce glutathione reductase (GR), the latter reducing oxidized glutathione (glutathione disulfide) to glutathione (GSH), which in turn reduces the cytosolic Grxs, the electron donors for different intracellular substrates. In the Trx system, GR and GSH are replaced by Trx reductase (TrxR). Three of the Grxs of E. coli (Grx1, 2, 3) are reduced by GSH, while Grx4 is likely reduced by TrxR. Trx1 and Grx1 from E. coli may reduce ribonucleotide reductase Ia to ensure a constant supply of deoxyribonucleotides for the synthesis of DNA. The role of the other three Grxs is relatively unknown, especially for Grx2, which may amount to up to 1% of total cellular protein in the stationary phase of growth. The protein is known as a potent antioxidant, but no specific functions have been attributed to it. Herein, affinity chromatography of cellular extracts on immobilized Grx2, followed by MS analysis of the resulting eluates, was employed to identify protein ligands that could provide insights into the biological role of Grx2. Ionic, strong non-covalent, and covalent (disulfide) interactions with relevant proteins were detected. As a means of verification, the identified ligands were subjected to in silico docking with monothiol Grx2. In other experiments, protein extracts from E. coli cells lacking the gene for Grx2 (grxB) were compared to those of the wild type. Taken together, the two approaches suggest that Grx2 is involved in protein synthesis, nucleotide metabolism, DNA damage repair, stress responses, and various metabolic processes. Grx2 appears to be a versatile protein that may participate in a wide range of biological pathways beyond its known general antioxidant function.

Keywords: Escherichia coli, glutaredoxin 2, interactome, thiol-disulfide oxidoreductase

Procedia PDF Downloads 33
248 Sudden Death in Young Patients: A Study of 312 Autopsy Cases

Authors: N. Haj Salem, M. Belhadj, S. Ben Jomâa, S. Saadi, R. Dhouieb, A. Chadly

Abstract:

Introduction: Sudden death in the young is a dramatic phenomenon whose impact must be understood and whose causes must be determined. Aim: We aim to study the epidemiological characteristics of sudden death in the young and to discuss its mechanisms and the importance of autopsy in these situations. Material and methods: We performed a retrospective cohort study using autopsy data from the department of forensic medicine at the Fattouma Bourguiba University Hospital, Monastir, Tunisia. All autopsies performed over 23 years were reviewed. In each case, clinical information and the circumstances of death were obtained. We included all sudden deaths of persons aged between 1 and 35 years for males and between 1 and 45 years for females. We collected 312 cases of sudden death during the study period. The data were processed using SPSS 20, with the significance level set at 0.05. Results: Thirty-two cases of ischemic cardiac sudden death were collected. Myocardial infarction was the second cause of sudden death in young patients. There was a male predominance. The most affected subjects were aged between 25 and 45 years. Death occurred most frequently at rest. Coronary artery disease was found in twenty-four cases (75%). Severe coronary artery disease was observed in two children with a medical history of familial hypercholesterolemia. Myocardial infarction occurred in healthy coronary arteries in eight cases. An anomalous course of the coronary arteries, in particular myocardial bridging, was found in eight cases (25%). Toxicological screening was negative in all cases. The second cause of death was hypertrophic cardiomyopathy. Neurological and respiratory causes of death were implicated in 10% and 15% of cases, respectively.
Conclusion: Identifying the epidemiological characteristics of sudden death in this population is important for guiding prevention, which must be based on dietary and hygienic measures and the control of cardiovascular risk factors.

Keywords: autopsy, cardiac death, sudden death, young

Procedia PDF Downloads 219
247 Foreign Language Faculty Mentorship in Vietnam: An Interpretive Qualitative Study

Authors: Hung Tran

Abstract:

This interpretive qualitative study employed three theoretical lenses: Bronfenbrenner’s (1979) Ecological System of Human Development, Vygotsky’s (1978) Sociocultural Theory of Development, and Knowles’s (1970) Adult Learning Theory as the theoretical framework, in connection with the constructivist research paradigm, to investigate positive and negative aspects of the extant English as a Foreign Language (EFL) faculty mentoring programs at four higher education institutions (HEIs) in the Mekong River Delta (MRD) of Vietnam. Four apprentice faculty members (mentees), four experienced faculty members (mentors), and two associate deans (administrators) from these HEIs participated in two tape-recorded individual interviews in Vietnamese. The twenty interviews were transcribed verbatim and translated into English with verification. Initial analysis of the data reveals that the mentoring program, which is mandated by Vietnam’s Ministry of Education and Training, has been implemented differently at these HEIs due to a lack of officially documented mentoring guidance. Other general themes emerging from the data include the essentials of the mentoring program, approaches to the mentoring practice, the mentee–mentor relationship, and lifelong learning beyond the mentoring program. Practically, this study offers stakeholders in the mentoring cycle a description of the benefits and best practices of tertiary EFL mentorship, along with a suggested mentoring program that is metaphorically depicted as “a lifebuoy” for its current and potential administrators and mentors to help their mentees survive the first years of teaching. Theoretically, this study contributes to the growing knowledge of post-secondary mentorship by enriching the modest literature on Asian tertiary EFL mentorship.

Keywords: faculty mentorship, mentees, mentors, administrator, the MRD, Vietnam

Procedia PDF Downloads 107
246 Establishment of a Test Bed for Integrated Map of Underground Space and Verification of GPR Exploration Equipment

Authors: Jisong Ryu, Woosik Lee, Yonggu Jang

Abstract:

This paper discusses the process of establishing a reliable test bed for verifying the usability of Ground Penetrating Radar (GPR) exploration equipment based on an integrated underground spatial map in Korea. The aim of this study is to construct a test bed consisting of metal and non-metal pipelines to verify the performance of GPR equipment and to improve the accuracy of the underground spatial integrated map. The study involved the design and construction of a test bed for metal and non-metal pipe detection tests. The test bed was built at the SOC Demonstration Research Center (Yeoncheon) of the Korea Institute of Civil Engineering and Building Technology, with metal and non-metal pipelines buried at depths of up to 5 m. The test bed was designed to accommodate both vehicle-mounted and cart-mounted GPR equipment. Data were collected by constructing the test bed and conducting metal and non-metal pipe detection tests. The reliability of the GPR detection results was analyzed by comparing them with basic drawings such as the underground space integrated map. The study contributes to improving the performance evaluation of GPR equipment and the accuracy of the underground spatial integrated map, which is essential for urban planning and construction. It addressed the question of how to verify the usability of GPR exploration equipment based on an integrated underground spatial map and how to improve its performance. The study found that the test bed is reliable for verifying the performance of GPR exploration equipment and for accurately detecting metal and non-metal pipelines using an integrated underground spatial map. The study concludes that establishing a test bed for verifying the usability of GPR exploration equipment based on an integrated underground spatial map is essential.
The proposed Korean-style test bed can be used to evaluate GPR equipment performance and to support the construction of a national non-metal pipeline exploration equipment performance evaluation center in Korea.
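As background to the detection tests described above, a GPR locates a buried pipe from the two-way travel time of the reflected pulse and the soil's relative permittivity. A minimal sketch (the permittivity and travel-time values below are illustrative, not measurements from the study):

```python
import math

def gpr_depth_m(two_way_time_ns: float, rel_permittivity: float) -> float:
    """Estimate burial depth from GPR two-way travel time.

    The wave speed in the ground is v = c / sqrt(eps_r); the pulse travels
    down and back, so depth = v * t / 2.
    """
    c = 0.2998  # speed of light in m/ns
    v = c / math.sqrt(rel_permittivity)
    return v * two_way_time_ns / 2.0

# e.g. a reflection at 80 ns in moist soil (eps_r ~ 16, v ~ 0.075 m/ns)
print(round(gpr_depth_m(80.0, 16.0), 2))  # ~3.0 m, near the 5 m test limit
```

Depths near the test bed's 5 m limit correspond to long travel times and strong attenuation, which is part of what makes deep non-metal pipes hard to detect.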

Keywords: Korea-style GPR testbed, GPR, metal pipe detecting, non-metal pipe detecting

Procedia PDF Downloads 78
245 Applications of Forensics/DNA Tools in Combating Gender-Based Violence: A Case Study in Nigeria

Authors: Edeaghe Ehikhamenor, Jennifer Nnamdi

Abstract:

Introduction: Gender-based violence (GBV) was a well-known global crisis even before the COVID-19 pandemic; the pandemic only intensified it. With lockdowns, increased poverty due to high unemployment (especially among women), and mobility restrictions that left many women trapped with their abusers and isolated from social contact and support networks, GBV cases spiraled out of control. The prevalence of economic and cultural disparity, which is strongly manifested in Nigeria, is a major contributory factor to GBV. This is made worse by religious practices in which women are virtually relegated to the background. Our societal approaches to investigation and to sanctioning culprits have not sufficiently applied forensic/DNA tools in combating these major vices. Violence against women, or in rare cases against men, can prevent them from carrying out their duties regardless of the positions they hold. Objective: The main objective of this research is to highlight the origins of GBV, its victims, its types, its contributing factors, and the applications of forensic/DNA tools and remedies, so as to minimize GBV in our society. Methods: Descriptive information was obtained through searches of daily newspapers, electronic media, and Google Scholar, together with other authors' observations, personal experiences, and anecdotal reports. Results: Findings from our exploratory searches revealed a high incidence of GBV with very limited or no application of forensic/DNA tools as an intervening mechanism to reduce GBV in Nigeria. Conclusion: Nigeria needs to develop clear-cut policies on forensic/DNA tools, including an institutional framework and a curriculum for training all stakeholders, to fast-track justice for victims of GBV and to deter other culprits.

Keywords: gender-based violence, forensics, DNA, justice

Procedia PDF Downloads 60
244 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models

Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin

Abstract:

Coordinating service interactions is a vital part of developing distributed applications that are built up as networks of autonomous participants (e.g., software components, web services, online resources) and involve collaboration between a diverse set of participant services on different providers. The complexity of coordinating service interactions reflects how important the right techniques and approaches are for designing and coordinating the interactions between participant services, so that the overall goal of their collaboration is achieved. The objective of this research is to develop the capability of steering a complex service interaction towards a desired outcome. To that end, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using a service choreography approach and focuses on a declarative approach, advocating an Object Management Group (OMG) standard, the Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which are particularly useful for coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model, which is verified using the Alloy Analyzer. The transformation from SBVR into Alloy makes it possible to automatically generate the corresponding coordination of service interactions (the service choreography), producing an immediate execution instance that satisfies the constraints of the specification, and to verify whether a specific request can be realised in the generated choreography.
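The obligation/prohibition semantics at the heart of such deontic rules can be illustrated with a toy trace checker. The event names, rule sets, and the `check_trace` helper below are hypothetical illustrations, not the paper's SBVR vocabulary or Alloy encoding:

```python
# Hypothetical deontic-rule check: a trace of service interactions is
# conformant only if every obligated message occurs and no prohibited
# message occurs anywhere in the trace.
def check_trace(trace, obligations, prohibitions):
    seen = set(trace)
    missing = obligations - seen      # obligations left unfulfilled
    violated = prohibitions & seen    # prohibitions that were breached
    return not missing and not violated, missing, violated

ok, missing, violated = check_trace(
    trace=["request_quote", "send_quote", "confirm_order"],
    obligations={"send_quote", "confirm_order"},
    prohibitions={"ship_before_payment"},
)
print(ok)  # True: all obligations met, no prohibition breached
```

An Alloy model generalises this idea: instead of checking one concrete trace, the analyzer searches for *any* trace (choreography instance) satisfying the constraints, or proves none exists.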

Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR

Procedia PDF Downloads 127
243 Corpus of Attitude in Academic Writing (CAAW): Corpus Compilation and Annotation

Authors: Hortènsia Curell, Ana Fernández-Montraveta

Abstract:

This paper presents the creation, development, and analysis of a corpus designed to study the presence of attitude markers and the author’s stance in research articles in two different areas of linguistics (theoretical linguistics and sociolinguistics). These two disciplines are expected to behave differently in this respect, given the disparity in their discursive conventions. Attitude markers in this work are understood as the linguistic elements (adjectives, nouns, and verbs) used to convey the writer's stance towards the content presented in the article, and they are crucial in understanding writer-reader interaction and the writer's position. These attitude markers are divided into three broad classes: assessment, significance, and emotion. In addition to them, we also consider first-person singular and plural pronouns and possessives, modal verbs, and passive constructions, which are other linguistic elements expressing the author’s stance. The corpus, the Corpus of Attitude in Academic Writing (CAAW), comprises 21 articles collected from six journals indexed in the JCR. These articles were originally written in English by a single native-speaker author from the UK or USA and were published between 2022 and 2023. The total number of words in the corpus is approximately 222,400, with 106,422 from theoretical linguistics journals (Lingua, Linguistic Inquiry, and Journal of Linguistics) and 116,022 from sociolinguistics journals (International Journal of the Sociology of Language, Language in Society, and Journal of Sociolinguistics). Together with the corpus, we present the tool created for its creation and storage, along with a tool for automatic annotation. The steps followed in the compilation of the corpus were as follows. First, the articles were selected according to the parameters explained above. Second, they were downloaded and converted to txt format.
Finally, examples, direct quotes, section titles, and references were eliminated, since they do not involve the author’s stance. The resulting texts were the input for the annotation of the linguistic features related to stance. As for the annotation, two articles (one from each subdiscipline) were annotated manually by the two researchers. An existing list was used as a baseline, and other attitude markers were identified, together with the other elements mentioned above. Once a consensus was reached, the remaining articles were annotated automatically using the tool created for this purpose. The annotated corpus will serve as a resource for scholars working in discourse analysis (both in linguistics and in communication) and related fields, since it offers new insights into the expression of attitude. The tools created for the compilation and annotation of the corpus will be useful for studying the author’s attitude and stance in articles from any academic discipline: new data can be uploaded, and the list of markers can be enlarged. Finally, the tool can be expanded to other languages, which will allow cross-linguistic studies of the author’s stance.
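The automatic annotation step can be sketched as a simple lookup against a marker list. The `MARKERS` inventory and `annotate` helper below are toy stand-ins for the researchers' curated list and tool, included only to show the mechanism:

```python
import re

# Illustrative only: a tiny marker list standing in for the baseline
# inventory of assessment / significance / emotion markers.
MARKERS = {
    "assessment": {"important", "valid", "robust"},
    "significance": {"crucial", "notably", "significant"},
    "emotion": {"surprisingly", "unfortunately", "remarkable"},
}

def annotate(text):
    """Return (marker, class) pairs found in the text, case-insensitively."""
    tokens = re.findall(r"[a-z]+", text.lower())
    hits = []
    for cls, words in MARKERS.items():
        hits += [(w, cls) for w in tokens if w in words]
    return hits

print(annotate("Surprisingly, this distinction proves crucial and robust."))
```

A real tool would also handle multi-word markers and disambiguate senses, which is why the two manually annotated articles serve as the baseline before automation.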

Keywords: academic writing, attitude, corpus, English

Procedia PDF Downloads 45
242 A Novel Methodology for Browser Forensics to Retrieve Searched Keywords from Windows 10 Physical Memory Dump

Authors: Dija Sulekha

Abstract:

Nowadays, a good percentage of reported cybercrimes involve the use of the Internet, directly or indirectly, in committing the crime. Usually, web browsers leave traces of browsing activity on the host computer’s hard disk, which investigators can use to identify the suspect's internet-based activities. But criminals involved in organized crime disable browser file generation to hide the evidence while carrying out illegal activities through the Internet. In such cases, even though browser files are not generated in the system's storage media, traces of recent and ongoing activity are generated in the system's physical memory. As a result, analysis of a physical memory dump collected from the suspect's machine retrieves a great deal of forensically crucial information related to the suspect's browsing history. This information enables cyber forensic investigators to concentrate on a few highly relevant artefacts during the offline forensic analysis of storage media. This paper addresses the reconstruction of web browsing activities by conducting live forensics to identify searched terms, downloaded files, visited sites, email headers, email ids, etc., from physical memory dumps collected from Windows 10 systems. Well-known entry points are available for retrieving all of the above artefacts except searched terms. The paper describes a novel methodology to retrieve searched terms from Windows 10 physical memory. The searched terms retrieved in this way can be used for advanced file and keyword searches in the storage media files reconstructed from file system recovery in offline forensics.
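The general idea of carving searched terms from a raw dump can be sketched as follows. The `q=` marker and the `carve_after_marker` helper are hypothetical illustrations (a URL query parameter is one plausible signature), not the actual entry-point signatures identified in the paper:

```python
# Minimal sketch: carve printable strings that follow a known byte marker
# in a raw memory dump. The marker here is hypothetical; real memory
# forensics would use signatures validated against the target browser.
MARKER = b"q="  # e.g. the URL query parameter that precedes a search term

def carve_after_marker(dump: bytes, marker: bytes, max_len: int = 64):
    terms, start = [], 0
    while (i := dump.find(marker, start)) != -1:
        j = i + len(marker)
        run = bytearray()
        # collect printable ASCII until a delimiter or non-printable byte
        while j < len(dump) and 32 <= dump[j] < 127 and dump[j] != ord("&"):
            run.append(dump[j])
            j += 1
        if run:
            terms.append(run.decode("ascii")[:max_len])
        start = i + 1
    return terms

dump = b"\x00junk q=forensic+tools&lang=en\x00more q=memory+dump\xff"
print(carve_after_marker(dump, MARKER))  # ['forensic+tools', 'memory+dump']
```

In practice the dump is gigabytes in size and terms may be UTF-16-encoded, so a real implementation streams the file and tries several encodings per hit.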

Keywords: browser forensics, digital forensics, live forensics, physical memory forensics

Procedia PDF Downloads 98
241 FE Modelling of Structural Effects of Alkali-Silica Reaction in Reinforced Concrete Beams

Authors: Mehdi Habibagahi, Shami Nejadi, Ata Aminfar

Abstract:

A significant degradation factor that impacts the durability of concrete structures is the alkali-silica reaction (ASR). Engineers are frequently charged with the challenge of conducting a thorough safety assessment of concrete structures that have been affected by ASR. The alkali-silica reaction has a major influence on the structural capacity of structures. In most cases, the reduction in compressive strength, tensile strength, and modulus of elasticity is expressed as a function of free expansion and crack widths. Predicting the effect of ASR on flexural strength is also relevant. In this paper, a nonlinear three-dimensional (3D) finite-element model is proposed to describe the flexural strength degradation induced by ASR. Initial strains, initial stresses, initial cracks, and the deterioration of material characteristics were all considered as ASR factors in this model. The effects of ASR on structural performance were evaluated by focusing on initial flexural stiffness, the force-deformation curve, and load-carrying capacity. Degradation of the concrete's mechanical properties was correlated with ASR growth using material test data obtained at Tech Lab, UTS, and implemented in the FEM for various expansions. The finite element study provided a better understanding of the ASR-affected RC beam's failure mechanism and its capacity reduction as a function of ASR expansion. Furthermore, this study reviews the decrease in residual mechanical properties due to ASR, which is used as input data for the FEM model. Finally, the analysis techniques and a comparison of the analysis and experimental results are discussed. Verification is also provided through analyses of reinforced concrete beams whose behavior is governed by either flexural or shear mechanisms.

Keywords: alkali-silica reaction, analysis, assessment, finite element, nonlinear analysis, reinforced concrete

Procedia PDF Downloads 148
240 Uniqueness of Fingerprint Biometrics to Human Dynasty: A Review

Authors: Siddharatha Sharma

Abstract:

With the advent of technology and machines, biometrics is taking an important place in society for secure living. Security issues are a major concern in today’s world and continue to grow in intensity and complexity. Biometrics-based recognition, which involves precise measurement of the characteristics of living beings, is not a new method. Fingerprints have been used for many years by law enforcement and forensic agencies to identify culprits and apprehend them. Biometrics is based on four basic principles: (i) uniqueness, (ii) accuracy, (iii) permanency, and (iv) peculiarity. In today’s world, fingerprints are the most popular and unique biometric, claiming a social benefit in government-sponsored programs. A remarkable example is the UIDAI (Unique Identification Authority of India) in India. In fingerprint biometrics, the matching accuracy is very high. It has been observed empirically that even identical twins do not have similar prints. With the passage of time there has been immense progress in sensing techniques, computational speed, operating environments, and storage capabilities, and the technology has become more user-convenient. Only a small fraction of the population may be unsuitable for automatic identification because of genetic factors, aging, or environmental or occupational reasons, for example, workers whose cuts and bruises keep their fingerprints changing. Fingerprints are limited to human beings because of the presence of volar skin with corrugated ridges, which is unique to this species. Fingerprint biometrics has proved to be a high-level authentication system for identifying human beings. It has limitations, however; for example, it may be inefficient and ineffective if the ridges of the fingers or palm are moist, making authentication difficult.
This paper focuses on the uniqueness of fingerprints to human beings in comparison to other living beings and reviews the advancement of emerging technologies and their limitations.

Keywords: fingerprinting, biometrics, human beings, authentication

Procedia PDF Downloads 301
239 Child Abuse: Emotional, Physical, Neglect, Sexual and the Psychological Effects: A Case Scenario in Lagos State

Authors: Aminu Ololade Matilda

Abstract:

Child abuse is a significant issue worldwide, affecting the socio-development and the mental and physical health of young individuals. It is the maltreatment of a child by an adult or by another child. This paper focuses on child abuse in communities in Lagos State. The aim of this study is to investigate the extent of child abuse and its impact on the mood, social activities, self-worth, concentration, and academic performance of children in communities in Lagos State. The primary research instrument used in this study was the forensic interview, which consisted of two sections. The first section gathered data on the details of the child and the forms and impacts of abuse experienced, while the second section focused on parenting style. The study found that children who experienced various forms of abuse, such as emotional abuse, neglect, physical abuse, or sexual abuse, were hesitant to report it for fear of threats or even death at the hands of the abuser. These abused children displayed withdrawn behaviour, depression, and low self-worth, and they underperformed academically compared to peers who did not experience abuse. The findings align with social-learning and intergenerational-transmission-of-violence theories, which suggest that parents and caregivers who engage in child abuse often do so because they themselves experienced or witnessed abuse as children, thereby normalizing violence. The study highlights the prevalent issue of child abuse in Lagos State and emphasizes the need for advocacy programs and capacity building to raise awareness about child abuse and its prevention. The distribution of the Child’s Rights Act in various sectors is also recommended to underscore the importance of protecting children's rights. Additionally, the inclusion of courses on child abuse in the school curriculum is proposed to ensure children are educated on recognizing and reporting abuse.

Keywords: abuse, child, awareness, effects, emotional, neglect, physical, psychological, sexual, recognize, reporting, right

Procedia PDF Downloads 58
238 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOA was first introduced in 2014, when it performed better than the best known classical algorithm for Max-Cut on certain graphs. Whilst classical algorithms have since improved and returned to being faster and more efficient, this was a huge milestone for quantum computing, and the original work is often used as a benchmark and as a foundation for exploring QAOA variants. This, alongside other famous algorithms like Grover’s or Shor’s, shows the world the potential that quantum computing holds. It also points to the possibility of a real quantum advantage: if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they introduce into solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of problems that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs do not rely on gradient-based or linear optimization methods to search the latent space, and because of this freedom from gradients, they should suffer less from barren plateaus. Secondly, given that the algorithm searches the solution space through a population of solutions, it can be parallelized to speed up the search and optimization.
The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOA with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, a linear-approximation-based method, and in some instances it can even find a better Max-Cut. Whilst the final objective of the work is to create an algorithm that consistently beats the original QAOA or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
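The gradient-free optimization loop can be sketched as a minimal (mu + lambda) evolution strategy over the two QAOA angles. The `cost` function below is a smooth classical surrogate with a known optimum standing in for the measured Max-Cut expectation, and `evolve` and its parameters are illustrative, not the authors' implementation:

```python
import math
import random

random.seed(0)

# Surrogate cost: in the real hybrid, cost(angles) would be the QAOA
# expectation value of the Max-Cut Hamiltonian measured on a circuit;
# this smooth stand-in (minimum at gamma=0.8, beta=0.3) runs classically.
def cost(angles):
    g, b = angles
    return math.sin(g - 0.8) ** 2 + math.sin(b - 0.3) ** 2

def evolve(pop_size=20, gens=60, sigma=0.2):
    # (mu + lambda)-style strategy: mutate parents with Gaussian noise,
    # pool parents and children, keep the best pop_size individuals.
    pop = [[random.uniform(0, math.pi), random.uniform(0, math.pi)]
           for _ in range(pop_size)]
    for _ in range(gens):
        children = [[x + random.gauss(0, sigma) for x in random.choice(pop)]
                    for _ in range(pop_size)]
        pop = sorted(pop + children, key=cost)[:pop_size]
    return pop[0]

best = evolve()
print(cost(best))  # should land well below the initial random costs
```

Because each child's cost evaluation is independent, the inner loop is exactly the part that the parallelization work aims to distribute.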

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 39
237 Colored Image Classification Using Quantum Convolutional Neural Networks Approach

Authors: Farina Riaz, Shahab Abdulla, Srinjoy Ganguly, Hajime Suzuki, Ravinesh C. Deo, Susan Hopkins

Abstract:

Recently, quantum machine learning has received significant attention. Numerous quantum machine learning (QML) models have been created and are being tested on various types of data, including text and images. Images are exceedingly complex data components that demand more processing power. Despite being mature, classical machine learning still has difficulties with big-data applications. Furthermore, quantum technology has revolutionized how machine learning is thought of by employing quantum features to address optimization issues. Since quantum hardware is currently extremely noisy, it is not practicable to run machine learning algorithms on it without risking inaccurate results. To discover the advantages of quantum versus classical approaches, this research concentrates on colored image data. Deep learning classification models are currently being created on quantum platforms, but they are still at a very early stage. Black-and-white benchmark image datasets like MNIST and Fashion-MNIST have been used in recent research. MNIST and CIFAR-10 were compared for binary classification, but the comparison showed that MNIST was classified more accurately than the colored CIFAR-10. This research evaluates the performance of a QML algorithm on the colored benchmark dataset CIFAR-10 to advance QML's real-time applicability. However, deep learning classification models such as the Quantum Convolutional Neural Network (QCNN) have not previously been developed to compare colored images and determine how much better they are than classical models; only a few models, such as quantum variational circuits, take colored images. The methodology adopted in this research is a hybrid approach using PennyLane as a simulator. To process the 10 classes of CIFAR-10, the image data were translated into greyscale, and 28 × 28-pixel images comprising 10,000 test and 50,000 training images were used.
The objective of this work is to determine how much the quantum approach can outperform a classical approach on a comprehensive dataset of color images. After pre-processing the 50,000 images on a classical computer, the QCNN model adopted a hybrid method and encoded the images into a quantum simulator for feature extraction using quantum gate rotations. The measurements were carried out on the classical computer after the rotations were applied. According to the results, the QCNN approach is ~12% more effective than traditional classical CNN approaches, and it is possible that applying data augmentation may increase the accuracy. This study has demonstrated that quantum machine and deep learning models can be relatively superior to classical machine learning approaches in terms of processing speed and accuracy when used for classification on colored classes.
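The pre-processing described above (greyscale conversion, then mapping pixel intensities to rotation-gate angles) can be sketched as follows. The 2×2 image, the BT.601 luma weights, and the `to_angle` mapping are illustrative assumptions, not the paper's exact pipeline:

```python
import math

# Sketch of the pre-processing: collapse an RGB pixel to greyscale, then
# map intensity [0, 255] to a rotation angle [0, pi] that would be fed
# to a qubit rotation gate (e.g. RY) in the simulator.
def to_grey(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b  # ITU-R BT.601 luma weights

def to_angle(intensity):
    return intensity / 255.0 * math.pi  # angle for a rotation gate

image_rgb = [[(255, 0, 0), (0, 255, 0)],
             [(0, 0, 255), (255, 255, 255)]]
angles = [[to_angle(to_grey(*px)) for px in row] for row in image_rgb]
print([[round(a, 3) for a in row] for row in angles])
# [[0.939, 1.844], [0.358, 3.142]]
```

Collapsing to greyscale keeps the qubit count manageable on a simulator; the trade-off is that the color information distinguishing CIFAR-10 classes is discarded before encoding.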

Keywords: CIFAR-10, quantum convolutional neural networks, quantum deep learning, quantum machine learning

Procedia PDF Downloads 100
236 The Influence of Market Attractiveness and Core Competence on Value Creation Strategy and Competitive Advantage and Its Implication on Business Performance

Authors: Firsan Nova

Abstract:

The average Indonesian watches 5.5 hours of TV a day. With a population of 242 million people and a Free-to-Air (FTA) TV penetration rate of 56%, that equates to 745 million hours of television watched each day. With such potential, it is no wonder that many companies are now attempting to enter the pay-TV market. Research firm Media Partners Asia has forecast in its study that the number of Indonesian pay-television subscribers will climb from 2.4 million in 2012 to 8.7 million by 2020, with penetration scaling up from 7 percent to 21 percent. Key drivers of market growth, the study says, include macro trends built around higher disposable income and a rising middle class, with leading players continuing to invest significantly in sales, distribution, and content. New entrants, in the meantime, will boost overall prospects. This study aims to examine and analyze the effect of market attractiveness and core competence on value creation and competitive advantage, and their impact on business performance, in the pay-TV industry in Indonesia. The study uses a strategic management science approach with the census method, in which all members of the population serve as the sample; a verificative method is used to examine the relationships between variables. The unit of analysis in this research is all Indonesian pay-TV business units, totaling 19 business units, and the units of observation are the directors and managers of each business unit. Hypothesis testing is performed using Partial Least Squares (PLS). The study concludes that market attractiveness affects business performance through value creation and competitive advantage. Appropriate value creation comes from the company's ability to optimize its core competence and exploit market attractiveness. Value creation affects competitive advantage.
Competitive advantage can be determined by the company's ability to create value for customers, and it in turn has an impact on business performance.

Keywords: market attractiveness, core competence, value creation, competitive advantage, business performance

Procedia PDF Downloads 329
235 Finite Element Modeling and Nonlinear Analysis for Seismic Assessment of Off-Diagonal Steel Braced RC Frame

Authors: Keyvan Ramin

Abstract:

The geometric nonlinearity of the Off-Diagonal Bracing System (ODBS) can complement and extend the material nonlinearity of reinforced concrete. In the initial phase, finite element models were built for a flexural frame, an x-braced frame, and an ODBS-braced frame, and the different models were then investigated through various analyses. The models were verified against the experimental results for the flexural and x-braced frames. Analytical assessments were performed using three-dimensional finite element modeling. Nonlinear static (pushover) analysis was carried out to obtain the performance level and seismic behavior, and the response modification factors were then calculated from each model’s pushover curve. In the next phase, the cracks observed in the finite element models, especially in the RC members of all three systems, were evaluated; the finite element assessment examined the cracks generated in the ODBS-braced frame at various time steps. Nonlinear dynamic time-history analyses were performed on models with different numbers of stories for three earthquake accelerograms: El Centro, Naghan, and Tabas. After scaling the accelerograms, dynamic analysis was performed on the flexural frame, the x-braced frame, and the ODBS-braced frame, one by one. A base point on the RC frame was monitored to investigate the corresponding displacement under each record. Hysteresis curves were assessed over the course of the study, and the equivalent viscous damping for the ODBS system was estimated according to the references. The results in each section show that the ODBS system has acceptable seismic behavior, and the conclusions converge when the ODBS system is used in a reinforced concrete frame.
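The equivalent viscous damping mentioned above is conventionally estimated from a hysteresis loop as xi_eq = E_D / (4 pi E_S0), where E_D is the energy dissipated per cycle (the loop area) and E_S0 = 0.5 F_max u_max is the elastic strain energy at peak response. A minimal sketch with an illustrative loop (not data from this study):

```python
import math

# Standard estimate: xi_eq = E_D / (4 * pi * E_S0), with the loop area
# E_D computed by the shoelace formula over the (displacement, force)
# points of one closed hysteresis cycle.
def loop_area(points):
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1] for i in range(n))
    return abs(s) / 2.0

def equivalent_damping(points):
    u_max = max(abs(u) for u, _ in points)  # peak displacement
    f_max = max(abs(f) for _, f in points)  # peak force
    return loop_area(points) / (4 * math.pi * 0.5 * f_max * u_max)

# Illustrative parallelogram loop: (displacement in m, force in kN)
loop = [(0.02, 80), (-0.01, 40), (-0.02, -80), (0.01, -40)]
print(round(equivalent_damping(loop), 3))  # 0.318
```

A fatter loop (more dissipated energy per cycle) gives a higher xi_eq, which is how the hysteresis curves quantify the damping contribution of the bracing system.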

Keywords: FEM, seismic behaviour, pushover analysis, geometric nonlinearity, time history analysis, equivalent viscous damping, passive control, crack investigation, hysteresis curve

Procedia PDF Downloads 361
234 Angiogenic and Immunomodulatory Properties and Phenotype of Mesenchymal Stromal Cells Can Be Regulated by Cytokine Treatment

Authors: Ekaterina Zubkova, Irina Beloglazova, Iurii Stafeev, Konsyantin Dergilev, Yelena Parfyonova, Mikhail Menshikov

Abstract:

Mesenchymal stromal cells from adipose tissue (MSC) are currently widely used in regenerative medicine to restore the function of damaged tissues, but their heterogeneity significantly hampers this use. One modern approach to overcoming this obstacle is the polarization of cell subpopulations into a specific phenotype under the influence of cytokines and other factors that activate receptors and downstream signaling. We polarized MSC with factors affecting inflammatory signaling and the functional properties of the cells, followed by verification of their expression profile and their ability to affect the polarization of macrophages. RT-PCR evaluation showed that cells treated with LPS, interleukin-17, or tumor necrosis factor α (TNF-α) primarily express pro-inflammatory factors and cytokines, whereas after treatment with polyinosinic-polycytidylic acid or interleukin-4 (IL-4) they express anti-inflammatory factors along with some pro-inflammatory ones. MSC polarized with pro-inflammatory cytokines showed a more robust pro-angiogenic effect in a fibrin gel bead 3D angiogenesis assay. Further, we evaluated possible paracrine effects of MSC on the polarization of intact macrophages. Polarization efficiency was assessed by expression of the M1/M2 phenotype markers CD80 and CD206. We showed that conditioned media from MSC preincubated in the presence of IL-4 cause an increase in CD206 expression similar to that observed in M2 macrophages. Conditioned media from MSC polarized in the presence of LPS or TNF-α increased the expression of the CD80 antigen in macrophages, similar to that observed in M1 macrophages. In other cases, no pronounced paracrine effect of MSC on the polarization of macrophages was detected. Thus, our study showed that polarizing MSC along the pro-inflammatory or anti-inflammatory pathway yields cell subpopulations that have a multidirectional modulating effect on the polarization of macrophages. (RFBR grants 20-015-00405 and 18-015-00398.)

Keywords: angiogenesis, cytokines, mesenchymal, polarization, inflammation

Procedia PDF Downloads 150
233 A Novel Study Contrasting Traditional Autopsy with Post-Mortem Computed Tomography in Falls Leading to Death

Authors: Balaji Devanathan, Gokul G., Abilash S., Abhishek Yadav, Sudhir K. Gupta

Abstract:

Background: As an alternative to the traditional autopsy, a virtual autopsy is carried out using scanning and imaging technologies, mainly post-mortem computed tomography (PMCT). This facility aims to supplement traditional autopsy findings and to reduce or eliminate internal dissection in subsequent autopsies; for emotional and religious reasons, relatives of the deceased have historically disapproved of such internal dissection, and a non-invasive, objective, preservative PMCT examination is what family and friends would prefer over a traditional autopsy. The study also compares the two technologies, weighing the benefits and drawbacks of each and demonstrating the significance of contemporary imaging in the field of forensic medicine. Results: The authors analysed one hundred fatal falls. Before autopsy, each case underwent a PMCT examination using a 16-slice multi-slice spiral CT scanner, and MPR and VR reconstructions were carried out from the raw images using specialised software. Fractures of the skull, facial bones, clavicle, scapula, and vertebrae were detected more accurately than at routine autopsy, and the interpretation of pneumothorax, pneumoperitoneum, pneumocephalus, and hemosinus was much better with PMCT than with traditional autopsy. Conclusion: A PMCT-based virtual autopsy is useful for visualising skeletal damage in fall-from-height cases, making it a valuable tool in trauma deaths. When assessing trauma victims, PMCT should be viewed as a helpful adjunct to traditional autopsy, since it can identify additional fractures in body parts that are difficult to examine at autopsy, such as posterior regions, helping the pathologist reconstruct the sequence of events and determine the cause of death.

Keywords: PMCT, fall from height, autopsy, fracture

Procedia PDF Downloads 18