Search results for: wide dynamic range
1418 Peach as a Potential Functional Food: Biological Activity and Important Phenolic Compound Source
Authors: Luís R. Silva, Catarina Bento, Ana C. Gonçalves, Fábio Jesus, Branca M. Silva
Abstract:
Nowadays, the general population is increasingly concerned about nutrition and the health implications of an unbalanced diet. Current knowledge regarding the health benefits and antioxidant properties of certain foods, such as fruits and vegetables, has gained the interest of both the general public and the scientific community. Peach (Prunus persica (L.) Batsch) is one of the most consumed fruits worldwide, with low sugar content and a broad range of nutrients essential to the normal functioning of the body. Six different peach cultivars from the Fundão region in Portugal were evaluated regarding their phenolic composition, by LC-DAD, and their biological activity. The capacity of the prepared extracts to scavenge free radicals was tested against the stable free radical DPPH• and nitric oxide (•NO). Additionally, the antidiabetic potential and the protective effects against peroxyl radical (ROO•)-induced damage to erythrocytes were also tested. LC-DAD analysis allowed the identification of 17 phenolic compounds, among which 5-O-caffeoylquinic acid and 3-O-caffeoylquinic acid were the most abundant. Regarding the antioxidant activity, all cultivars displayed concentration-dependent free-radical scavenging activity against both the nitrogen species and DPPH•. With respect to α-glucosidase inhibitory activity, Royal Magister and Royal Glory presented the highest inhibitory activity (IC50 = 11.7 ± 1.4 and 17.1 ± 1.7 μg/mL, respectively); nevertheless, all six cultivars were more active than the acarbose control. As for the protective effect of the Royal Lu extract against the oxidative damage induced in erythrocytes by ROO•, the results were quite promising, with inhibition IC50 values of 110.0 ± 4.5 μg/mL for hemolysis and 83.8 ± 6.5 μg/mL for hemoglobin oxidation. The demonstrated activity is associated with the peaches' phenolic profile, rich in phenolic acids and flavonoids with high hydrogen-donating capacity.
These compounds are of great industrial interest for the manufacturing of natural products. The next step would naturally be their extraction and isolation from plant tissues and large-scale production through biotechnological techniques.
Keywords: antioxidants, functional food, phenolic compounds, peach
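IC50 values like those quoted above are typically read off a measured dose-response curve. A minimal sketch of that interpolation in Python, using entirely hypothetical inhibition data rather than the paper's raw measurements:

```python
import numpy as np

def ic50_interpolated(conc, inhibition_pct):
    """Estimate IC50 by linear interpolation of a dose-response curve.

    Assumes inhibition_pct increases monotonically with conc and
    crosses 50% inside the tested range; returns IC50 in the same
    units as conc (here ug/mL)."""
    return float(np.interp(50.0, inhibition_pct, conc))

# Hypothetical alpha-glucosidase inhibition data (illustrative only)
conc = np.array([2.5, 5.0, 10.0, 20.0, 40.0])   # extract concentration, ug/mL
inh = np.array([18.0, 31.0, 46.0, 62.0, 79.0])  # % enzyme inhibition
ic50 = ic50_interpolated(conc, inh)             # falls between 10 and 20 ug/mL
```

In practice a sigmoidal (four-parameter logistic) fit is preferred over linear interpolation, but the interpolated estimate is a common quick check.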
Procedia PDF Downloads 294
1417 Towards a Strategic Framework for State-Level Epistemological Functions
Authors: Mark Darius Juszczak
Abstract:
While epistemology, as a sub-field of philosophy, is generally concerned with theoretical questions about the nature of knowledge, the explosion in digital media technologies has resulted in an exponential increase in the storage and transmission of human information. That increase has produced a particular non-linear dynamic: digital epistemological functions are radically altering how and what we know. Neither the rate of that change nor its consequences have been well studied or taken into account in developing state-level strategies for epistemological functions. At the current time, US federal policy, like that of virtually all other countries, maintains, at the national level, clearly defined boundaries between various epistemological agencies, that is, agencies that, in one way or another, mediate the functional use of knowledge. These agencies take the form of patent and trademark offices, national library and archive systems, departments of education, agencies such as the FTC, university systems and regulations, military research systems such as DARPA, federal scientific research agencies, medical and pharmaceutical accreditation agencies, federal funding for scientific research, and the legislative committees and subcommittees that attempt to alter the laws governing epistemological functions. All of these agencies are in a constant process of creating, analyzing, and regulating knowledge. Those processes are, at the most general level, epistemological functions: they act upon and define what knowledge is. At the same time, however, there are no high-level strategic epistemological directives or frameworks that define those functions. The only time in US history when a proxy state-level epistemological strategy existed was between 1961 and 1969, when the Kennedy Administration committed the United States to the Apollo program.
While that program had a singular technical objective as its outcome, that objective was so technologically advanced for its day, and so complex, that it required a massive redirection of state-level epistemological functions: in essence, a broad and diverse set of state-level agencies suddenly found themselves working together towards a common epistemological goal. This paper does not call for a repeat of the Apollo program. Rather, its purpose is to investigate the minimum structural requirements for a national state-level epistemological strategy in the United States. In addition, this paper seeks to analyze how the epistemological work of the multitude of national agencies within the United States would be affected by such a high-level framework. This paper is an exploratory study of this type of framework. The primary hypothesis of the author is that such a function is possible but would require extensive re-framing and reclassification of traditional epistemological functions at the respective agency level. In much the same way that, for example, DHS (the Department of Homeland Security) evolved to respond to a new type of security threat to the United States, it is theorized that a lack of coordination and alignment in epistemological functions will likewise result in a strategic threat to the United States.
Keywords: strategic security, epistemological functions, epistemological agencies, Apollo program
Procedia PDF Downloads 77
1416 Liposome Loaded Polysaccharide Based Hydrogels: Promising Delayed Release Biomaterials
Authors: J. Desbrieres, M. Popa, C. Peptu, S. Bacaita
Abstract:
Because of their favorable properties (non-toxicity, biodegradability, mucoadhesivity, etc.), polysaccharides have been studied as biomaterials and as pharmaceutical excipients in drug formulations. These formulations may be produced in a wide variety of forms, including hydrogels, hydrogel-based particles (or capsules), films, etc. In these formulations, the polysaccharide-based materials are able to provide local delivery of the loaded therapeutic agents, but that delivery can be rapid and not easily time-controllable, due in particular to the burst effect. This leads to a loss in drug efficiency and lifetime. To overcome the consequences of the burst effect, systems involving liposomes incorporated into polysaccharide hydrogels appear as promising materials in tissue engineering, regenerative medicine, and drug loading systems. Liposomes are spherical, self-closed structures composed of curved lipid bilayers, which enclose part of the surrounding solvent in their interior. The simplicity of their production, their biocompatibility, their cell-like size and composition, the possibility of size adjustment for specific applications, and their ability to load hydrophilic or/and hydrophobic drugs make them a revolutionary tool in nanomedicine and the biomedical domain. Drug delivery systems were developed as hydrogels containing chitosan or carboxymethylcellulose (CMC) as polysaccharides and gelatin (GEL) as a polypeptide, with phosphatidylcholine or phosphatidylcholine/cholesterol liposomes able to accurately control this delivery, without any burst effect. Hydrogels based on CMC were covalently crosslinked using glutaraldehyde, whereas chitosan-based hydrogels were doubly crosslinked (ionically, using sodium tripolyphosphate or sodium sulphate, and covalently, using glutaraldehyde). It has been proven that liposome integrity is well protected during the crosslinking procedure for the formation of the film network. Calcein was used as a model active compound for the delivery experiments.
Multi-lamellar vesicles (MLV) and small unilamellar vesicles (SUV) were prepared and compared. The liposomes are well distributed throughout the whole area of the film, and the vesicle distribution is equivalent (for both types of liposomes evaluated) on the film surface as well as deeper (100 microns) in the film matrix. An obvious decrease of the burst effect was observed in the presence of liposomes, as well as a uniform increase of calcein release that continues even at long time scales. Liposomes act as an extra barrier to calcein release. Systems containing MLVs release higher amounts of calcein than systems containing SUVs, although MLVs are more stable in the matrix and diffuse with more difficulty. This difference comes from the larger quantity of calcein present within the MLVs, in relation to their size. Modeling of the release kinetics curves was performed; the release of hydrophilic drugs may be described by a multi-scale mechanism with four distinct phases, each characterized by a different kinetic model (Higuchi equation, Korsmeyer-Peppas model, etc.). Knowledge of such models will be a very useful tool for designing new formulations for tissue engineering, regenerative medicine, and drug delivery systems.
Keywords: controlled and delayed release, hydrogels, liposomes, polysaccharides
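For the diffusion-controlled portion of such release curves, the Korsmeyer-Peppas power law M_t/M_inf = k·t^n is conventionally fitted in log-log space. A minimal sketch of that fit, using synthetic data (n = 0.5 corresponds to Fickian diffusion from a thin film; all numbers are illustrative, not the study's data):

```python
import numpy as np

def fit_korsmeyer_peppas(t, release_fraction):
    """Fit the power law M_t/M_inf = k * t**n by linear regression in
    log-log space. The model is only valid for the early portion of
    the release curve (fraction <= 0.6). Returns (k, n)."""
    t = np.asarray(t, dtype=float)
    frac = np.asarray(release_fraction, dtype=float)
    mask = (frac > 0) & (frac <= 0.6)
    n, log_k = np.polyfit(np.log(t[mask]), np.log(frac[mask]), 1)
    return float(np.exp(log_k)), float(n)

# Synthetic, noise-free data for Fickian diffusion from a thin film
# (n = 0.5); illustrative only, not the study's measurements.
t = np.linspace(0.5, 10.0, 20)   # hours
frac = 0.15 * t ** 0.5           # release fraction M_t / M_inf
k, n = fit_korsmeyer_peppas(t, frac)
# n close to 0.5 indicates Fickian (diffusion-controlled) release;
# n between 0.5 and 1 would indicate anomalous transport
```

The fitted exponent n is what classifies the release mechanism phase by phase.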
Procedia PDF Downloads 226
1415 Apathetic Place, Hostile Space: A Qualitative Study on the Ability of Immigration Detention in the UK to Promote the Health and Dignity of Detainees
Abstract:
Background: The UK has one of the largest immigration detention estates in Europe and is under increasing scrutiny, particularly regarding the lack of transparency over the use of detention and its conditions. This research therefore explores professional perceptions of the ability of immigration detention in the UK to promote health and dignity. Methods: A phenomenological approach to qualitative methods was used, drawing on social constructivist theorisations of health and dignity. Seven semi-structured interviews were conducted using Microsoft Teams. Participants included a range of immigration detention stakeholders who have visited closed immigration detention centres in the UK in a professional capacity. Recorded interviews were transcribed verbatim, and analysis was data-driven, through inductive reflexive thematic analysis of the entire data set to account for the small sample size. This study received ethical approval from the University College London Research Ethics Committee. Results: Two global themes were created through the analysis: apathetic place and hostile space. Apathetic place discusses the lack of concern for detainees' daily living and healthcare needs within immigration detention in the UK. This is explored through participants' perceptions of the inability of monitoring and evaluation processes to ensure detainees can live with dignity, and of the unfulfilled duty of care that exists in detention. Hostile space discusses immigration detention in the UK as a wider system of hostility. This is explored through the disempowering impact on detainees, the perception of a failing system as a result of inadequate safeguarding procedures, and a belief that the intention of immigration detention is misaligned with its described purpose.
Conclusion: This research explains why the current immigration detention system in the UK is unable to promote health and dignity, offering a social justice and action-orientated approach to research in this sphere. The findings strengthen the discourse against the use of detention as an immigration control tool in the UK. Implications for further research include a stronger emphasis on investigating alternatives to detention and culturally considerate opportunities for patient-centred healthcare.
Keywords: access to healthcare, dignity, health, immigration detention, migrant, refugee, UK
Procedia PDF Downloads 103
1414 Maintaining Energy Security in Natural Gas Pipeline Operations by Empowering Process Safety Principles Through Alarm Management Applications
Authors: Huseyin Sinan Gunesli
Abstract:
Process Safety Management is a disciplined framework for managing the integrity of systems and processes that handle hazardous substances. It relies on good design principles, well-implemented automation systems, and sound operating and maintenance practices. Alarm management systems play a critically important role in the safe and efficient operation of modern industrial plants. In that respect, alarm management is one of the critical factors underpinning safe plant operation through the application of effective process safety principles. The Trans Anatolian Natural Gas Pipeline (TANAP) is part of the Southern Gas Corridor, which extends from the Caspian Sea to Italy. TANAP transports natural gas from the Shah Deniz gas field of Azerbaijan, and possibly from other neighboring countries, to Turkey and, through the Trans Adriatic Pipeline (TAP), on to Europe. TANAP plays a crucial role in maintaining energy security for the region and for Europe. In that respect, the application of process safety principles is vital to delivering safe, reliable, and efficient natural gas to shippers both in the region and in Europe. Effective alarm management is one of the process safety principles that underpin safe operation of the TANAP pipeline. An alarm philosophy was designed and implemented in the TANAP pipeline according to the relevant standards. However, it is essential to manage the alarms received in the control room effectively in order to maintain safe operations. In that respect, TANAP commenced an alarm management and rationalization program in February 2022, after transitioning to the plateau regime and reaching its design parameters. When alarm rationalization started, approximately 2,300 alarms per hour were being received from one of the compressor stations.
After applying alarm management principles such as reviewing and removing bad actors and standing, stale, chattering, and fleeting alarms, comprehensively reviewing and revising alarm set points through a change management process, and conducting alarm audits and design verification, the rate was reduced to approximately 40 alarms per hour. After the successful implementation of the alarm management principles specified above, the number of alarms was reduced to industry standards. That significantly improved operator vigilance, allowing operators to focus on the important and critical alarms and avoid any excursion beyond safe operating limits that could lead to a potential process safety event. Following the “what gets measured gets managed” principle, TANAP has identified Key Performance Indicators (KPIs) to manage process safety principles effectively, with alarm management forming one of the key parameters of those KPIs. However, the review and analysis of alarms were performed manually. Without alarm management software, achieving full compliance with international standards is almost infeasible. In that respect, TANAP has started using one of the industry's well-known alarm management applications to maintain full review and analysis of alarms and to define actions as required. This has significantly strengthened TANAP's process safety principles in terms of alarm management.
Keywords: process safety principles, energy security, natural gas pipeline operations, alarm rationalization, alarm management, alarm management application
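The rationalization figures quoted above reduce to a simple rate KPI. A minimal sketch of how such a KPI might be computed (the function and metric names are illustrative, not TANAP's actual KPI definitions):

```python
def alarm_rate_kpis(alarm_count, hours):
    """Average alarm rates over an observation window, in the units
    commonly reported for control-room alarm KPIs."""
    per_hour = alarm_count / hours
    return {"per_hour": per_hour, "per_10_min": per_hour / 6.0}

before = alarm_rate_kpis(2300, 1.0)  # pre-rationalization figure from the text
after = alarm_rate_kpis(40, 1.0)     # post-rationalization figure
reduction_pct = 100.0 * (1.0 - after["per_hour"] / before["per_hour"])
# roughly a 98% reduction in average alarm load on the operator
```

Benchmarks such as EEMUA 191 and ISA-18.2 express their targets in alarms per operator per 10 minutes, hence the second metric.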
Procedia PDF Downloads 103
1413 Development of a PJWF Cleaning Method for Wet Electrostatic Precipitators
Authors: Hsueh-Hsing Lu, Thi-Cuc Le, Tung-Sheng Tsai, Chuen-Jinn Tsai
Abstract:
This study designed and tested a novel wet electrostatic precipitator (WEP) system featuring a pulse-air-jet-assisted water flow (PJWF) to shorten water cleaning time, reduce water usage, and maintain high particle removal efficiency. The PJWF injects cleaning water tangentially at the cylinder wall, rapidly enhancing the momentum of the water flow for efficient dust cake removal. Each PJWF cycle uses approximately 4.8 liters of cleaning water in 18 seconds. Comprehensive laboratory tests were conducted using a single-tube WEP prototype within a flow rate range of 3.0 to 6.0 cubic meters per minute (CMM), operating voltages between -35 and -55 kV, and a high-frequency power supply. The prototype, consisting of 72 sets of double-spike rigid discharge electrodes, demonstrated that with the PJWF, -35 kV, and 3.0 CMM, the PM2.5 collection efficiency remained as high as the initial value of 88.02 ± 0.92% after loading with Al2O3 particles at 35.75 ± 2.54 mg/Nm3 for 20 hours of continuous operation. In contrast, without the PJWF, the PM2.5 collection efficiency dropped drastically from 87.4% to 53.5%. Theoretical modeling closely matched the experimental results, confirming the robustness of the system's design and its scalability to larger industrial applications. Future research will focus on optimizing the PJWF system, exploring its performance with various types of particulate matter, and ensuring long-term operational stability and reliability under diverse environmental conditions. Recently, this WEP was combined with a preceding cooling tower (CT) and a honeycomb wet scrubber (HWS) and pilot-tested (40 CMM) to remove SO2 and PM2.5 emissions in a sintering plant of an integrated steel mill. Pilot-test results showed that the removal efficiencies for SO2 and PM2.5 emissions are as high as 99.7% and 99.3%, respectively, with ultralow emitted concentrations of 0.3 ppm and 0.07 mg/m3, respectively, while white smoke is eliminated at the same time.
These new technologies are now being used in industry, and their application in other fields is expected to expand, substantially reducing air pollutant emissions for better ambient air quality.
Keywords: wet electrostatic precipitator, pulse-air-jet-assisted water flow, particle removal efficiency, air pollution control
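Tube-type ESP collection efficiency is classically described by the Deutsch-Anderson equation, eta = 1 - exp(-wA/Q). The abstract does not specify its theoretical model, so the sketch below is only the standard first-order treatment, with assumed values for the migration velocity and collection area (not the prototype's actual parameters):

```python
import math

def deutsch_anderson_efficiency(w, area, flow):
    """Fractional collection efficiency of an electrostatic precipitator:
    eta = 1 - exp(-w * A / Q)

    w: effective particle migration velocity (m/s)
    area: collection electrode area (m^2)
    flow: gas volumetric flow rate (m^3/s)"""
    return 1.0 - math.exp(-w * area / flow)

flow_m3s = 3.0 / 60.0  # the prototype's 3.0 CMM converted to m^3/s
# w and area below are assumptions chosen for illustration only
eta = deutsch_anderson_efficiency(w=0.08, area=1.3, flow=flow_m3s)
# eta comes out on the order of the ~88% PM2.5 efficiency reported
```

The exponential form makes clear why a fouled collection surface (effectively reducing w) degrades efficiency so sharply, and hence why periodic water cleaning matters.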
Procedia PDF Downloads 20
1412 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods
Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer
Abstract:
Model assessment, in the Bayesian context, involves evaluating goodness-of-fit and comparing several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice (once for model estimation and once for testing), a bias correction that penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive variant among the CV methods, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to exact LOO-CV; they utilise the existing MCMC results, avoiding expensive recomputation. The reciprocals of the predictive densities calculated over the posterior draws for each observation are treated as the raw importance weights. These are in turn used to calculate the approximate LOO-CV for each observation as a weighted average of posterior densities. In IS-LOO, the raw weights are used directly. In contrast, the larger weights are replaced by modified, truncated weights in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV are unable to reflect goodness-of-fit in an absolute sense, their differences can be used to measure the relative performance of the models of interest.
However, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of exact LOO-CV, the study observed some drastic deviations in the results. However, there are some interesting relationships among the logarithms of the pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles were observed for the models, conditional on equal posterior variances in the lppds. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, together with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
Keywords: cross-validation, importance sampling, information criteria, predictive accuracy
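The quantities compared above can all be computed from a pointwise log-likelihood matrix. Below is a minimal, untruncated sketch: lppd, WAIC (with the posterior-variance penalty), and raw IS-LOO, whose weights are the reciprocals of the predictive densities; TIS and PSIS differ only in how they tame the largest weights. This is illustrative, not the study's implementation:

```python
import numpy as np

def lppd_waic_isloo(log_lik):
    """log_lik: array of shape (S, N), S posterior draws x N observations.

    Returns (lppd, WAIC, elpd estimated by raw importance-sampling LOO)."""
    # lppd_i: log of the posterior-mean likelihood for observation i
    lppd_i = np.log(np.mean(np.exp(log_lik), axis=0))
    # WAIC penalty: posterior variance of the pointwise log-likelihood
    p_waic_i = np.var(log_lik, axis=0, ddof=1)
    waic = -2.0 * np.sum(lppd_i - p_waic_i)
    # Raw IS weights are 1 / p(y_i | theta_s): a harmonic-mean estimator
    elpd_loo_i = -np.log(np.mean(np.exp(-log_lik), axis=0))
    return np.sum(lppd_i), waic, np.sum(elpd_loo_i)

rng = np.random.default_rng(0)
ll = rng.normal(-1.0, 0.3, size=(4000, 50))  # synthetic log-likelihood draws
lppd, waic, loo = lppd_waic_isloo(ll)
# By the arithmetic-harmonic mean inequality, elpd_is_loo <= lppd;
# the gap is the out-of-sample penalty the raw weights try to capture
```

The instability the abstract reports for IS-LOO stems from the variance of exactly these raw weights, which PSIS stabilises by fitting a generalised Pareto tail.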
Procedia PDF Downloads 392
1411 Dermatophytoses: Spectrum Evolution of Dermatophytes in Sfax, Tunisia, Between 1999 and 2019
Authors: Khemakhem Nahed, Hammami Fatma, Trabelsi Houaida, Neji Sourour, Sellami Hayet, Makni Fattouma, Turki Hamida, Ayadi Ali
Abstract:
Dermatophytoses are considered a public health problem and represent 10% of dermatological consultations in our region. Their epidemiology is influenced by various factors, such as lifestyle, human migration patterns, changes in the environment, and the host relationship. An understanding of this epidemiology has a major impact on prevention and treatment. The aim of the study is to determine the prevalence pattern of the aetiological agents and to describe the clinical characteristics of dermatophytoses between 1999 and 2019. Out of 65,059 subjects suspected of having superficial mycoses, 36,220 (55.67%) were affected by dermatophytoses. The mean age was 40.1 years (range: 10 days to 99 years). The sex ratio was 0.8. Our patients were from urban regions in 80.9% of cases. The most common type of infection was onychomycosis (42.64%), followed by tinea pedis (20.8%), intertrigo (18.3%), tinea corporis (8.48%), and tinea capitis (7.87%). The most frequently isolated dermatophyte was Trichophyton rubrum (76.5%), followed by the T. mentagrophytes complex (6.3%), Microsporum canis (5.8%), T. violaceum (5.3%), T. verrucosum (0.83%), and Epidermophyton floccosum (0.3%). Zoophilic agents have become more prevalent, and their frequency increased from 6.46% in 1999 to 13% in 2019. It is interesting to note that M. canis has been on the rise since 2010 and is now the leading etiological agent of tinea capitis (48%), while infections caused by T. violaceum continued to decrease from 1999 (16.2%) to 2019 (4.7%). Other dermatophytes have rarely been isolated: T. tonsurans (9 cases), T. schoenleinii (3 cases), T. soudanense (2 cases), M. fulvum (1 case), M. audouinii (1 case), and M. ferrugineum (2 cases). T. mentagrophytes var. quinckeanum was isolated from an inflammatory tinea capitis lesion in a 3-year-old girl. T. mentagrophytes var. erinacei was isolated from the first case of tinea manuum, in a 10-year-old girl. The same fungus was isolated from the hair and scales of the hedgehog.
Our study showed significant changes in the dermatophyte spectrum in our region. The prevalence of zoophilic species has increased in recent years due to behavioral changes in the population, with the adoption of pets and animal husbandry in urban settings. Molecular methods are often crucial, helping us to refine the identification of dermatophyte strains and to determine the origin of the contamination.
Keywords: dermatophytoses, PCR-sequencing, spectrum, Sfax, Tunisia
Procedia PDF Downloads 113
1410 Central Energy Management for Optimizing Utility Grid Power Exchange with a Network of Smart Homes
Authors: Sima Aznavi, Poria Fajri, Hanif Livani
Abstract:
Smart homes are small energy systems that may be equipped with renewable energy sources, storage devices, and loads. The energy management strategy plays a key role in the efficient operation of smart homes. Effective energy scheduling of the renewable energy sources and storage devices guarantees efficient energy management in households while reducing energy imports from the grid. Nevertheless, despite such strategies, independently computed day-ahead energy schedules for multiple households can cause undesired effects, such as high power exchange with the grid at certain times of the day. Therefore, the interaction between multiple smart homes' day-ahead energy projections is a challenging issue in a smart grid system, and if not managed appropriately, the energy imported from the power network can impose an additional burden on the distribution grid. In this paper, a central energy management strategy is proposed for a network of multiple households, each equipped with renewable energy sources, storage devices, and plug-in electric vehicles (PEVs). The decision-making strategy, alongside the smart home energy management system, minimizes the energy purchase cost of the end users while reducing the stress on the utility grid. In this approach, the smart home energy management system determines different operating scenarios based on the forecasted daily household load and the components connected to the household, with the objective of minimizing the end user's overall cost. Then, selected projections for each household that fall within the same cost range are sent to the central decision-making system. The central controller then organizes the schedules to reduce the overall peak-to-average ratio of the total energy imported from the grid.
To validate this approach, simulations were carried out for a network of five smart homes with different load requirements, and the results confirm that by applying the proposed central energy management strategy, the overall power demand from the grid can be significantly flattened. This is an effective approach to alleviating stress on the network by distributing its energy across multiple households over a 24-hour period.
Keywords: energy management, renewable energy sources, smart grid, smart home
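The flattening objective above reduces to the peak-to-average ratio (PAR) of the aggregate import profile. A minimal sketch with hypothetical hourly profiles for a five-home network (the numbers are illustrative, not the paper's simulation data):

```python
import numpy as np

def peak_to_average_ratio(profile_kw):
    """PAR of an hourly demand profile: peak demand / mean demand.
    A perfectly flat profile has PAR = 1."""
    profile_kw = np.asarray(profile_kw, dtype=float)
    return float(profile_kw.max() / profile_kw.mean())

# Aggregate grid import before coordination: an evening peak when all
# five homes happen to charge their PEVs at once (hours 18-20)
before = np.array([2.0] * 18 + [10.0] * 3 + [2.0] * 3)
# After central coordination: the same 72 kWh of daily energy spread evenly
after = np.full(24, before.sum() / 24.0)

par_before = peak_to_average_ratio(before)  # about 3.33
par_after = peak_to_average_ratio(after)    # 1.0, perfectly flat
```

In practice the central controller cannot reach PAR = 1, since only a subset of loads is shiftable, but the metric it minimizes is exactly this ratio.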
Procedia PDF Downloads 248
1409 Mobile Marketing Adoption in Pakistan
Authors: Manzoor Ahmad
Abstract:
The rapid advancement of mobile technology has transformed the way businesses engage with consumers, making mobile marketing a crucial strategy for organizations worldwide. This paper presents a comprehensive study on the adoption of mobile marketing in Pakistan, aiming to provide valuable insights into the current landscape, challenges, and opportunities in this emerging market. To achieve this objective, a mixed-methods approach was employed, combining quantitative surveys and qualitative interviews with industry experts, marketers, and consumers. The study encompassed a diverse range of sectors, including retail, telecommunications, banking, and e-commerce, ensuring a comprehensive understanding of mobile marketing practices across different industries. The findings indicate that mobile marketing has gained significant traction in Pakistan, with a growing number of organizations recognizing its potential for reaching and engaging with consumers effectively. Factors such as increasing smartphone penetration, affordable data plans, and the rise of social media usage have contributed to the widespread adoption of mobile marketing strategies. However, several challenges and barriers to mobile marketing adoption were identified. These include issues related to data privacy and security, limited digital literacy among consumers, inadequate infrastructure, and cultural considerations. Additionally, the study highlights the need for tailored and localized mobile marketing strategies to address the diverse cultural and linguistic landscape of Pakistan. Based on the insights gained from the study, practical recommendations are provided to support organizations in optimizing their mobile marketing efforts in Pakistan. These recommendations encompass areas such as consumer targeting, content localization, mobile app development, personalized messaging, and measurement of mobile marketing effectiveness. 
This research contributes to the existing literature on mobile marketing adoption in developing countries and specifically sheds light on the unique dynamics of the Pakistani market. It serves as a valuable resource for marketers, practitioners, and policymakers seeking to leverage mobile marketing strategies in Pakistan, ultimately fostering the growth and success of businesses operating in this region.
Keywords: mobile marketing, digital marketing, mobile advertising, adoption of mobile marketing
Procedia PDF Downloads 109
1408 Ultrasound Disintegration as a Potential Method for the Pre-Treatment of Virginia Fanpetals (Sida hermaphrodita) Biomass before Methane Fermentation Process
Authors: Marcin Dębowski, Marcin Zieliński, Mirosław Krzemieniewski
Abstract:
As methane fermentation is a complex series of successive biochemical transformations, its subsequent stages are determined, to varying extents, by physical and chemical factors. A specific state of equilibrium settles in the functioning fermentation system between the environmental conditions, the rate of the biochemical reactions, and the products of the successive transformations. Among the physical factors that influence the effectiveness of the methane fermentation transformations, key significance is ascribed to temperature and the intensity of biomass agitation. Among the chemical factors, significant are the pH value, the type and availability of the culture medium (put simply, the C/N ratio), and the presence of toxic substances. One of the important elements influencing the effectiveness of methane fermentation is the pre-treatment of the organic substrates and the mode in which the organic matter is made available to the anaerobes. Of all known and described methods for organic substrate pre-treatment before the methane fermentation process, ultrasound disintegration is one of the most interesting technologies. Interest in the ultrasound field, and in installations operating within existing systems, results principally from the very wide and universal technological possibilities offered by the sonication process. This physical factor may induce deep physicochemical changes in ultrasonicated substrates that are highly beneficial from the viewpoint of the methane fermentation process. In this case, a special role is ascribed to the disintegration of the biomass that is subsequently subjected to methane fermentation. Once cell walls are damaged, cytoplasm and cellular enzymes are released. The released substances, either in dissolved or colloidal form, are immediately available to anaerobic bacteria for biodegradation.
To ensure the maximal release of organic matter from dead biomass cells, disintegration processes aim to achieve a particle size below 50 μm. It has been demonstrated in many research works, and in systems operating at technical scale, that immediately after substrate ultrasonication, the content of organic matter (characterized by the COD, BOD5, and TOC indices) increases in the dissolved phase of the sedimentation water. This phenomenon points to the immediate sonolysis of the solid substances contained in the biomass and to the release of cell material, and consequently to the intensification of the hydrolytic phase of fermentation. It results in a significant reduction of fermentation time and increased production of the gaseous metabolites of anaerobic bacteria. Because disintegration of Virginia fanpetals biomass via ultrasound, applied to intensify its conversion, is a novel technique, it is often underestimated by operators of agricultural biogas plants. It has, however, many advantages that have a direct impact on its technological and economic superiority over the biomass conversion methods applied thus far. For now, ultrasound disintegrators for biomass conversion are not mass-produced but are built by specialized groups in scientific or R&D centers. Therefore, their quality and effectiveness are to a large extent determined by their manufacturers' knowledge and skills in the fields of acoustics and electronic engineering.
Keywords: ultrasound disintegration, biomass, methane fermentation, biogas, Virginia fanpetals
Procedia PDF Downloads 368
1407 Challenging the Standard 24 Equal Quarter Tones Theory in Arab Music: A Case Study of Tetrachords Bayyātī and Ḥijāz
Authors: Nabil Shair
Abstract:
Arab music maqām (the Arab modal framework) is founded, among other main characteristics, on microtonal intervals. Notwithstanding the importance and multifaceted nature of intonation in Arab music, there is a paucity of studies examining this subject through scientific and quantitative approaches. The present-day theory of the Arab tone system is largely based on the pioneering treatise of Mīkhā’īl Mashāqah (1840), which proposes the theoretical division of the octave into 24 equal quarter tones. This kind of equal-tempered division is incompatible with the performance practice of Arab music, as many professional Arab musicians conceptualize additional pitches beyond the standard 24 notes per octave. In this paper, we refute the standard theory of a scale of well-tempered quarter tones by implementing a quantitative analysis of the performed intonation of two prominent tetrachords in Arab music, namely bayyātī and ḥijāz. This analysis was conducted with the help of advanced computer programs, such as Sonic Visualiser and Tony, by which we were able to obtain precise frequency data (Hz) for each tone every 0.01 second. The value (in cents) of all three intervals of each tetrachord was then measured and compared to the theoretical intervals. As a result, a specific distribution of deviations from the equal-tempered division of the octave was detected, in particular a diminished first interval of bayyātī and a diminished second interval of ḥijāz. These types of intonation entail a considerable amount of flexibility, mainly influenced by the surrounding tones, the direction and function of the measured tone, ornaments, text, the personal style of the performer, and interaction with the audience. This paper seeks to contribute to the existing literature dealing with intonation in Arab music, as it is a vital part of the performance practice of this musical tradition.
In addition, the insights offered by this paper and its novel methodology might also contribute to the broadening of the existing pedagogic methods used to teach Arab music.
Keywords: Arab music, intonation, performance practice, music theory, oral music, octave division, tetrachords, music of the Middle East, music history, musical intervals
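The interval measurements described in this abstract rest on the standard conversion from frequency ratios (Hz) to cents; as a minimal illustration of that conversion (the frequencies below are our own examples, not values from the study's data):

```python
import math

def interval_in_cents(f_low: float, f_high: float) -> float:
    """Size of the interval between two frequencies, in cents (1200 per octave)."""
    return 1200.0 * math.log2(f_high / f_low)

# An equal-tempered quarter tone is exactly 50 cents, so deviations from the
# 24-tone theory appear directly as cent offsets from multiples of 50.
octave = interval_in_cents(220.0, 440.0)                        # 1200 cents
quarter_tone = interval_in_cents(440.0, 440.0 * 2 ** (1 / 24))  # 50 cents
```

Applied to pitch tracks sampled every 0.01 s, this conversion turns each pair of adjacent scale degrees into a cent value that can be compared against the theoretical 24-tone grid.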
Procedia PDF Downloads 53
1406 Multi-Criteria Selection and Improvement of Effective Design for Generating Power from Sea Waves
Authors: Khaled M. Khader, Mamdouh I. Elimy, Omayma A. Nada
Abstract:
Sustainable development is the declared goal of most countries at present. In general, fossil fuels are the mainstay of development in most of the world's countries. Regrettably, the rate of fossil fuel consumption is very high, and the world will soon face the problem of conventional fuel depletion. In addition, there are many problems of environmental pollution resulting from the emission of harmful gases and vapors during fuel burning. Thus, clean, renewable energy has become the main concern of most countries for filling the gap between available energy resources and their growing needs. There are many renewable energy sources, such as wind, solar and wave energy. Energy can be obtained from the motion of sea waves almost all the time, whereas power generation from solar or wind energy is highly restricted to sunny periods or the availability of suitable wind speeds. Moreover, energy produced from sea wave motion is one of the cheapest types of clean energy, and using the renewable energy of sea waves guarantees safe environmental conditions. Cheap electricity can be generated from wave energy using different systems such as the oscillating bodies system, the pendulum gate system, the ocean wave dragon system and the oscillating water column device. In this paper, a multi-criteria model has been developed using the Analytic Hierarchy Process (AHP) to support the decision of selecting the most effective system for generating power from sea waves. This paper provides a widespread overview of the different design alternatives for sea wave energy converter systems. The considered design alternatives have been evaluated using the developed AHP model. The multi-criteria assessment reveals that the off-shore Oscillating Water Column (OWC) system is the most appropriate system for generating power from sea waves. The OWC system consists of a suitable hollow chamber at the shore which is completely closed except at its base, which has an open area for gathering moving sea waves.
The sea waves' motion pushes the air up and down through a suitable Wells turbine for generating power. Improving the power generation capability of the OWC system is one of the main objectives of this research. After investigating the effect of some design modifications, it has been concluded that selecting the appropriate settings of some effective design parameters, such as the number of layers of Wells turbine fans and the intermediate distance between the fans, can result in significant improvements. Moreover, a simple dynamic analysis of the Wells turbine is introduced. Furthermore, this paper strives to compare the theoretical and experimental results of the built experimental prototype.
Keywords: renewable energy, oscillating water column, multi-criteria selection, Wells turbine
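AHP derives priority weights for design alternatives from a pairwise comparison matrix; a minimal sketch using the normalized-column (column-mean) approximation follows. The pairwise judgments below are invented for illustration only and are not the paper's actual model:

```python
def ahp_weights(matrix):
    """Approximate AHP priority vector by the normalized-column (mean) method."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(normalized[i]) / n for i in range(n)]

# Hypothetical 1-9 scale judgments for three alternatives
# (OWC vs. pendulum gate vs. wave dragon) on a single criterion.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)  # weights sum to 1; largest weight wins
```

In a full AHP model this step is repeated per criterion and the per-criterion priorities are combined with the criteria weights before ranking the alternatives.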
Procedia PDF Downloads 163
1405 Experimental Analysis on Heat Transfer Enhancement in Double Pipe Heat Exchanger Using Al2O3/Water Nanofluid and Baffled Twisted Tape Inserts
Authors: Ratheesh Radhakrishnan, P. C. Sreekumar, K. Krishnamoorthy
Abstract:
Heat transfer augmentation techniques ultimately result in the reduction of thermal resistance in a conventional heat exchanger by generating a higher convective heat transfer coefficient. They also result in a reduction of size, an increase in heat duty, a decrease in approach temperature difference, and a reduction in pumping power requirements for heat exchangers. The present study deals with a compound augmentation technique, which is not widely used. The study deals with the use of Alumina (Al2O3)/water nanofluid and baffled twisted tape inserts in a double pipe heat exchanger as a compound augmentation technique. Experiments were conducted to evaluate the heat transfer coefficient and friction factor for the flow through the inner tube of the heat exchanger in the turbulent flow range (8000
1404 A Qualitative Study Identifying the Complexities of Early Childhood Professionals' Use and Production of Data
Authors: Sara Bonetti
Abstract:
The use of quantitative data to support policies and justify investments has become imperative in many fields including the field of education. However, the topic of data literacy has only marginally touched the early care and education (ECE) field. In California, within the ECE workforce, there is a group of professionals working in policy and advocacy that use quantitative data regularly and whose educational and professional experiences have been neglected by existing research. This study aimed at analyzing these experiences in accessing, using, and producing quantitative data. This study utilized semi-structured interviews to capture the differences in educational and professional backgrounds, policy contexts, and power relations. The participants were three key professionals from county-level organizations and one working at a State Department to allow for a broader perspective at systems level. The study followed Núñez’s multilevel model of intersectionality. The key in Núñez’s model is the intersection of multiple levels of analysis and influence, from the individual to the system level, and the identification of institutional power dynamics that perpetuate the marginalization of certain groups within society. In a similar manner, this study looked at the dynamic interaction of different influences at individual, organizational, and system levels that might intersect and affect ECE professionals’ experiences with quantitative data. At the individual level, an important element identified was the participants’ educational background, as it was possible to observe a relationship between that and their positionality, both with respect to working with data and also with respect to their power within an organization and at the policy table. 
For example, those with a background in child development were aware of how their formal education had failed to train them in the skills that are necessary to work in policy and advocacy, and especially to work with quantitative data, compared to those with a background in administration and/or business. At the organizational level, the interviews showed a connection between the participants' position within the organization, their organization's position with respect to others, and their degree of access to quantitative data. This in turn affected their sense of empowerment and agency in dealing with data, such as shaping what data is collected and available. These differences were reflected in the interviewees' perceptions of and expectations for the ECE workforce. For example, one of the interviewees pointed out that many ECE professionals happen to use data out of the necessity of the moment. This lack of intentionality both causes and results in missed training opportunities. Another interviewee pointed out issues related to the professionalism of the ECE workforce by remarking on the inadequacy of ECE students' training in working with data. In conclusion, Núñez's model helped in understanding the different elements that affect ECE professionals' experiences with quantitative data. In particular, what was clear is that these professionals are not being provided with the necessary support and that we are not being intentional in creating data literacy skills for them, despite what is asked of them and their work.
Keywords: data literacy, early childhood professionals, intersectionality, quantitative data
Procedia PDF Downloads 253
1403 Assessing Sydney Tar Ponds Remediation and Natural Sediment Recovery in Nova Scotia, Canada
Authors: Tony R. Walker, N. Devin MacAskill, Andrew Thalhiemer
Abstract:
Sydney Harbour, Nova Scotia has long been subject to effluent and atmospheric inputs of metals, polycyclic aromatic hydrocarbons (PAHs), and polychlorinated biphenyls (PCBs) from a large coking operation and steel plant that operated in Sydney for nearly a century until its closure in 1988. Contaminated effluents from the industrial site resulted in the creation of the Sydney Tar Ponds, one of Canada's largest contaminated sites. Since the closure, there have been several attempts to remediate this former industrial site, and finally, in 2004, the governments of Canada and Nova Scotia committed to remediating the site to reduce potential ecological and human health risks to the environment. The Sydney Tar Ponds and Coke Ovens cleanup project has become the most prominent remediation project in Canada today. As an integral part of the remediation of the site (which consisted of solidification/stabilization and associated capping of the Tar Ponds), an extensive multiple-media environmental effects program was implemented to assess what effects remediation had on the surrounding environment and, in particular, harbour sediments. Additionally, the longer-term natural sediment recovery rates of select contaminants predicted for the harbour sediments were compared to current conditions. During remediation, potential contributions to sediment quality in addition to remedial efforts were evaluated, including a significant harbour dredging project, propeller wash from harbour traffic, storm events, adjacent loading/unloading of coal, and municipal wastewater treatment discharges. Two sediment sampling methodologies, sediment grab and gravity corer, were also compared to evaluate the detection of subtle changes in sediment quality. Results indicated that the overall spatial distribution pattern of historical contaminants remains unchanged, although at much lower concentrations than previously reported, due to natural recovery.
Measurements of sediment indicator parameter concentrations confirmed that the natural recovery rates of Sydney Harbour sediments were in broad agreement with predicted concentrations, in spite of ongoing remediation activities. Overall, during three years of remediation, most measured parameters in sediments showed little temporal variability compared to baseline, even when using different sampling methodologies, except for significant increases in total PAH concentrations detected during one year of remediation monitoring. The data confirmed the effectiveness of mitigation measures implemented during construction relative to harbour sediment quality, despite other anthropogenic activities and the dynamic nature of the harbour.
Keywords: contaminated sediment, monitoring, recovery, remediation
Procedia PDF Downloads 236
1402 The Re-Emergence of Russia's Foreign Policy (Case Study: Middle East)
Authors: Maryam Azish
Abstract:
Russia, as an emerging global player in recent years, has claimed a special place in the Middle East. Despite all the challenges it has faced over the years, it has always framed its presence in various fields with a strategy that defines its maneuvering power as a level of competition, and even confrontation, with the United States. Its current approach is therefore considered important, as that of an influential actor in the Middle East. After the collapse of the Soviet Union, when the Russians withdrew completely from the Middle East, the region remained an arena in which the Americans were almost unrivaled. With the start of the US-led wars in Iraq and Afghanistan, and the subsequent developments that led to US military and political defeat, a new chapter in regional security was created in which ISIL and Taliban terrorism went along with the Arab Spring to destabilize the Middle East. Because of this, the Americans took every opportunity to strengthen their military presence. Iraq, Syria and Afghanistan have always been the three areas where terrorism took shape, and the countries of the region have each reacted to this evil phenomenon accordingly. The West dealt with this phenomenon on a case-by-case basis within the general circumstances that created the fluid situation in the Arab countries and the region. Russian President Vladimir Putin accused the US of falling asleep in the face of ISIS and terrorism in Syria. In fact, this was an opportunity for the Russians to revive their presence in Syria. This article suggests that utilizing the politics of recognition along with constructivist theory offers a better understanding of Russia's endeavors to assert its international position. Accordingly, Russia's distinctiveness and its ambitions for great-power status have played a vital role in shaping its national interests and, subsequently, its foreign policy, particularly in the Putin era.
The focal claim of the paper is that Russia's foreign policy cannot be adequately scrutinized with realist methods. Consequently, with an aim to fill the prevailing vacuum, this study employs the politics of recognition in the context of constructivism to examine Russia's foreign policy in the Middle East. The results of this paper show that the key aim of Russian foreign policy discourse, accompanied by increasing power and wealth, is to gain recognition and reinstate the position of a great power in the global system. The Syrian crisis has created an opportunity for Russia to consolidate its position in the developing global and regional order, after ages of dynamic and prevalent presence in the Middle East, as well as to contradict US unilateralism. In the meantime, the author argues that the question of the West's recognition of Russia's position in the global system has played a foremost role in serving its national interests.
Keywords: constructivism, foreign policy, Middle East, Russia, regionalism
Procedia PDF Downloads 149
1401 Isolation and Molecular Characterization of Lytic Bacteriophage against Carbapenem Resistant Klebsiella pneumoniae
Authors: Guna Raj Dhungana, Roshan Nepal, Apshara Parajuli, Archana Maharjan, Shyam K. Mishra, Pramod Aryal, Rajani Malla
Abstract:
Introduction: Klebsiella pneumoniae is a well-known opportunistic human pathogen, primarily causing healthcare-associated infections. The global emergence of carbapenemase-producing K. pneumoniae, which is often extensively multidrug resistant, is a major public health burden. Thus, because of the difficulty of treating these 'superbugs', a menace some term the 'apocalypse' of the post-antibiotic era, an alternative approach to controlling this pathogen is prudent, and one such approach is phage-mediated control and/or treatment. Objective: In this study, we aimed to isolate a novel bacteriophage against carbapenemase-producing K. pneumoniae and characterize it for potential use in phage therapy. Material and Methods: Twenty lytic phages were isolated from river water using the double layer agar assay and purified. The biological features, physiochemical characters, burst size, host specificity and activity spectrum of the phages were determined. The most potent phage, TU_Kle10O, was selected and characterized by electron microscopy. The whole genome sequence of the phage was analyzed for the presence/absence of virulence factors and other lysin genes. Results: The novel phage TU_Kle10O showed a multiple host range within its own genus and did not induce any bacteriophage-insensitive mutants (BIMs) up to the 5th generation of the host's life cycle. Electron microscopy confirmed that the phage was tailed and belonged to the order Caudovirales. Next generation sequencing revealed its genome to be 166.2 kb. Bioinformatic analysis further confirmed that the phage genome did not contain any bacterial genes, which ruled out the concern of virulence gene transfer. A specific 'lysin' enzyme, which could be used as an 'antibiotic', was identified in the phage. Conclusion: Extensively multidrug resistant bacteria, such as carbapenemase-producing K. pneumoniae, could be treated efficiently by phages. The absence of 'virulent' genes of bacterial origin and the presence of lysin proteins within the phage genome make phages an excellent candidate for therapeutics.
Keywords: bacteriophage, Klebsiella pneumoniae, MDR, phage therapy, carbapenemase
Procedia PDF Downloads 190
1400 A Low-Cost Long-Range 60 GHz Backhaul Wireless Communication System
Authors: Atabak Rashidian
Abstract:
In duplex backhaul wireless communication systems, two separate transmit and receive high-gain antennas are required if an antenna switch is not implemented. Although the switch loss, which is considerable and on the order of 1.5 dB at 60 GHz, is avoided, the two large separate antenna systems make the design bulky and not cost-effective. To avoid two large reflectors for such a system, transmit and receive antenna feeds with a common phase center are required. The phase center should coincide with the focal point of the reflector to maximize the efficiency and gain. In this work, we present an ultra-compact design in which stacked patch antennas are used as the feeds for a 12-inch reflector. The transmit antenna is a 1 × 2 array and the receive antenna is a single element located in the middle of the transmit antenna elements. The antenna elements are designed as stacked patches to provide the required impedance bandwidth for four standard channels of WiGigTM applications. The design includes three metallic layers and three dielectric layers, in which the top dielectric layer is a 100 µm-thick protective layer. The top two metallic layers are assigned to the main and parasitic patches. The bottom layer is basically a ground plane with two circular openings (0.7 mm in diameter), each having a center through-via which connects the antennas to a single input/output Si-Ge Bi-CMOS transceiver chip. The reflection coefficient of the stacked patch antenna is fully investigated. The -10 dB impedance bandwidth is about 11%. Although the gap between the transmit and receive antennas is very small (g = 0.525 mm), the mutual coupling is less than -12 dB over the desired frequency band. The three-dimensional radiation patterns of the transmit and receive reflector antennas at 60 GHz are investigated over the impedance bandwidth. About 39 dBi realized gain is achieved.
Considering over 15 dBm of output power from the silicon chip on the transmit side, the EIRP should be over 54 dBm, which is good enough for multi-Gbps data communication over one kilometer. The performance of the reflector antenna over the bandwidth shows a peak gain of 39 dBi and 40 dBi for the reflector antenna with the 2-element and single-element feed, respectively. This type of system design is cost-effective and efficient.
Keywords: antenna, integrated circuit, millimeter-wave, phase center
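The EIRP figure quoted in the abstract is a simple dB-domain sum of transmit power and antenna gain; a quick sketch of that link-budget arithmetic (using only the numbers stated in the abstract, with feed and cable losses ignored) is:

```python
def eirp_dbm(tx_power_dbm: float, antenna_gain_dbi: float) -> float:
    """EIRP in dBm: transmit power (dBm) plus antenna gain (dBi), losses ignored."""
    return tx_power_dbm + antenna_gain_dbi

# Figures from the abstract: >15 dBm chip output and 39 dBi realized gain.
link_eirp = eirp_dbm(15.0, 39.0)  # 54 dBm, matching the stated budget
```

Working in dBm/dBi turns the multiplicative power-times-gain product into an addition, which is why the two quoted numbers sum directly to the stated EIRP.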
Procedia PDF Downloads 122
1399 Study of Climate Change Process on Hyrcanian Forests Using Dendroclimatology Indicators (Case Study of Guilan Province)
Authors: Farzad Shirzad, Bohlol Alijani, Mehry Akbary, Mohammad Saligheh
Abstract:
Climate change and global warming are very important issues today. The process of climate change, especially changes in temperature and precipitation, is the most important issue in the environmental sciences. Climate change means a change in the long-run averages. Iran is located in arid and semi-arid regions due to its proximity to the equator and its location in the subtropical high-pressure zone. Against this backdrop, the Hyrcanian forest is a green necklace between the Caspian Sea and the southern Alborz mountain range; at the forty-third session of UNESCO, it was registered as the second natural World Heritage site of Iran. Beech is one of the most important tree species and the most industrial species of the Hyrcanian forests. In this research, dendroclimatology was applied using tree-ring widths and the climatic temperature and precipitation data of the Shanderman meteorological station located in the study area. The non-parametric Mann-Kendall statistical method was used to investigate the trend of climate change over a time series of 202 years of growth rings, and the Pearson statistical method was used to correlate the growth-ring widths of beech trees with the climatic variables of the region. The results obtained from the time series of beech growth rings showed that the changes in beech growth rings had a downward, negative trend that was significant at the 5% level, and that climate change has occurred. The average minimum, mean, and maximum temperatures and evaporation in the growing season had an increasing trend, and the annual precipitation had a decreasing trend.
Using the Pearson method to fit the correlation of the diameter of the growth rings with temperature, the correlation with the mean temperatures of July, August, and September was negative, while the correlation with the average maximum temperature in February was positive and significant at the 95% level; with precipitation, the correlation in June was positive and significant at the 95% level.
Keywords: climate change, dendroclimatology, Hyrcanian forest, beech
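The Mann-Kendall test used for the trend analysis above is built on the S statistic, a count of concordant minus discordant pairs in the time series; a minimal sketch of computing S (the ring-width values below are invented for illustration, not the study's 202-year series) is:

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: positive for upward, negative for downward trends."""
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)  # sign of each later-minus-earlier pair
    return s

# A mostly shrinking ring-width sequence (hypothetical values, in mm) yields a
# negative S, i.e. the kind of downward trend the study reports for beech.
narrowing_rings = [2.1, 1.9, 2.0, 1.6, 1.4, 1.1]
trend_s = mann_kendall_s(narrowing_rings)
```

In the full test, S is normalized by its variance to obtain the Z score that is checked against the 5% significance level reported above.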
Procedia PDF Downloads 104
1398 Assessing Children's Probabilistic and Creative Thinking in a Non-formal Learning Context
Authors: Ana Breda, Catarina Cruz
Abstract:
Daily, we face unpredictable events, often attributed to chance, as there is no justification for their occurrence. Chance, understood as a source of uncertainty, is present in several aspects of human life, such as weather forecasts, dice rolling, and lotteries. Surprisingly, humans and some animals can quickly adjust their behavior to efficiently handle doubly stochastic processes (random events with two layers of randomness, like unpredictable weather affecting dice rolling). This adjustment ability suggests that the human brain has built-in mechanisms for perceiving, understanding, and responding to simple probabilities. It also explains why current trends in mathematics education include probability concepts in official curriculum programs, starting from the third year of primary education onwards. In the first years of schooling, children learn to use a specific vocabulary, with words such as never, always, rarely, perhaps, likely, and unlikely, which is of crucial importance for their perception and understanding of the probability of events. The development of probabilistic concepts comes from facts and cause-effect sequences resulting from the subject's actions, as well as from the notion of chance and intuitive estimates based on everyday experiences. As part of a junior summer school program that took place at a Portuguese university, a non-formal learning experiment was carried out with 18 children in the 5th and 6th grades. This experiment was designed to be implemented in the dynamic of a serious ice-breaking game, to assess the children's levels of probabilistic, critical, and creative thinking in understanding impossible, certain, equally probable, likely, and unlikely events, and also to gain insight into how the non-formal learning context influenced their achievements.
The criteria used to evaluate probabilistic thinking included the creative ability to conceive events classified in the specified categories, the ability to properly justify the categorization, the ability to critically assess the events classified by other children, and the ability to make predictions based on a given probability. The data analysis employs a qualitative, descriptive, and interpretative methods approach based on the students' written productions, audio recordings, and the researchers' field notes. This methodology allowed us to conclude that such an approach is an appropriate and helpful formative assessment tool. The promising results of this initial exploratory study call for a future research study with children from these levels of education, from different regions, attending public or private schools, to validate and expand our findings.
Keywords: critical and creative thinking, non-formal mathematics learning, probabilistic thinking, serious game
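The event categories the children worked with (impossible, certain, equally probable, likely, unlikely) map directly onto the classical definition of probability over equally likely outcomes; a minimal illustration with a fair six-sided die (our own example, not part of the study's materials) is:

```python
from fractions import Fraction

def probability(event, outcomes):
    """Classical probability: favourable outcomes over equally likely outcomes."""
    favourable = sum(1 for o in outcomes if event(o))
    return Fraction(favourable, len(outcomes))

die = list(range(1, 7))
certain = probability(lambda o: o <= 6, die)     # 1: a certain event
impossible = probability(lambda o: o == 7, die)  # 0: an impossible event
unlikely = probability(lambda o: o == 3, die)    # 1/6: an unlikely event
even = probability(lambda o: o % 2 == 0, die)    # 1/2: as probable as rolling odd
```

The qualitative vocabulary (never, always, unlikely, likely) corresponds to where an event's probability falls on this 0-to-1 scale.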
Procedia PDF Downloads 27
1397 Hansen Solubility Parameters, Quality by Design Tool for Developing Green Nanoemulsion to Eliminate Sulfamethoxazole from Contaminated Water
Authors: Afzal Hussain, Mohammad A. Altamimi, Syed Sarim Imam, Mudassar Shahid, Osamah Abdulrahman Alnemer
Abstract:
The exhaustive application of sulfamethoxazole (SUX) has become a global threat to human health due to water contamination from diverse sources. This study addressed the combined application of Hansen solubility parameters (HSPiP software) and the Quality by Design tool for developing various green nanoemulsions. The HSPiP program assisted in screening suitable excipients based on Hansen solubility parameters and experimental solubility data. Various green nanoemulsions were prepared and characterized for globular size, size distribution, zeta potential, and removal efficiency. Design Expert (DoE) software further helped to identify the critical factors with a direct impact on percent removal efficiency, size, and viscosity. Morphology was visualized under transmission electron microscopy (TEM). Finally, the treated water was studied to negate the presence of the tested drug, employing the ICP-OES (inductively coupled plasma optical emission spectrometry) technique and HPLC (high performance liquid chromatography). Results showed that HSPiP predicted a biocompatible lipid, a safe surfactant (lecithin), and propylene glycol (PG). The experimental solubility of the drug in the predicted excipients was quite convincing and vindicated the prediction. Various green nanoemulsions were fabricated and evaluated for in vitro findings. Globular size (100-300 nm), PDI (0.1-0.5), zeta potential (~ 25 mV), and removal efficiency (%RE = 70-98%) were found to be in an acceptable range for deciding the input factors and their levels in DoE. The experimental design tool assisted in identifying the most critical variables controlling %RE and the optimized content of the nanoemulsion under the set constraints. The dispersion time was varied from 5-30 min. Finally, the ICP-OES and HPLC techniques corroborated the absence of SUX in the treated water. Thus, the strategy is simple, economic, selective, and efficient.
Keywords: quality by design, sulfamethoxazole, green nanoemulsion, water treatment, ICP-OES, Hansen program (HSPiP software)
Procedia PDF Downloads 82
1396 Creation of a Trust-Wide, Cross-Speciality, Virtual Teaching Programme for Doctors, Nurses and Allied Healthcare Professionals
Authors: Nelomi Anandagoda, Leanne J. Eveson
Abstract:
During the COVID-19 pandemic, the surge in in-patient admissions across the medical directorate of a district general hospital necessitated the implementation of an incident rota. Conscious of the impact on training and professional development, the idea of developing a virtual teaching programme was conceived. The programme initially aimed to provide junior doctors, specialist nurses, pharmacists, and allied healthcare professionals from medical specialties, and those re-deployed from other specialties (e.g., ophthalmology, GP, surgery, psychiatry), with the knowledge and skills to manage the deteriorating patient with COVID-19. The programme was later developed to incorporate the general internal medicine curriculum. To facilitate continuing medical education whilst maintaining social distancing during this period, a virtual platform was used to deliver teaching to junior doctors across two large district general hospitals and two community hospitals. Teaching sessions were recorded and uploaded to a common platform, providing a resource for participants to catch up on and re-watch teaching sessions, making strides towards reducing discrimination against the professional development of less-than-full-time trainees. This created a learning environment which is inclusive and accessible to adult learners in a self-directed manner. The negative impact of the pandemic on the well-being of healthcare professionals is well documented. To support the multi-disciplinary team, the virtual teaching programme evolved to include sessions on well-being, resilience, and work-life balance. Providing teaching for learners across the multi-disciplinary team (MDT) has been an eye-opening experience. By challenging the concept that learners should only be taught within their own peer groups, the authors have fostered a greater appreciation of the strengths of the MDT and showcased the immense wealth of expertise available within the trust.
The inclusive nature of the teaching and the ease of joining a virtual teaching session have facilitated the dissemination of knowledge across the MDT, thus improving patient care on the frontline. The weekly teaching programme has been running for over eight months, with ongoing engagement, interest, and participation. As described above, the teaching programme has evolved to accommodate the needs of its learners. It has received excellent feedback, with an appreciation of its inclusive, multi-disciplinary, and holistic nature. The COVID-19 pandemic provided a catalyst to rapidly develop novel methods of working and training and widened access and exposure to the virtual technologies available to large organisations. By merging pedagogical expertise and technology, the authors have created an effective online learning environment. Although the authors do not propose to replace face-to-face teaching altogether, this model of virtual, multidisciplinary-team, cross-site teaching has proven to be a great leveller. It has made high-quality teaching accessible to learners of different confidence levels, grades, specialties, and working patterns.
Keywords: cross-site, cross-speciality, inter-disciplinary, multidisciplinary, virtual teaching
Procedia PDF Downloads 170
1395 Epidemiological Analysis of the Patients Supplied with Foot Orthoses in Ortho-Prosthetic Center of Kosovo
Authors: Ardiana Murtezani, Ilirijana Dallku, Teuta Osmani Vllasolli, Sabit Sllamniku
Abstract:
Background: The use of foot orthoses is indicated whenever the optimal biomechanical position of the foot is altered. Orthoses are effective and well suited for the majority of patients with pain due to overload, which can be related to biomechanical disorders. Aim: To assess the frequency of patients requiring foot orthoses, the types of orthoses prescribed, and the diseases leading to their use. Material and Methods: Our study included 128 patients with various foot pathologies, treated at the outpatient department of the Ortho-Prosthetic Center of Kosovo (OPCK) in Prishtina. A prospective-descriptive clinical method was used. The functional status of patients was examined, and the following parameters were recorded: range of motion of the affected joints of the lower extremities, manual muscle strength testing below the knee and of the foot of the affected extremity, perimeter and length measurements of the lower extremities, and foot length, width, and size measurements. The following instruments were used to complete the measurements: plantogram, pedogram, meter, and cork shoe-lift appliances. Results: The majority of subjects in this study were male (60.2% vs. 39.8%), and the dominant age group was 0-9 years (47.7%, 61 subjects). The most frequent foot disorders were congenital disease (60.1%), trauma (13.3%), sequelae of rheumatologic disease (12.5%), and neurologic dysfunction (11.7%); infectious cases were the least frequent (1.6%). Congenital anomalies were the most frequent cases; within this group, the majority suffered from pes planovalgus (37.5%), equinovarus (15.6%), and discrepancies between extremities (6.3%). Traumatic amputations accounted for 2.3% and arthritis for 0.8%. As for neurologic disease, subjects with cerebral palsy represented 3.1%, peroneal nerve palsy 2.3%, and hemiparesis 1.6%. Osteomyelitis sequelae represented the infectious disease cases (1.6%). Conclusion: Based on our study results, the use of foot orthoses in patients suffering from rheumatoid arthritis and nonspecific arthropathy was an effective treatment choice, leading to decreased pain, fewer deformities, and improved quality of life. Keywords: orthoses, epidemiological analysis, rheumatoid arthritis, rehabilitation
Procedia PDF Downloads 232
1394 Analyzing Consumer Preferences and Brand Differentiation in the Notebook Market via Social Media Insights and Expert Evaluations
Authors: Mohammadreza Bakhtiari, Mehrdad Maghsoudi, Hamidreza Bakhtiari
Abstract:
This study investigates consumer behavior in the notebook computer market by integrating social media sentiment analysis with expert evaluations. The rapid evolution of the notebook industry has intensified competition among manufacturers, necessitating a deeper understanding of consumer priorities. Social media platforms, particularly Twitter, have become valuable sources for capturing real-time user feedback. In this research, sentiment analysis was performed on Twitter data gathered in the last two years, focusing on seven major notebook brands. The PyABSA framework was utilized to extract sentiments associated with various notebook components, including performance, design, battery life, and price. Expert evaluations, conducted using fuzzy logic, were incorporated to assess the impact of these sentiments on purchase behavior. To provide actionable insights, the TOPSIS method was employed to prioritize notebook features based on a combination of consumer sentiments and expert opinions. The findings consistently highlight price, display quality, and core performance components, such as RAM and CPU, as top priorities across brands. However, lower-priority features, such as webcams and cooling fans, present opportunities for manufacturers to innovate and differentiate their products. The analysis also reveals subtle but significant brand-specific variations, offering targeted insights for marketing and product development strategies. For example, Lenovo's strong performance in display quality points to a competitive edge, while Microsoft's lower ranking in battery life indicates a potential area for R&D investment. This hybrid methodology demonstrates the value of combining big data analytics with expert evaluations, offering a comprehensive framework for understanding consumer behavior in the notebook market. 
The study emphasizes the importance of aligning product development and marketing strategies with evolving consumer preferences, ensuring competitiveness in a dynamic market. It also underscores the potential for innovation in seemingly less important features, providing companies with opportunities to create unique selling points. By bridging the gap between consumer expectations and product offerings, this research equips manufacturers with the tools needed to remain agile in responding to market trends and enhancing customer satisfaction. Keywords: consumer behavior, customer preferences, laptop industry, notebook computers, social media analytics, TOPSIS
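The abstract does not publish its decision matrix, criterion weights, or fuzzy expert scores; as a minimal sketch of the TOPSIS ranking step it describes, using hypothetical notebook scores and weights, the computation might look like this:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix:  (m alternatives x n criteria) raw scores
    weights: n criterion weights summing to 1
    benefit: n booleans, True = higher-is-better criterion
    """
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # 1. vector-normalise each criterion column, then apply the weight
    norm = m / np.linalg.norm(m, axis=0)
    v = norm * w
    # 2. ideal best/worst per criterion (direction depends on criterion type)
    best = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    # 3. Euclidean distances and relative closeness to the ideal solution
    d_best = np.linalg.norm(v - best, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)

# Hypothetical example: 3 notebooks scored on price (cost criterion),
# display quality, and CPU performance (benefit criteria)
scores = [[1200, 8.0, 7.5],
          [900, 7.0, 8.0],
          [1500, 9.0, 9.0]]
closeness = topsis(scores, [0.4, 0.3, 0.3], [False, True, True])
ranking = np.argsort(closeness)[::-1]  # indices of alternatives, best first
```

Higher closeness means the alternative sits nearer the ideal and farther from the anti-ideal; the weights here stand in for the fuzzy expert evaluations the study combines with sentiment scores.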
Procedia PDF Downloads 24
1393 Mechanical Transmission of Parasites by Cockroaches Collected from Urban Environment of Lahore, Pakistan
Authors: Hafsa Memona, Farkhanda Manzoor
Abstract:
Cockroaches are termed medically important pests because of their wide distribution in human habitations, including houses, hospitals, food industries, and kitchens. They may harbour multiple-drug-resistant pathogenic bacteria and protozoan parasites on their external surfaces, disseminate them onto human food, and cause serious diseases and allergies in humans. Hence, owing to their nocturnal activity and feeding behaviour, they are regarded as mechanical vectors in human habitations. Viable eggs and dormant cysts of parasites can hitch a ride on cockroaches: ova and cysts of parasitic organisms may settle into the crevices and cracks between the thorax and head, and the many fissures, clefts, and crannies on a cockroach's body provide sites for these organisms. This study aimed to identify the role of cockroaches in mechanically transmitting and disseminating gastrointestinal parasites in two environmental settings, hospitals and houses, in the urban area of Lahore. In total, 250 adult cockroaches were collected from houses and hospitals by sticky traps and food-baited traps and screened for parasitic load. All cockroaches were captured during their feeding time in their natural habitat. Direct wet smears, 1% Lugol's iodine, and modified acid-fast bacilli staining were used to identify parasites from the body surfaces of the cockroaches. Two common species of cockroaches were collected from human habitations, i.e., P. americana and B. germanica. The results showed that 112 (46.8%) cockroaches harboured at least one human intestinal parasite on their body surfaces. Cockroaches from the hospital environment harboured more parasites than those from houses: 47 (33.57%) cockroaches from houses and 65 (59.09%) from hospitals were infected with parasitic organisms. Of these, 76 (67.85%) were parasitic protozoans and 36 (32.15%) were pathogenic and non-pathogenic intestinal parasites. P. americana harboured more parasites than B. germanica in both environments. The most common human intestinal parasites found on the cockroaches included ova of Ascaris lumbricoides (giant roundworm), Trichuris trichiura (whipworm), Ancylostoma duodenale (hookworm), Enterobius vermicularis (pinworm), Taenia spp., and Strongyloides stercoralis (threadworm). Cysts of protozoan parasites, including Balantidium coli, Entamoeba histolytica, C. parvum, Isospora belli, Giardia duodenalis, and C. cayetanensis, were isolated and identified from the cockroaches. The two experimental sites differed significantly in the parasitic load carried by cockroaches. Differences in the hygienic conditions of the environments, including human excrement disposal and the habitats and species involved (indoor and outdoor), may account for the observed variation in the parasitic carriage rates of cockroaches among the experimental sites. A key finding of this study is thus that cockroaches are uniformly distributed in human habitations and act as mechanical vectors of pathogenic parasites that cause common illnesses such as diarrhoea and bowel disorders. This fact contributes to the epidemiological chain; therefore, control of cockroaches will significantly lessen the prevalence of illness in humans. Effective control strategies will reduce the public health burden of gastrointestinal parasites in developing countries. Keywords: cockroaches, health risks, hospitals, houses, parasites, protozoans, transmission
Procedia PDF Downloads 281
1392 Automated Computer-Vision Analysis Pipeline of Calcium Imaging Neuronal Network Activity Data
Authors: David Oluigbo, Erik Hemberg, Nathan Shwatal, Wenqi Ding, Yin Yuan, Susanna Mierau
Abstract:
Introduction: Calcium imaging is an established technique in neuroscience research for detecting activity in neural networks. Bursts of action potentials in neurons lead to transient increases in intracellular calcium, visualized with fluorescent indicators. Manual identification of cell bodies and their contours by experts typically takes 10-20 minutes per calcium imaging recording. Our aim, therefore, was to design an automated pipeline to facilitate and optimize calcium imaging data analysis. Our pipeline aims to accelerate cell body and contour identification and the production of graphical representations reflecting changes in neuronal calcium-based fluorescence. Methods: We created a Python-based pipeline that uses OpenCV (a computer vision Python package) to accurately (1) detect neuron contours, (2) extract the mean fluorescence within each contour, and (3) identify transient changes in the fluorescence due to neuronal activity. The pipeline consisted of three Python scripts, each of which could be easily accessed through a Python Jupyter notebook. In total, we tested this pipeline on ten separate calcium imaging datasets from dissociated murine cortical cultures. We then compared our automated pipeline outputs with manually labeled data for neuronal cell locations and the corresponding fluorescence time series generated by an expert neuroscientist. Results: Our results show that our automated pipeline efficiently pinpoints neuronal cell body locations and contours and provides a graphical representation of neural network metrics accurately reflecting changes in neuronal calcium-based fluorescence. The pipeline detected the shape, area, and location of most neuronal cell body contours by using grayscale image conversion and binary thresholding to allow computer vision to better distinguish between cells and non-cells. Its results were comparable to manually analyzed results, but with result acquisition times significantly reduced to 2-5 minutes per recording versus 10-20 minutes. Based on these findings, our next step is to precisely measure the specificity and sensitivity of the automated pipeline's cell body and contour detection in order to extract more robust neural network metrics and dynamics. Conclusion: Our Python-based pipeline performed automated computer-vision-based analysis of calcium imaging recordings of neuronal cell bodies in neuronal cell cultures. Our new goal is to improve cell body and contour detection to produce more robust, accurate neural network metrics and dynamic graphs. Keywords: calcium imaging, computer vision, neural activity, neural networks
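The abstract does not publish the pipeline's code; as a minimal sketch of step (3), detecting fluorescence transients in a single neuron's mean-fluorescence trace, the logic could look like the following. The baseline percentile, threshold, and frame rate are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def detect_transients(trace, fps=10.0, baseline_pct=20, z_thresh=3.0):
    """Flag candidate calcium transients in a mean-fluorescence trace.

    trace: 1-D array of mean fluorescence inside one neuron's contour.
    Baseline F0 is taken as a low percentile of the trace; events are
    frames where dF/F0 rises above z_thresh robust standard deviations.
    All parameter values here are illustrative assumptions.
    """
    f = np.asarray(trace, dtype=float)
    f0 = np.percentile(f, baseline_pct)           # baseline fluorescence F0
    dff = (f - f0) / f0                           # dF/F0
    # robust spread estimate (median absolute deviation scaled to SD)
    sigma = 1.4826 * np.median(np.abs(dff - np.median(dff)))
    active = dff > z_thresh * sigma
    # rising edges of the active mask are transient onsets
    onsets = np.flatnonzero(np.diff(active.astype(int)) == 1) + 1
    return dff, onsets / fps                      # dF/F0 trace, onset times (s)

# Synthetic trace: flat baseline of 100 with two transient events
trace = np.full(200, 100.0)
trace[50:60] += 40.0
trace[120:130] += 35.0
dff, onsets = detect_transients(trace)
```

A real recording would feed the per-frame mean fluorescence extracted from each OpenCV-detected contour into this function, one trace per neuron.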
Procedia PDF Downloads 82
1391 Improve Divers Tracking and Classification in Sonar Images Using Robust Diver Wake Detection Algorithm
Authors: Mohammad Tarek Al Muallim, Ozhan Duzenli, Ceyhun Ilguy
Abstract:
Harbor protection systems are of great importance, and the need for automatic protection systems has increased over recent years. Diver-detection active sonar is of particular significance: it is used to detect underwater threats such as divers and autonomous underwater vehicles. To detect such threats automatically, the sonar image is processed by algorithms that detect, track, and classify underwater objects. In this work, a diver tracking and classification algorithm is improved by proposing a robust wake detection method. To detect objects, the sonar image is normalized and then segmented based on a fixed threshold. Next, the centroids of the segments are found and clustered based on a distance metric. A linear Kalman filter is then applied to track the objects. To reduce the effect of noise and the creation of false tracks, the Kalman tracker is fine-tuned; the tuning is based on our active sonar specifications. After the tracks are initialized and updated, they are subjected to a filtering stage to eliminate noisy and unstable tracks, as well as objects with speeds outside the diver speed range, such as buoys and fast boats. The resulting tracks are then subjected to a classification stage to decide the type of object being tracked; specifically, whether the tracked object is an open-circuit or closed-circuit diver. At the classification stage, a small area around the object is extracted and a novel wake detection method is applied. The morphological features of the object together with its wake are extracted, and a support vector machine is used to find the best classifier. The training and test sonar images were collected by ARMELSAN Defense Technologies Company using the portable diver-detection sonar ARAS-2023. After applying the algorithm to the test sonar data, we obtain fine, stable tracks of the divers. The total classification accuracy achieved for the diver type is 97%. Keywords: harbor protection, diver detection, active sonar, wake detection, diver classification
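The tracking stage described above, a linear Kalman filter applied to clustered segment centroids, can be sketched as follows. The constant-velocity state model, noise covariances, and update rate are illustrative assumptions, not the authors' sonar-specific tuning:

```python
import numpy as np

class KalmanTracker:
    """Linear constant-velocity Kalman filter for 2-D sonar centroids.

    State: [x, y, vx, vy]; measurement: [x, y].  The noise levels q and r
    are illustrative stand-ins for the sonar-specific tuning in the paper.
    """
    def __init__(self, x0, y0, dt=1.0, q=0.01, r=0.5):
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4) * 10.0                 # initial state uncertainty
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt          # position += velocity * dt
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0         # we observe position only
        self.Q = np.eye(4) * q                    # process noise
        self.R = np.eye(2) * r                    # measurement noise

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with the measured centroid z = [x, y]
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                         # filtered position estimate

# Hypothetical diver moving at ~0.5 m/s along x, one centroid per ping
tracker = KalmanTracker(0.0, 0.0)
for t in range(1, 20):
    pos = tracker.step([0.5 * t, 0.0])
```

The estimated velocity `tracker.x[2:]` is what the filtering stage would gate against the plausible diver speed range to reject buoys and fast boats.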
Procedia PDF Downloads 238
1390 Correlative Study of Serum Interleukin-18 and Disease Activity, Functional Disability and Quality of Life in Rheumatoid Arthritis Patients
Authors: Hamdy Khamis Korayem, Manal Yehia Tayel, Abeer Shawky El Hadedy, Emmanuel Kamal Aziz Saba, Shimaa Badr Abdelnaby Badr
Abstract:
The aim of the current study was to demonstrate whether serum interleukin-18 (IL-18) is increased in rheumatoid arthritis (RA) and whether it correlates with disease activity, functional disability, and quality of life in RA patients. The study included 30 RA patients and 20 healthy normal control subjects. The RA patients were diagnosed according to the 2010 ACR/EULAR classification criteria for RA, with the exclusion of those who had diabetes mellitus, endocrine disorders, associated rheumatologic diseases, viral hepatitis B or C, or other diseases with increased serum IL-18 levels. All patients underwent clinical evaluation of the musculoskeletal system. Disease activity was assessed by the disease activity score 28 with four variables (DAS 28). Functional disability was assessed by the health assessment questionnaire disability index (HAQ-DI). Quality of life was assessed by the Short Form-36 (SF-36) questionnaire. Radiological assessment of both hands and feet was performed using the Sharp/van der Heijde (SvH) scoring method. Laboratory parameters, including erythrocyte sedimentation rate (ESR), C-reactive protein (CRP), rheumatoid factor (RF), and anti-cyclic citrullinated peptide antibody (ACPA), were assessed in patients, and the serum level of IL-18 was measured in both patients and control subjects. There was no statistically significant difference between the patient and control groups as regards age and sex. Among patients, 29% were females, and the age range was 25 to 55 years. Extra-articular manifestations were present in 56.7% of the patients. The mean DAS 28 score was 5.73±1.46, the mean HAQ-DI was 1.22±0.72, and the mean SF-36 was 40.03±13.96. The level of serum IL-18 was significantly higher in patients than in control subjects (P = 0.030). Serum IL-18 was correlated with ACPA among the patient group. There were no statistically significant correlations between serum IL-18 and DAS28, HAQ-DI, SF-36, total SvH score, or the other laboratory results. In conclusion, IL-18 is significantly higher in RA patients than in healthy control subjects and is positively correlated with ACPA level. IL-18 is associated with extra-articular manifestations. However, it is not correlated with the other laboratory parameters, disease activity, functional disability, quality of life, or radiological severity. Keywords: disease activity score, Interleukin-18, quality of life assessment, rheumatoid arthritis
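The four-variable DAS 28 used above follows the standard published DAS28-ESR formula. As a sketch, with the standard coefficients and conventional cut-offs but hypothetical patient values (not data from this study):

```python
import math

def das28_esr(tender28, swollen28, esr, global_health):
    """Four-variable DAS28-ESR (standard published coefficients).

    tender28/swollen28: joint counts out of 28; esr in mm/h;
    global_health: patient global assessment on a 0-100 mm VAS.
    """
    return (0.56 * math.sqrt(tender28)
            + 0.28 * math.sqrt(swollen28)
            + 0.70 * math.log(esr)
            + 0.014 * global_health)

def activity_category(score):
    """Conventional DAS28 activity cut-offs."""
    if score < 2.6:
        return "remission"
    if score <= 3.2:
        return "low"
    if score <= 5.1:
        return "moderate"
    return "high"

# Hypothetical patient; the cohort mean of 5.73 falls in the same band
score = das28_esr(tender28=10, swollen28=8, esr=40, global_health=50)
category = activity_category(score)
```

By these cut-offs, the cohort's mean DAS 28 of 5.73 corresponds to high disease activity.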
Procedia PDF Downloads 325
1389 Anodic Stability of Li₆PS₅Cl/PEO Composite Polymer Electrolytes for All-Solid-State Lithium Batteries: A First-Principles Molecular Dynamics Study
Authors: Hao-Wen Chang, Santhanamoorthi Nachimuthu, Jyh-Chiang Jiang
Abstract:
All-solid-state lithium batteries (ASSLBs) are increasingly recognized as a safer and more reliable alternative to conventional lithium-ion batteries due to their non-flammable nature and enhanced safety performance. ASSLBs utilize a range of solid-state electrolytes, including solid polymer electrolytes (SPEs), inorganic solid electrolytes (ISEs), and composite polymer electrolytes (CPEs). SPEs are particularly valued for their flexibility, ease of processing, and excellent interfacial compatibility with electrodes, though their ionic conductivity remains a significant limitation. ISEs, on the other hand, provide high ionic conductivity, broad electrochemical windows, and strong mechanical properties but often face poor interfacial contact with electrodes, impeding performance. CPEs, which merge the strengths of SPEs and ISEs, represent a compelling solution for next-generation ASSLBs by addressing both electrochemical and mechanical challenges. Despite their potential, the mechanisms governing lithium-ion transport within these systems remain insufficiently understood. In this study, we designed CPEs based on argyrodite-type Li₆PS₅Cl (LPSC) combined with two distinct polymer matrices: poly(ethylene oxide) (PEO) with 24.5 wt% lithium bis(trifluoromethane)sulfonimide (LiTFSI) and polycaprolactone (PCL) with 25.7 wt% LiTFSI. Through density functional theory (DFT) calculations, we investigated the interfacial chemistry of these materials, revealing critical insights into their stability and interactions. Additionally, ab initio molecular dynamics (AIMD) simulations of lithium electrodes interfaced with LPSC layers containing polymers and LiTFSI demonstrated that the polymer matrix significantly mitigates LPSC decomposition, compared to systems with only a lithium electrode and LPSC layers. 
These findings underscore the pivotal role of CPEs in improving the performance and longevity of ASSLBs, offering a promising path forward for next-generation energy storage technologies. Keywords: all-solid-state lithium-ion batteries, composite solid electrolytes, DFT calculations, Li-ion transport
Procedia PDF Downloads 20