Search results for: temporal-distance manipulation
48 Assessing the Material Determinants of Cavity Polariton Relaxation using Angle-Resolved Photoluminescence Excitation Spectroscopy
Authors: Elizabeth O. Odewale, Sachithra T. Wanasinghe, Aaron S. Rury
Abstract:
Cavity polaritons form when molecular excitons strongly couple to photons in carefully constructed optical cavities. These polaritons, which are hybrid light-matter states possessing a unique combination of photonic and excitonic properties, present the opportunity to manipulate the properties of various semiconductor materials. The systematic manipulation of materials through polariton formation could potentially improve the functionalities of many optoelectronic devices such as lasers, light-emitting diodes, photon-based quantum computers, and solar cells. However, the prospects of leveraging polariton formation for novel devices and device operation depend on more complete connections between the properties of molecular chromophores and the hybrid light-matter states they form, which remain an outstanding scientific goal. Specifically, for most optoelectronic applications, it is paramount to understand how polariton formation affects the spectra of light absorbed by molecules coupled strongly to cavity photons. An essential feature of a polariton state is its dispersive energy, which occurs due to the enhanced spatial delocalization of the polaritons relative to bare molecules. To leverage the spatial delocalization of cavity polaritons, angle-resolved photoluminescence excitation spectroscopy was employed to characterize light emission from the polaritonic states. Using lasers of appropriate energies, the polariton branches were resonantly excited to understand how molecular light absorption changes under different strong light-matter coupling conditions. Since an excited state has a finite lifetime, the excited polariton decays non-radiatively into lower-lying molecular states, from which radiative relaxation to the ground state occurs. The resulting fluorescence is collected across several angles of excitation incidence. By modeling the behavior of the light emission observed from the lower-lying molecular state and combining this result with the output of angle-resolved transmission measurements, inferences are drawn about how the behavior of molecules changes when they form polaritons. These results show how the intrinsic properties of molecules, such as the excitonic lifetime, affect the rate at which the polaritonic states relax. While it is true that the lifetime of the photon mediates the rate of relaxation in a cavity, the results from this study provide evidence that the lifetime of the molecular exciton also limits the rate of polariton relaxation.
Keywords: fluorescence, molecules in cavities, optical cavity, photoluminescence excitation, spectroscopy, strong coupling
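As an illustration of the dispersive polariton energies discussed in this abstract, the following Python sketch evaluates the standard two-level coupled-oscillator model of a planar microcavity, in which the angle-dependent cavity photon mixes with a non-dispersive molecular exciton. All parameter values (exciton energy, cavity cut-off, effective index, Rabi splitting) are illustrative placeholders, not values fitted to the study's data.

```python
import numpy as np

def polariton_branches(theta_deg, E_exc=2.1, E_cav0=2.0, n_eff=1.6, rabi=0.15):
    """Lower/upper polariton energies (eV) versus incidence angle for a planar cavity.

    Two-level coupled-oscillator model: the cavity photon disperses with angle,
    the molecular exciton does not, and the Rabi splitting `rabi` mixes them.
    All parameter values here are illustrative placeholders, not fitted data.
    """
    theta = np.radians(np.asarray(theta_deg, dtype=float))
    # Angle-dependent cavity mode energy for an effective intracavity index n_eff
    E_cav = E_cav0 / np.sqrt(1.0 - (np.sin(theta) / n_eff) ** 2)
    mean = 0.5 * (E_cav + E_exc)
    split = 0.5 * np.sqrt(rabi ** 2 + (E_cav - E_exc) ** 2)
    return mean - split, mean + split  # (lower branch, upper branch)

angles = np.arange(0, 61, 5)
lp, up = polariton_branches(angles)
for a, e_lp, e_up in zip(angles, lp, up):
    print(f"{a:3d} deg  LP = {e_lp:.3f} eV  UP = {e_up:.3f} eV")
```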
Procedia PDF Downloads 74
47 Fabrication of High-Aspect Ratio Vertical Silicon Nanowire Electrode Arrays for Brain-Machine Interfaces
Authors: Su Yin Chiam, Zhipeng Ding, Guang Yang, Danny Jian Hang Tng, Peiyi Song, Geok Ing Ng, Ken-Tye Yong, Qing Xin Zhang
Abstract:
Brain-machine interfaces (BMI) are a field rich in exploration opportunities, in which manipulation of neural activity is used to interconnect with a myriad of external devices. Research and intensive development have evolved into various areas, from the medical field and the gaming and entertainment industry to the safety and security field. The technology has been extended to therapy for neurological disorders such as obsessive-compulsive disorder and Parkinson’s disease by introducing current pulses to specific regions of the brain. Nonetheless, the work to develop a brain-machine interface system for real-time observing, recording, and altering of neural signals will require a significant amount of effort to overcome the obstacles in improving this system without delay in response. To date, the feature size of interface devices and the density of the electrode population remain limitations in achieving seamless performance of BMI. Currently, the size of BMI devices ranges from 10 to 100 microns in terms of electrode diameter. Henceforth, to accommodate precise monitoring at the single-cell level, smaller and denser nanoscale nanowire electrode arrays are vital to fabricate. In this paper, we showcase the fabrication of high-aspect-ratio vertical silicon nanowire electrode arrays using a microelectromechanical systems (MEMS) method. Nanofabrication of the nanowire electrodes involves deep reactive ion etching, thermal oxide thinning, electron-beam lithography patterning, sputtering of metal targets, and bottom anti-reflection coating (BARC) etching. Metallization of the nanowire electrode tip is a prominent process for optimizing the nanowire's electrical conductivity, and this step remains a challenge during fabrication. Metal electrodes were lithographically defined, yet these metal contacts outline a size scale that is larger than nanometer-scale building blocks, hence further limiting potential advantages. Therefore, we present an integrated contact solution that overcomes this size constraint through a self-aligned nickel silicidation process on the tips of the vertical silicon nanowire electrodes. A 4 x 4 array of vertical silicon nanowire electrodes with a diameter of 290 nm and a height of 3 µm has been successfully fabricated.
Keywords: brain-machine interfaces, microelectromechanical systems (MEMS), nanowire, nickel silicide
Procedia PDF Downloads 435
46 Enhancement of Cross-Linguistic Effect with the Increase in the Multilingual Proficiency during Early Childhood: A Case Study of English Language Acquisition by a Pre-School Child
Authors: Anupama Purohit
Abstract:
The paper is a study of the inevitable cross-linguistic effect found in early multilingual learners. Cross-linguistic behaviours such as code-mixing, code-switching, foreign accent, literal translation, redundancy, and syntactic manipulation, effected by other languages on the English-language output of a non-native pre-school child, are discussed here. A case study method is adopted in this paper to support the claim of the title. The language behaviour of a simultaneously tetralingual pre-school child (between ages 1;3 and 4;0) is analysed here. The sample output data of the child are gathered from the diary entries maintained by her family, regular observations, and video recordings made since her birth. She receives input in her mother tongue, Sambalpuri, from her grandparents only; in Hindi, the local language, from her play-school and the neighbourhood; in English only from her mother and occasional visits of other family friends; and in Odia only during the reading of the Odia story book. The child is exposed to code-mixing of all the languages throughout her childhood. But code-mixing, literal translation, redundancy, and duplication were absent in her initial stage of multilingual acquisition. As the child was more proficient in English than in her other first languages and had never heard code-mixing in English, it was expected from her input pattern of English (one parent, English language) that she would maintain purity in her use of English while talking to an English-language interlocutor. But with a gradual increase in proficiency in each of her languages, her handling of the multiple codes became cross-linguistically deft. It can be deduced from the case study that, after attaining a certain milestone proficiency in each language, the child’s linguistic faculty can operate at a metalinguistic level. The functional use of each morpheme, their arrangement in words and sentences, the suprasegmental features, lexical-semantic mapping, culture-specific use of a language, and pragmatic skills converge to give a typically childlike multilingual output that is intelligible to multilingual people (with the same set of languages in combination). The result is appealing because, for the same ideas that the child used to express (perhaps with grammatically incorrect expressions) in one language, she gradually starts showing cross-linguistic effects in her expressions. So the paper pleads for the separatist view from the very beginning of the holophrastic phase (as the child expresses herself in addressee-specific language); but the development of a metalinguistic ability that helps the child communicate in a sophisticated way according to the linguistic status of the addressee is unique to the multilingual child. This metalinguistic ability is independent of the mode of input of a multilingual child.
Keywords: code-mixing, cross-linguistic effect, early multilingualism, literal translation
Procedia PDF Downloads 300
45 Breast Cancer Metastasis Detection and Localization through Transfer-Learning Convolutional Neural Network Classification Based on Convolutional Denoising Autoencoder Stack
Authors: Varun Agarwal
Abstract:
Introduction: With the advent of personalized medicine, histopathological review of whole slide images (WSIs) for cancer diagnosis presents an exceedingly time-consuming, complex task. Specifically, detecting metastatic regions in WSIs of sentinel lymph node biopsies necessitates a full-scanned, holistic evaluation of the image. Thus, digital pathology, low-level image manipulation algorithms, and machine learning provide significant advancements in improving the efficiency and accuracy of WSI analysis. Using Camelyon16 data, this paper proposes a deep learning pipeline to automate and ameliorate breast cancer metastasis localization and WSI classification. Methodology: The model broadly follows five stages -region of interest detection, WSI partitioning into image tiles, convolutional neural network (CNN) image-segment classifications, probabilistic mapping of tumor localizations, and further processing for whole WSI classification. Transfer learning is applied to the task, with the implementation of Inception-ResNetV2 - an effective CNN classifier that uses residual connections to enhance feature representation, adding convolved outputs in the inception unit to the proceeding input data. Moreover, in order to augment the performance of the transfer learning CNN, a stack of convolutional denoising autoencoders (CDAE) is applied to produce embeddings that enrich image representation. Through a saliency-detection algorithm, visual training segments are generated, which are then processed through a denoising autoencoder -primarily consisting of convolutional, leaky rectified linear unit, and batch normalization layers- and subsequently a contrast-normalization function. A spatial pyramid pooling algorithm extracts the key features from the processed image, creating a viable feature map for the CNN that minimizes spatial resolution and noise. Results and Conclusion: The simplified and effective architecture of the fine-tuned transfer learning Inception-ResNetV2 network enhanced with the CDAE stack yields state of the art performance in WSI classification and tumor localization, achieving AUC scores of 0.947 and 0.753, respectively. The convolutional feature retention and compilation with the residual connections to inception units synergized with the input denoising algorithm enable the pipeline to serve as an effective, efficient tool in the histopathological review of WSIs.Keywords: breast cancer, convolutional neural networks, metastasis mapping, whole slide images
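The transfer-learning classification stage described above can be sketched with a few lines of Keras. The snippet below is a minimal, hedged illustration: it uses a pretrained InceptionResNetV2 backbone with a small binary head for tumor/normal tile classification, and deliberately omits the paper's convolutional denoising autoencoder embeddings, saliency-based tile selection, and spatial pyramid pooling; the tile size and head layers are assumptions, not the authors' exact configuration.

```python
import tensorflow as tf

def build_tile_classifier(input_shape=(299, 299, 3)):
    """Transfer-learning classifier for tumor/normal WSI tiles (illustrative sketch).

    Uses InceptionResNetV2 pretrained on ImageNet as a frozen feature extractor
    with a small trainable head; the paper's denoising-autoencoder embeddings and
    spatial pyramid pooling stage are not reproduced here.
    """
    base = tf.keras.applications.InceptionResNetV2(
        include_top=False, weights="imagenet", input_shape=input_shape)
    base.trainable = False  # fine-tune later by unfreezing the top blocks

    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.applications.inception_resnet_v2.preprocess_input(inputs)
    x = base(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.3)(x)
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # P(tumor) per tile

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="binary_crossentropy",
                  metrics=[tf.keras.metrics.AUC(name="auc")])
    return model

model = build_tile_classifier()
model.summary()
```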
Procedia PDF Downloads 132
44 Biodegradable Self-Supporting Nanofiber Membranes Prepared by Centrifugal Spinning
Authors: Milos Beran, Josef Drahorad, Ondrej Vltavsky, Martin Fronek, Jiri Sova
Abstract:
While most nanofibers are produced using electrospinning, this technique suffers from several drawbacks, such as the requirement for specialized equipment, high electrical potential, and electrically conductive targets. Consequently, recent years have seen the increasing emergence of novel strategies for generating nanofibers at larger scale and higher throughput. Centrifugal spinning is a simple, cheap, and highly productive technology for nanofiber production. In principle, the drawing of a solution filament into nanofibers using centrifugal spinning is achieved through the controlled manipulation of centrifugal force, viscoelasticity, and the mass transfer characteristics of the spinning solutions. Engineering efforts of researchers at the Food Research Institute Prague and the Czech Technical University in the field of centrifugal nozzleless spinning led to the introduction of a pilot-plant demonstrator, NANOCENT. The main advantages of the demonstrator are lower investment cost (thanks to simpler construction compared to widely used electrospinning equipment), higher production speed, new application possibilities, and easy maintenance. Centrifugal nozzleless spinning is especially suitable for producing submicron fibers from polymeric solutions in highly volatile solvents, such as chloroform, DCM, THF, or acetone. To date, submicron fibers have been prepared from PS, PUR, and biodegradable polyesters, such as PHB, PLA, PCL, or PBS. The products are in the form of 3D structures or nanofiber membranes. Unique self-supporting nanofiber membranes were prepared from the biodegradable polyesters in different mixtures. The nanofiber membranes have been tested for different applications. Filtration efficiencies for water solutions and aerosols in air were evaluated. Different active inserts were added to the solutions before the spinning process, such as inorganic nanoparticles, organic precursors of metal oxides, antimicrobial and wound-healing compounds, or photocatalytic phthalocyanines. Sintering can subsequently be carried out to remove the polymeric material and convert the organic precursors into metal oxides, such as SiO2, or photocatalytic ZnO2 and TiO2, to obtain inorganic nanofibers. Electrospinning is a more suitable technology than centrifugal nozzleless spinning for producing membranes for filtration applications, because it forms more homogeneous nanofiber layers and fibers with smaller diameters. The self-supporting nanofiber membranes prepared from the biodegradable polyesters are especially suitable for medical applications, such as wound or burn healing dressings or tissue engineering scaffolds. This work was supported by research grant TH03020466 of the Technology Agency of the Czech Republic.
Keywords: polymeric nanofibers, self-supporting nanofiber membranes, biodegradable polyesters, active inserts
Procedia PDF Downloads 166
43 Diminishing Constitutional Hyper-Rigidity by Means of Digital Technologies: A Case Study on E-Consultations in Canada
Authors: Amy Buckley
Abstract:
The purpose of this article is to assess the problem of constitutional hyper-rigidity to consider how it and the associated tensions with democratic constitutionalism can be diminished by means of using digital democratic technologies. In other words, this article examines how digital technologies can assist us in ensuring fidelity to the will of the constituent power without paying the price of hyper-rigidity. In doing so, it is impossible to ignore that digital strategies can also harm democracy through, for example, manipulation, hacking, ‘fake news,’ and the like. This article considers the tension between constitutional hyper-rigidity and democratic constitutionalism and the relevant strengths and weaknesses of digital democratic strategies before undertaking a case study on Canadian e-consultations and drawing its conclusions. This article observes democratic constitutionalism through the lens of the theory of deliberative democracy to suggest that the application of digital strategies can, notwithstanding their pitfalls, improve a constituency’s amendment culture and, thus, diminish constitutional hyper-rigidity. Constitutional hyper-rigidity is not a new or underexplored concept. At a high level, a constitution can be said to be ‘hyper-rigid’ when its formal amendment procedure is so difficult to enact that it does not take place or is limited in its application. This article claims that hyper-rigidity is one problem with ordinary constitutionalism that fails to satisfy the principled requirements of democratic constitutionalism. Given the rise and development of technology that has taken place since the Digital Revolution, there has been a significant expansion in the possibility for digital democratic strategies to overcome the democratic constitutionalism failures resulting from constitutional hyper-rigidity. Typically, these strategies have included, inter alia, e- consultations, e-voting systems, and online polling forums, all of which significantly improve the ability of politicians and judges to directly obtain the opinion of constituents on any number of matters. This article expands on the application of these strategies through its Canadian e-consultation case study and presents them as a solution to poor amendment culture and, consequently, constitutional hyper-rigidity. Hyper-rigidity is a common descriptor of many written and unwritten constitutions, including the United States, Australian, and Canadian constitutions as just some examples. This article undertakes a case study on Canada, in particular, as it is a jurisdiction less commonly cited in academic literature generally concerned with hyper-rigidity and because Canada has to some extent, championed the use of e-consultations. In Part I of this article, I identify the problem, being that the consequence of constitutional hyper-rigidity is in tension with the principles of democratic constitutionalism. In Part II, I identify and explore a potential solution, the implementation of digital democratic strategies as a means of reducing constitutional hyper-rigidity. In Part III, I explore Canada’s e-consultations as a case study for assessing whether digital democratic strategies do, in fact, improve a constituency’s amendment culture thus reducing constitutional hyper-rigidity and the associated tension that arises with the principles of democratic constitutionalism. 
The idea is to run a case study and then assess whether I can generalise the conclusions.Keywords: constitutional hyper-rigidity, digital democracy, deliberative democracy, democratic constitutionalism
Procedia PDF Downloads 79
42 Advertising Disability Index: A Content Analysis of Disability in Television Commercial Advertising from 2018
Authors: Joshua Loebner
Abstract:
Tectonic shifts within the advertising industry regularly and repeatedly present a deluge of data to be intuited across a spectrum of key performance indicators, with innumerable interpretations, where live campaigns are vivisected to pivot towards coalescence amongst a digital diaspora. But within this amalgam of analytics, validation, and creative campaign manipulation, where do diversity and disability inclusion fit in? In 2018, several major brands were able to answer this question definitively and directly by incorporating people with disabilities into advertisements. Disability inclusion, representation, and portrayals are documented annually across a number of different media, from film to primetime television, but ongoing studies centering on advertising have not been conducted. Symbols and semiotics in advertising often focus on a brand’s features and benefits, but this analysis of advertising and disability shows how, in 2018, creative campaigns and the disability community came together with the goal of continuing the momentum and sparking conversations. More brands are welcoming inclusion and sharing positive portrayals of intersectional diversity and disability. Within the analysis and surrounding scholarship, a multipoint analysis of each advertisement and a meta-interpretation of the research have been conducted to provide data, clarity, and contextualization of insights. This research presents an advertising disability index that can be monitored for trends and shifts in future studies and used to provide further comparisons and contrasts of advertisements. An overview of the increasing buying power within the disability community and population changes among this group anchors the significance and size of the minority in the US. When possible, viewpoints from the creative teams and advertisers that developed the ads are brought into the research to further establish understanding, meaning, and individuals’ purposeful approaches towards disability inclusion. Finally, the conclusion and discussion present key takeaways to learn from the research and build advocacy and action both within advertising scholarship and the profession. This study, developed into an advertising disability index, will answer questions of how people with disabilities are represented in each ad. In advertising that includes disability, there is a creative pendulum. At one extreme, among many other negative interpretations, people with disabilities are portrayed in a way that conveys pity, fosters ableism and discrimination, and shows that people with disabilities are less than normal from a societal and cultural perspective. At the other extreme, people with disabilities are portrayed as a type of undue inspiration, considered inspiration porn, or as superhuman, otherwise known as supercrip, and in ways that most people with disabilities could never achieve or don’t want to be seen for. While some ads reflect both extremes, others stood out for non-polarizing inclusion of people with disabilities. This content analysis explores television commercial advertisements to determine the presence of people with disabilities and any other associated disability themes and/or concepts. Content analysis will allow for measuring the presence and interpretation of disability portrayals in each ad.
Keywords: advertising, brand, disability, marketing
Procedia PDF Downloads 120
41 Working Memory and Phonological Short-Term Memory in the Acquisition of Academic Formulaic Language
Authors: Zhicheng Han
Abstract:
This study examines the correlation between knowledge of formulaic language, working memory (WM), and phonological short-term memory (PSTM) in Chinese L2 learners of English. This study investigates whether WM and PSTM correlate differently with the acquisition of formulaic language, which may be relevant for the discourse around the conceptualization of formulas. Connectionist approaches have led scholars to argue that formulas are form-meaning connections stored whole, making PSTM significant in the acquisitional process as it pertains to the storage and retrieval of chunk information. Generativist scholars, on the other hand, have argued for the active participation of interlanguage grammar in the acquisition and use of formulaic language, where formulas are represented in the mind but retain an internal structure built around a lexical core. This would make WM, especially the processing component of WM, an important cognitive factor since it plays a role in processing and holding information for further analysis and manipulation. The current study asked L1 Chinese learners of English enrolled in graduate programs in China to complete a preference ranking task in which they ranked their preference for formulas, grammatical non-formulaic expressions, and ungrammatical phrases with and without the lexical core in academic contexts. Participants were asked to rank the options in order of how likely they would be to encounter these phrases in the test sentences within academic contexts. Participants’ syntactic proficiency was controlled with a cloze test and a grammar test. Regression analysis found a significant relationship between the processing component of WM and preference for formulaic expressions in the preference ranking task, while no significant correlation was found for PSTM or syntactic proficiency. The correlational analysis found that WM, PSTM, and the two proficiency test scores covary significantly. However, WM and PSTM have different predictive values for participants’ preference for formulaic language. Both storage and processing components of WM are significantly correlated with the preference for formulaic expressions, while PSTM is not. These findings favor the role of interlanguage grammar and syntactic knowledge in the acquisition of formulaic expressions. The differing effects of WM and PSTM suggest that selective attention to and processing of the input beyond simple retention play a key role in successfully acquiring formulaic language. Similar correlational patterns were found for preferring ungrammatical phrases containing the lexical core of the formula over those without the lexical core, attesting to learners’ awareness of the lexical core around which formulas are constructed. These findings support the view that formulaic phrases retain internal syntactic structures that are recognized and processed by the learners.
Keywords: formulaic language, working memory, phonological short-term memory, academic language
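For readers interested in how a regression of this kind might be set up, the sketch below uses statsmodels to regress a preference-ranking score on WM processing, WM storage, PSTM, and a proficiency control. The column names and the synthetic data are placeholders for illustration, not the study's variables or measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-participant scores; column names and values are placeholders,
# not the study's actual variables or data.
rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "formula_pref": rng.normal(0, 1, n),   # preference-ranking score for formulas
    "wm_processing": rng.normal(0, 1, n),  # WM processing component
    "wm_storage": rng.normal(0, 1, n),     # WM storage component
    "pstm": rng.normal(0, 1, n),           # phonological short-term memory span
    "cloze": rng.normal(0, 1, n),          # syntactic proficiency control
})

model = smf.ols("formula_pref ~ wm_processing + wm_storage + pstm + cloze", data=df).fit()
print(model.summary())
```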
Procedia PDF Downloads 64
40 The Confluence between Autism Spectrum Disorder and the Schizoid Personality
Authors: Murray David Schane
Abstract:
Through years of clinical encounters with patients with autism spectrum disorders and those with a schizoid personality, the many defining diagnostic features shared between these conditions have been explored, current neurobiological differences have been reviewed, and critical, distinct treatment strategies for each have been devised. The paper compares and contrasts the apparent similarities between autism spectrum disorders and the schizoid personality, which are found in these DSM descriptive categories: restricted range of social-emotional reciprocity; poor non-verbal communicative behavior in social interactions; difficulty developing and maintaining relationships; detachment from social relationships; lack of the desire for or enjoyment of close relationships; and preference for solitary activities. In this paper, autism, fundamentally a communicative disorder, is revealed to present clinically as a pervasive aversive response to efforts to engage with or be engaged by others. Autists with the Asperger presentation typically have language but have difficulty understanding humor, irony, sarcasm, metaphoric speech, and even narratives about social relationships. They also tend to seek sameness, possibly to avoid problems of social interpretation. Repetitive behaviors engage many autists as a screen against ambient noise, social activity, and challenging interactions. Also in this paper, the schizoid personality is revealed as a pattern of social avoidance, self-sufficiency, and apparent indifference to others that serves as a complex psychological defense against a deep, long-abiding fear of appropriation and perverse manipulation. Neither genetic nor MRI studies have yet located the explanatory data that identifies the cause or the neurobiology of autism. Similarly, studies of the schizoid have yet to group that condition with those found in schizophrenia. Through presentations of clinical examples, the treatment of autists of the Asperger type is revealed to address the autist’s extreme social aversion, which also precludes the experience of empathy. Autists will be revealed as forming social attachments but without the capacity to interact with mutual concern. Empathy will be shown to be teachable and, as social avoidance relents, autists can come to recognize and acknowledge the meaning and signs of empathic needs. Treatment of schizoids will be shown to revolve around joining empathically with the schizoid’s apprehensions about interpersonal, interactive proximity. Models of both autism and schizoid personality traits have yet to be replicated in animals, thereby eliminating the role of translational research in providing the kind of clues to behavioral patterns that can be related to genetic, epigenetic, and neurobiological measures. But as these clinical examples will attest, treatment strategies have significant impact.
Keywords: autism spectrum, schizoid personality traits, neurobiological implications, critical diagnostic distinctions
Procedia PDF Downloads 115
39 Awake Fiberoptic Intubation for Airway Management in a Patient with an Ulceroproliferative Mass of the Aryepiglottic Fold Obscuring Glottic Opening
Authors: Dielle Martins
Abstract:
A 45-year-old female, Manju Devi, presented with a 6-month history of progressively changing voice, difficulty breathing for the past month, and worsening dysphagia for the past two weeks, particularly with solids. Direct laryngoscopy revealed an ulceroproliferative mass arising from the left aryepiglottic fold, obscuring the glottic opening. Imaging with contrast-enhanced CT of the neck showed a lobulated, heterogeneous mass in the hypo-pharyngeal region, encroaching into the airway and involving the aryepiglottic fold and pyriform sinus, raising concerns for a malignant lesion. Small reactive lymph nodes were identified in the left submandibular region and along the carotid sheath. Due to the location of the mass near the glottis and the risk of complete airway obstruction, securing the airway was a critical concern. An awake fiberoptic bronchoscopy for endotracheal intubation was chosen as the safest approach. The patient was prepped with local anesthesia to the airway using nebulized 10% lignocaine and 4% lignocaine spray to the oral mucosa. After obtaining informed consent, the patient was positioned supine on the operating table. To facilitate the fiberoptic intubation, the patient’s neck was extended, and the head was laterally rotated 30 degrees to the left. This positioning helped optimize the visualization of the glottic opening, which was obscured by the mass. The fiberoptic scope was carefully passed through the oral cavity, past the uvula, and into the laryngeal area. As the scope advanced, the ulceroproliferative mass was observed covering most of the glottis, with only the anterior commissure visible. After further gentle manipulation, including the use of a shoulder roll for additional neck extension and rotation, a clearer view of the anterior two-thirds of the glottis was achieved. A 6.5mm internal diameter endotracheal tube was advanced over the fiberoptic scope and successfully positioned just above the carina. General anesthesia was then induced, and an excision biopsy of the growth was performed. This case underscores the importance of careful preoperative airway evaluation and the role of awake fiberoptic intubation in managing complex airway obstructions. Proper patient positioning, including neck extension and lateral rotation, proved crucial for successful intubation in the presence of a mass obstructing the glottic opening. This case emphasizes the techniques used in the fiberoptic intubation and the careful positioning of the patient, which were critical for the success of the procedure.Keywords: awake fiberoptic bronchoscopy in laryngeal growth, Difficult intubation in glottic cancer, glottic cancer, difficult airway
Procedia PDF Downloads 8
38 Public Participation in Political Transformation: From the Coup D’etat in 2014 to the Events Leading up to the Proposed Election in 2018 in Thailand
Authors: Pataramon Satalak, Sakrit Isariyanon, Teerapong Puripanik
Abstract:
This article uses the recent events in Thailand as a case study for examining why democratic transition is necessary during political upheaval to ensure that the people’s power remains unaffected. After seizing power in May 2014, the military, backed by anti-government protestors, selected and established their own system to govern the country. They set up the National Council for Peace and Order (NCPO), which established a People’s Assembly, aiming to reach a compromise between the conflicting opinions of former pro-government and anti-government protesters. It plans to achieve this through political reform before returning sovereign power to the people via an election in 2018. If a governmental authority is not representative of the people (e.g., a military government), it does not count as a legitimate government. During the last four years of military government, from May 2014 to January 2018, their rule of Thailand has been widely controversial, specifically regarding their commitment to democracy, human rights violations, and their manipulation of the rule of law. Democratic legitimacy relies not only on established mechanisms for public participation (like referendums or elections) but also on public participation based on accessible and educational reform (often via NGOs) to ensure that the free and fair will of the people can be expressed. Through their actions over the last three years, the Thai military government has damaged both of these components, impacting future public participation in politics. The authors make some observations about the specific actions the military government has taken to erode the democratic legitimacy of future public participation: the increasing dominance of military courts over civil courts; civil society’s limited involvement in political activities; the drafting of a new constitution, their attempt to muster support through referenda, and the consequent delay of the organic law-making process; the structure of the legislative powers (the Senate and the members of parliament); and the control of people’s basic freedoms of expression, movement, and assembly in political activities. One clear consequence of the military government’s specific actions over the last three years is the increased uncertainty amongst Thai people that their fundamental freedoms and political rights will be respected in the future. This will directly affect their participation in future democratic processes. The military government’s actions (e.g., their response to the UN representatives) will also have influenced potential international engagement in Thai civil society to help educate disadvantaged people about their rights and their participation in the political arena. These actions challenge the democratic idea that there should be a checking and balancing of power between people and government. These examples provide evidence that a democratic transition is crucial during any process of political transformation.
Keywords: political transformation, public participation, Thailand coup d'etat 2014, election 2018
Procedia PDF Downloads 149
37 Effect of Vitrification on Embryos Euploidy Obtained from Thawed Oocytes
Authors: Natalia Buderatskaya, Igor Ilyin, Julia Gontar, Sergey Lavrynenko, Olga Parnitskaya, Ekaterina Ilyina, Eduard Kapustin, Yana Lakhno
Abstract:
Introduction: It is known that cryopreservation of oocytes has peculiar features due to the complex structure of the oocyte. One of the most important features is that mature oocytes contain the meiotic division spindle, which is very sensitive even to the slightest variation in temperature. Thus, the main objective of this study is to analyse the resulting euploid embryos obtained from thawed oocytes in comparison with the data of preimplantation genetic screening (PGS) in fresh embryo cycles. Material and Methods: The study was conducted at 'Medical Centre IGR' from January to July 2016. Data were analysed for 908 donor oocytes obtained in 67 cycles of assisted reproductive technologies (ART), of which 693 oocytes were used in 51 'fresh' cycles (group A) and 215 oocytes in 16 ART programs with vitrification of female gametes (group B). The average ages of donors in the groups were 27.3±2.9 and 27.8±6.6 years. Stimulation of superovulation was conducted in the standard way. Vitrification was performed 1-2 hours after transvaginal puncture, and thawing of oocytes was carried out in accordance with the standard protocol of Cryotech (Japan). ICSI was performed 4-5 hours after transvaginal follicle puncture for fresh oocytes, or after thawing for vitrified female gametes. For the PGS, an embryonic biopsy was done on the third or the fifth day after fertilization. Diagnostic procedures were performed using fluorescence in situ hybridization for chromosomes 13, 16, 18, 21, 22, X, and Y. Only morphologically high-quality blastocysts, assessed according to the Gardner criteria, were used for transfer. Statistical hypotheses were tested using the t and χ² criteria at significance levels of p<0.05, p<0.01, and p<0.001. Results: The mean number of mature oocytes per cycle was 13.58±6.65 in group A and 13.44±6.68 in group B per patient. The survival of oocytes after thawing totaled 95.3% (n=205), which indicates the high quality of the vitrification performed. The proportion of zygotes was 91.1% (n=631) in group A and 80.5% (n=165) in group B, a statistically significant difference between the groups (p<0.001), explained by the elimination of non-viable oocytes after vitrification. This is confirmed by the fact that on the fifth day of embryo development there was no statistically significant difference in the proportion of blastocysts (p>0.05), which constituted 61.6% (n=389) and 63.0% (n=104) in the respective groups. For the PGS, 250 embryos were analyzed in group A and 72 embryos in group B. The results showed that 40.0% (n=100) of embryos in group A and 41.7% (n=30) in group B were euploid for the studied chromosomes, with no statistically significant difference (p>0.05). Clinical pregnancy rates in the groups were 64.7% (22 pregnancies per 34 embryo transfers) and 61.5% (8 pregnancies per 13 embryo transfers), respectively, also with no significant difference between the groups (p>0.05). Conclusions: The results showed that vitrification does not affect the proportion of euploid embryos obtained in assisted reproductive technologies and is not reflected in their morphological characteristics in ART programs.
Keywords: euploid embryos, preimplantation genetic screening, thawing oocytes, vitrification
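The reported difference in fertilization rates can be checked with a standard χ² test of independence using the counts given in the abstract (631 zygotes from 693 fresh oocytes versus 165 zygotes from 205 surviving thawed oocytes); the snippet below is a minimal sketch of that comparison, not the authors' original analysis code.

```python
from scipy.stats import chi2_contingency

# Fertilization (zygote) counts reported in the abstract:
# group A (fresh): 631 zygotes from 693 injected oocytes (91.1%)
# group B (thawed): 165 zygotes from 205 surviving oocytes (80.5%)
table = [[631, 693 - 631],
         [165, 205 - 165]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.2e}")  # p < 0.001, matching the reported difference
```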
Procedia PDF Downloads 334
36 Modeling the Relation between Discretionary Accrual Earnings Management, International Financial Reporting Standards and Corporate Governance
Authors: Ikechukwu Ndu
Abstract:
This study examines the econometric modeling of the relation between discretionary accrual earnings management, International Financial Reporting Standards (IFRS), and certain corporate governance factors with regard to listed Nigerian non-financial firms. Although discretionary accrual earnings management is a well-known and global problem that has an adverse impact on users of the financial statements, its relationship with IFRS and corporate governance is neither adequately researched nor properly systematically investigated in Nigeria. The dearth of research in the relation between discretionary accrual earnings management, IFRS and corporate governance in Nigeria has made it difficult for academics, practitioners, government setting bodies, regulators and international bodies to achieve a clearer understanding of how discretionary accrual earnings management relates to IFRS and certain corporate governance characteristics. This is the first study to the author’s best knowledge to date that makes interesting research contributions that significantly add to the literature of discretionary accrual earnings management and its relation with corporate governance and IFRS pertaining to the Nigerian context. A comprehensive review is undertaken of the literature of discretionary total accrual earnings management, IFRS, and certain corporate governance characteristics as well as the data, models, methodologies, and different estimators used in the study. Secondary financial statement, IFRS, and corporate governance data are sourced from Bloomberg database and published financial statements of Nigerian non-financial firms for the period 2004 to 2016. The methodology uses both the total and working capital accrual basis. This study has a number of interesting preliminary findings. First, there is a negative relationship between the level of discretionary accrual earnings management and the adoption of IFRS. However, this relationship does not appear to be statistically significant. Second, there is a significant negative relationship between the size of the board of directors and discretionary accrual earnings management. Third, CEO Separation of roles does not constrain earnings management, indicating the need to preserve relationships, personal connections, and maintain bonded friendships between the CEO, Chairman, and executive directors. Fourth, there is a significant negative relationship between discretionary accrual earnings management and the use of a Big Four firm as an auditor. Fifth, including shareholders in the audit committee, leads to a reduction in discretionary accrual earnings management. Sixth, the debt and return on assets (ROA) variables are significant and positively related to discretionary accrual earnings management. Finally, the company size variable indicated by the log of assets is surprisingly not found to be statistically significant and indicates that all Nigerian companies irrespective of size engage in discretionary accrual management. In conclusion, this study provides key insights that enable a better understanding of the relationship between discretionary accrual earnings management, IFRS, and corporate governance in the Nigerian context. 
It is expected that the results of this study will be of interest to academics, practitioners, regulators, governments, international bodies and other parties involved in policy setting and economic development in areas of financial reporting, securities regulation, accounting harmonization, and corporate governance.Keywords: discretionary accrual earnings management, earnings manipulation, IFRS, corporate governance
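A hedged sketch of the kind of pooled regression described above is given below: absolute discretionary accruals are regressed on an IFRS-adoption indicator and governance and firm characteristics, with firm-clustered standard errors. The panel here is synthetic and every column name is an illustrative assumption, not the study's Bloomberg dataset or exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic placeholder panel standing in for the firm-year data; all column
# names and values are illustrative, not the study's Bloomberg dataset.
rng = np.random.default_rng(1)
n = 400
panel = pd.DataFrame({
    "firm_id": np.repeat(np.arange(40), 10),
    "abs_disc_accruals": rng.gamma(2.0, 0.02, n),   # |DA| from an accrual-expectation model
    "ifrs_adopted": rng.integers(0, 2, n),
    "board_size": rng.integers(5, 15, n),
    "ceo_duality": rng.integers(0, 2, n),
    "big4_auditor": rng.integers(0, 2, n),
    "audit_committee_shareholders": rng.integers(0, 2, n),
    "leverage": rng.uniform(0.0, 0.8, n),
    "roa": rng.normal(0.05, 0.1, n),
    "log_assets": rng.normal(10, 1.5, n),
})

spec = ("abs_disc_accruals ~ ifrs_adopted + board_size + ceo_duality + big4_auditor"
        " + audit_committee_shareholders + leverage + roa + log_assets")
results = smf.ols(spec, data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["firm_id"]})  # firm-clustered SEs
print(results.summary())
```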
Procedia PDF Downloads 146
35 Preliminary Design, Production and Characterization of a Coral and Alginate Composite for Bone Engineering
Authors: Sthephanie A. Colmenares, Fabio A. Rojas, Pablo A. Arbeláez, Johann F. Osma, Diana Narvaez
Abstract:
The loss of functional tissue is a ubiquitous and expensive health care problem, with very limited treatment options for these patients. The gold standard for large bone damage is cadaveric bone used as an allograft with stainless steel support; however, this solution only applies to bones with simple morphologies (long bones), has a limited material supply, and presents long-term problems regarding mechanical strength, integration, differentiation, and induction of native bone tissue. Therefore, the fabrication of a scaffold with biological, physical, and chemical properties similar to those of human bone, together with a fabrication method that allows morphology manipulation, is the focus of this investigation. Towards this goal, an alginate and coral matrix was created using two production techniques; the coral was chosen because of its chemical composition and the alginate due to its compatibility and mechanical properties. In order to construct the coral-alginate scaffold, the following methodology was employed: cleaning of the coral, its pulverization, scaffold fabrication, and finally mechanical and biological characterization. The experimental design had two factors, mill method and proportion of alginate and coral, with two and three levels respectively, using 5 replicates. The coral was cleaned with sodium hypochlorite and hydrogen peroxide in an ultrasonic bath. Then, it was milled with both a horizontal and a ball mill in order to evaluate the morphology of the particles obtained. After this, using a combination of alginate and coral powder and water as a binder, scaffolds of 1 cm³ were printed with a Spectrum Z510 3D printer. This resulted in solid cubes that were resistant to small compressive stresses. Then, using an ESQUIM DP-143 silicone mold, the constructs used for the mechanical and biological assays were made. An INSTRON 2267 was used for the compression tests; the density and porosity were calculated with an analytical balance, and the biological tests were performed using cell cultures with VERO fibroblasts and scanning electron microscopy (SEM) as the visualization tool. The Young’s moduli were dependent on the pulverization method, the proportion of coral and alginate, and the interaction between these factors. The maximum value was 5.4 MPa for the 50/50 proportion of alginate and horizontally milled coral. The biological assay showed more extracellular matrix in the scaffolds consisting of more alginate and less coral. The density and porosity were proportional to the amount of coral in the powder mix. These results showed that this composite has potential as a biomaterial, but its behavior is elastic with a small Young’s modulus, which leads to the conclusion that the application may not be for long bones but for tissues similar to cartilage.
Keywords: alginate, biomaterial, bone engineering, coral, Porites astreoides, SEM
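As a brief illustration of how the compressive Young's modulus reported above can be extracted from raw test data, the sketch below converts force-displacement readings for a cubic specimen into engineering stress and strain and fits the initial linear region. The specimen size matches the 1 cm³ cubes, but the data, the 2% strain cut-off, and the function names are assumptions, not the study's measurements.

```python
import numpy as np

def youngs_modulus(force_N, displacement_mm, side_mm=10.0):
    """Estimate the compressive Young's modulus (MPa) of a cubic specimen.

    Converts force/displacement to engineering stress/strain for a cube of the
    given side length and fits the initial linear (elastic) region.
    """
    area_mm2 = side_mm ** 2
    stress = np.asarray(force_N) / area_mm2          # MPa (N/mm^2)
    strain = np.asarray(displacement_mm) / side_mm   # dimensionless
    # Fit only the small-strain region, here the first 2% strain (an assumption)
    mask = strain <= 0.02
    slope, _ = np.polyfit(strain[mask], stress[mask], 1)
    return slope

# Illustrative data, not measurements from the study
disp = np.linspace(0, 0.5, 50)                 # mm
force = 5.4 * (disp / 10.0) * 100.0            # N, corresponding to E of about 5.4 MPa
print(f"E = {youngs_modulus(force, disp):.2f} MPa")
```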
Procedia PDF Downloads 255
34 Electrohydrodynamic Patterning for Surface Enhanced Raman Scattering for Point-of-Care Diagnostics
Authors: J. J. Rickard, A. Belli, P. Goldberg Oppenheimer
Abstract:
Medical diagnostics, environmental monitoring, homeland security and forensics increasingly demand specific and field-deployable analytical technologies for quick point-of-care diagnostics. Although technological advancements have made optical methods well-suited for miniaturization, a highly-sensitive detection technique for minute sample volumes is required. Raman spectroscopy is a well-known analytical tool, but has very weak signals and hence is unsuitable for trace level analysis. Enhancement via localized optical fields (surface plasmons resonances) on nanoscale metallic materials generates huge signals in surface-enhanced Raman scattering (SERS), enabling single molecule detection. This enhancement can be tuned by manipulation of the surface roughness and architecture at the sub-micron level. Nevertheless, the development and application of SERS has been inhibited by the irreproducibility and complexity of fabrication routes. The ability to generate straightforward, cost-effective, multiplex-able and addressable SERS substrates with high enhancements is of profound interest for SERS-based sensing devices. While most SERS substrates are manufactured by conventional lithographic methods, the development of a cost-effective approach to create nanostructured surfaces is a much sought-after goal in the SERS community. Here, a method is established to create controlled, self-organized, hierarchical nanostructures using electrohydrodynamic (HEHD) instabilities. The created structures are readily fine-tuned, which is an important requirement for optimizing SERS to obtain the highest enhancements. HEHD pattern formation enables the fabrication of multiscale 3D structured arrays as SERS-active platforms. Importantly, each of the HEHD-patterned individual structural units yield a considerable SERS enhancement. This enables each single unit to function as an isolated sensor. Each of the formed structures can be effectively tuned and tailored to provide high SERS enhancement, while arising from different HEHD morphologies. The HEHD fabrication of sub-micrometer architectures is straightforward and robust, providing an elegant route for high-throughput biological and chemical sensing. The superior detection properties and the ability to fabricate SERS substrates on the miniaturized scale, will facilitate the development of advanced and novel opto-fluidic devices, such as portable detection systems, and will offer numerous applications in biomedical diagnostics, forensics, ecological warfare and homeland security.Keywords: hierarchical electrohydrodynamic patterning, medical diagnostics, point-of care devices, SERS
Procedia PDF Downloads 347
33 Reliable and Error-Free Transmission through Multimode Polymer Optical Fibers in House Networks
Authors: Tariq Ahamad, Mohammed S. Al-Kahtani, Taisir Eldos
Abstract:
Optical communications technology has made enormous and steady progress for several decades, providing the key resource in our increasingly information-driven society and economy. Much of this progress has been in finding innovative ways to increase the data-carrying capacity of a single optical fiber. In this research article, we explore basic issues of security and reliability for secure and reliable information transfer through the fiber infrastructure. Conspicuously, one potentially enormous source of improvement has, however, been left untapped in these systems: fibers can easily support hundreds of spatial modes, but today’s commercial systems (single-mode or multi-mode) make no attempt to use these as parallel channels for independent signals. Bandwidth, performance, reliability, cost efficiency, resiliency, redundancy, and security are some of the demands placed on telecommunications today. Since their initial development, fiber optic systems have had the advantage over copper-based and wireless telecommunications solutions in most of these requirements. The largest obstacle preventing most businesses from implementing fiber optic systems was cost. With the recent advancements in fiber optic technology and the ever-growing demand for more bandwidth, the cost of installing and maintaining fiber optic systems has been reduced dramatically. With so many advantages, including cost efficiency, there will continue to be an increase in fiber optic systems replacing copper-based communications. This will also lead to an increase in the expertise and technology intruders need to tap into fiber optic networks. As ever, all technologies have been subject to hacking and criminal manipulation, and fiber optics is no exception. Research on fiber optic security vulnerabilities suggests that not everyone who is responsible for their network’s security is aware of the different methods that intruders use to hack virtually undetected into fiber optic cables. With millions of miles of fiber optic cables stretching across the globe and carrying information including, but certainly not limited to, government, military, and personal information such as medical records, banking information, driving records, and credit card information, being aware of fiber optic security vulnerabilities is essential and critical. Many articles and studies still suggest that fiber optics is expensive, impractical, and hard to tap. Others argue that tapping is not only easily done but also inexpensive. This paper will briefly discuss the history of fiber optics, explain the basics of fiber optic technologies, and then discuss the vulnerabilities in fiber optic systems and how they can be better protected. Knowing the security risks and knowing the options available may save a company a lot of embarrassment, time, and, most importantly, money.
Keywords: in-house networks, fiber optics, security risk, money
Procedia PDF Downloads 423
32 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods
Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard
Abstract:
Non-invasive samples are an alternative to collecting genetic samples directly. Non-invasive samples are collected without manipulation of the animal (e.g., scats, feathers, and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, leading to poorer extraction efficiency and genotyping. These errors delayed the widespread use of non-invasive genetic information for some years. Genotyping errors can be limited by using analysis methods that accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms can be highlighted as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, there is still a lack of empirical comparison of their performance. A comparison of methods using datasets differing in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for obtaining information, especially for endangered and rare populations. To compare the analysis methods, four different datasets obtained from the Dryad digital repository were used. Three different matching algorithms (Cervus, Colony, and Error Tolerant Likelihood Matching - ETLM) were used for matching genotypes and two (Capwire and BayesN) for population estimation. The three matching algorithms showed different patterns of results. ETLM produced a smaller number of unique individuals and recaptures. A similarity in the matched genotypes between Colony and Cervus was observed. That is not a surprise given the similarity between those methods in their pairwise likelihood and clustering algorithms. The ETLM matches showed almost no similarity with the genotypes matched by the other methods. The different clustering algorithm and error model of ETLM seem to lead to a more discerning selection, although the processing time and interface friendliness of ETLM were the worst among the compared methods. The population estimators performed differently across the datasets. There was a consensus between the different estimators only for one dataset. BayesN showed both higher and lower estimates when compared with Capwire. BayesN does not consider the total number of recaptures, as Capwire does, but only the recapture events, which makes the estimator sensitive to data heterogeneity. Heterogeneity in this sense means different capture rates between individuals. In these examples, tolerance for homogeneity seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. A broader analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of the matching methods indicates that Colony seems to be more appropriate for general use, considering the time/interface/robustness balance. The heterogeneity of the recaptures strongly affected the BayesN estimates, leading to over- and underestimation of population numbers. Capwire is therefore advisable for general use, since it performs better in a wide range of situations.
Keywords: algorithms, genetics, matching, population
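To make the idea of error-tolerant genotype matching concrete, the sketch below groups non-invasive samples whose multilocus genotypes differ at no more than a fixed number of loci, skipping loci that failed to amplify. This is a deliberately simplified mismatch-count rule for illustration only; it is not the likelihood-based matching implemented in Cervus, Colony, or ETLM, and the toy genotypes are hypothetical.

```python
from itertools import combinations

def allele_mismatches(g1, g2):
    """Count loci at which two multilocus genotypes share no allele.

    Each genotype is a list of (allele_a, allele_b) tuples, one per locus;
    None marks a locus that failed to amplify and is skipped.
    """
    mismatches = 0
    for loc1, loc2 in zip(g1, g2):
        if loc1 is None or loc2 is None:
            continue
        if not set(loc1) & set(loc2):
            mismatches += 1
    return mismatches

def match_samples(genotypes, max_mismatch=1):
    """Group samples whose genotypes differ at no more than max_mismatch loci.

    A simple mismatch-count rule, not the likelihood models used by Cervus,
    Colony, or ETLM; it only illustrates error-tolerant matching.
    """
    pairs = [(i, j) for i, j in combinations(range(len(genotypes)), 2)
             if allele_mismatches(genotypes[i], genotypes[j]) <= max_mismatch]
    # Union-find style grouping of matched samples into putative individuals
    parent = list(range(len(genotypes)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j in pairs:
        parent[find(i)] = find(j)
    groups = {}
    for idx in range(len(genotypes)):
        groups.setdefault(find(idx), []).append(idx)
    return list(groups.values())

# Toy scat genotypes at three loci (hypothetical allele sizes)
samples = [
    [(120, 124), (88, 90), (200, 204)],
    [(120, 124), (88, 90), None],          # same animal, one locus dropped out
    [(118, 126), (92, 94), (198, 202)],
]
print(match_samples(samples))  # -> [[0, 1], [2]]
```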
Procedia PDF Downloads 145
31 Journey to Inclusive School: Description of Crucial Sensitive Concepts in the Context of Situational Analysis
Authors: Denisa Denglerova, Radim Sip
Abstract:
Academic sources as well as international agreements and national documents define inclusion in terms of several criteria: equal opportunities, fulfilling individual needs, development of human resources, community participation. In order for these criteria to be met, the community must be cohesive. Community cohesion, which is a relatively new concept, is not determined by homogeneity, but by the acceptance of diversity among the community members and utilisation of its positive potential. This brings us to a central category of inclusion - appreciating diversity and using it to a positive effect. However, school diversity is a real phenomenon, which schools need to tackle more and more often. This is also indicated by the number of publications focused on diversity in schools. These sources present recent analyses of using identity as a tool of coping with the demands of a diversified society. The aim of this study is to identify and describe in detail the processes taking place in selected schools, which contribute to their pro-inclusive character. The research is designed around a multiple case study of three pro-inclusive schools. Paradigmatically speaking, the research is rooted in situational epistemology. This is also related to the overall framework of interpretation, for which we are going to use innovative methods of situational analysis. In terms of specific research outcomes this will manifest itself in replacing the idea of “objective theory” by the idea of “detailed cartography of a social world”. The cartographic approach directs both the logic of data collection and the choice of methods of their analysis and interpretation. The research results include detection of the following sensitive concepts: Key persons. All participants can contribute to promoting an inclusion-friendly environment; however, some do so with greater motivation than others. These could include school management, teachers with a strong vision of equality, or school counsellors. They have a significant effect on the transformation of the school, and are themselves deeply convinced that inclusion is necessary. Accordingly, they select suitable co-workers; they also inspire some of the other co-workers to make changes, leading by example. Employees with strongly opposing views gradually leave the school, and new members of staff are introduced to the concept of inclusion and openness from the beginning. Manifestations of school openness in working with diversity on all important levels. By this we mean positive manipulation with diversity both in the relationships between “traditional” school participants (directors, teachers, pupils) and school-parent relationships, or relationships between schools and the broader community, in terms of teaching methods as well as ways how the school culture affects the school environment. Other important detected concepts significantly helping to form a pro-inclusive environment in the school are individual and parallel classes; freedom and responsibility of both pupils and teachers, manifested on the didactic level by tendencies towards an open curriculum; ways of asserting discipline in the school environment.Keywords: inclusion, diversity, education, sensitive concept, situational analysis
Procedia PDF Downloads 20030 History of Russian Women: The Historical Overview of the Images and Roles of Women in Old and Modern Russia
Authors: Elena Chernyak
Abstract:
The status of Russian women has changed dramatically over the course of Russian history and under different leadership and economic, political, and social conditions. The perception of women, their submissive roles, and their low social status cause gender conflict that affects society: demographic problems, increased numbers of divorces, alcoholism, drug abuse, and crime. Despite the fact that around the world women are becoming more independent, protected by law, and playing more important roles in society, Russian women are still dependent on men financially, socially, and psychologically. This paper critically explores the experience of Russian women over the course of more than a thousand years of Russian history: how the position and image of women changed in the Russian Empire, Soviet, and post-Soviet Russia, and what role women play in contemporary Russia. The paper is the result of a close examination of historical and religious literature, mass media, internet sources, and documents. The analysis shows that throughout history, the role and image of women in society have repeatedly varied depending on ideological and social conditions. In particular, the history of Russian women may be divided into five main periods. The first was the period of paganism, when almost all areas of life were open to women and when women were almost equal to men in their social roles. During the second period, starting with the Mongol invasion in the 13th century, the position of women diminished owing to the social transformation to a patriarchal society in which women began playing a subordinate role in family and society. The third period, from the fourteenth through the sixteenth centuries, was a period of almost total seclusion of Russian women from every part of social life. The fourth, Soviet period started after the Revolution of 1917. During that time, the position of women changed drastically due to the transformation of traditional gender roles under the Bolshevik government. Women's role was that of worker-mothers, who had a double duty: worker and mother. The final period began after the collapse of the Soviet Union. The restructuring (Perestroika) and post-restructuring periods have had contradictory consequences and a tremendous impact on Russian society. The image of women as partners equal to men, which was promoted during the Soviet regime, has been replaced with traditional functionalist views on the family and the role of women, in which men and women have different but supposedly complementary roles. Modern Russia, despite publicly stating its commitment to equal rights, has for the last two decades been reverting to an older social model with its emphasis on traditional gender roles, patriarchal ideas of dominant masculinity, and adverse attitudes towards women, which are further supported and reinforced by the reviving Russian Orthodox Church. As demonstrated in this review, Russian women have never possessed the same rights as men and have always been subordinate to them. During every period of Russian history, the patriarchal ideology maintained and reinforced in Russian society has subjected women to manipulation, oppression, and victimization and has portrayed women as less than full human beings.Keywords: women, Russia, patriarchy, religion, Russian Orthodox Church
Procedia PDF Downloads 16929 Graphene Metamaterials Supported Tunable Terahertz Fano Resonance
Authors: Xiaoyong He
Abstract:
The manipulation of THz waves is still a challenging task due to the lack of natural materials that interact with them strongly. Metamaterials (MMs), designed by tailoring the characteristics of their unit cells (meta-molecules), may solve this problem. However, because of Ohmic and radiation losses, the performance of MM devices suffers from dissipation and low quality factors (Q-factors). This dilemma may be circumvented by Fano resonance, which arises from the destructive interference between a bright continuum mode and a dark discrete mode (or a narrow resonance). In contrast to the symmetric Lorentz spectral curve, a Fano resonance exhibits a distinctly asymmetric line shape, an ultrahigh quality factor, and steep variations in the spectral curves. Fano resonance is usually realized through symmetry breaking. However, if concentric double rings (DR) are placed close to each other, the near-field coupling between them gives rise to two hybridized modes (a bright mode and a narrowband dark mode) because of the local asymmetry, resulting in the characteristic Fano line shape. Furthermore, from a practical viewpoint, it is highly desirable to modulate the Fano spectral curves conveniently, which remains an important and interesting research topic. For current Fano systems, tunable spectral curves can be realized by adjusting the geometrical structural parameters or the magnetic fields biasing ferrite-based structures. However, due to the limited dispersion properties of active materials, it is still difficult to tailor Fano resonances conveniently with fixed structural parameters. With its favorable properties of extreme confinement and high tunability, graphene is a strong candidate for achieving this goal. The DR structure supports the excitation of so-called “trapped modes,” with the merits of a simple structure and high-quality resonances in thin structures. By depositing graphene circular DRs on a SiO2/Si/polymer substrate, the tunable Fano resonance has been theoretically investigated in the terahertz regime, including the effects of the graphene Fermi level, the structural parameters, and the operation frequency. The results show that the pronounced Fano peak can be efficiently modulated because of the strong coupling between the incident waves and the graphene ribbons. As the Fermi level increases, the peak amplitude of the Fano curve increases, and the resonant peak position shifts to higher frequency. The amplitude modulation depth of the Fano curves is about 30% as the Fermi level varies over the range 0.1-1.0 eV. The optimum gap distance between the DRs is about 8-12 μm, where the figure of merit peaks. As the graphene ribbon width increases, the Fano spectral curves broaden, and the resonant peak blue-shifts. These results are very helpful for developing novel graphene plasmonic devices, e.g., sensors and modulators.Keywords: graphene, metamaterials, terahertz, tunable
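For readers unfamiliar with the asymmetric line shape discussed above, the standard Fano profile (a textbook relation, not a result specific to this graphene study) can be written as:

```latex
% Standard Fano line shape: q is the asymmetry parameter, \omega_0 the resonance
% frequency, and \Gamma the linewidth of the discrete (dark) mode.
F(\epsilon) = \frac{(q+\epsilon)^2}{1+\epsilon^2},
\qquad
\epsilon = \frac{2(\omega-\omega_0)}{\Gamma}
```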
Procedia PDF Downloads 34628 Delusional Parasitosis (A Rare Primary Psychiatric Diagnosis)
Authors: Jaspinder Kaur, Jatinder Pal Singh
Abstract:
Introduction- Delusional parasitosis is a rare psychotic illness characterized by a fixed belief that the body is infested with parasites when in reality it is not. It is also known as Ekbom syndrome, delusional infestation, or acarophobia. The patient has no primary skin pathology; all skin findings are secondary to skin manipulation by the patient, which is why up to 90% of patients first seek consultation from a dermatologist. It is most commonly seen in older people, with a female-to-male ratio of 2:1. For treatment, the patient first needs to be investigated to rule out all other possible causes, as delusional parasitosis can be caused by vitamin B12 deficiency, pellagra, hepatic and renal disease, diabetes mellitus, multiple sclerosis, and leprosy. When all possible causes have been ruled out, a psychiatric referral should be made. Other psychiatric comorbidities should be ruled out and treated accordingly. Patients with delusional parasitosis respond well to second-generation antipsychotics and need continuous medication over years; relapse is likely if treatment is stopped. Case Presentation- A 79-year-old female of lower socio-economic status presented with complaints of an itching sensation with erythematous patches over the scalp and multiple scratch excoriation lesions over the scalp, face, and neck for the past 7-8 months. She had a feeling of small insects crawling under the skin of her body and scalp. To reduce the itching and kill the insects, she would scratch and squeeze her skin repeatedly. When the family tried to explain that there were no insects in her body, she was not convinced; rather, she became angry and abused family members for not believing her. Gradually her sleep became disturbed; she would be seen awake at night, scratching her skin, pulling her scalp hair, and even squeezing her healed lesions. She collected her skin debris and scalp hairs to look for insects. Because of her continuing illness, the patient became persistently sad and had crying spells. Her appetite decreased. She became socially isolated and stopped doing her activities of daily living. Family members first consulted a dermatologist, who investigated her thoroughly with routine investigations and an autoimmune and malignancy workup. As all investigations were normal, the patient was referred for psychiatric evaluation. The patient was started on Tablet Olanzapine 2.5 mg, gradually increased to 7.5 mg. Over one month, there was a reduction in itching and skin picking. The lesions gradually healed, and the patient continued to take her other dermatological medications and ointment; she has been in regular follow-up with psychiatric liaison for the past 2 months, with 70-80% improvement in her symptoms. Conclusion- Delusional parasitosis is a psychiatric disorder of insidious onset, commonly seen in middle-aged and older people. Psychiatric and dermatology consultation-liaison helps the patient obtain an early diagnosis and adequate treatment. In primary psychiatric cases, patients respond well to second-generation antipsychotics, but further evaluation and management are always required if the condition is secondary to a physical illness or another psychiatric comorbidity.Keywords: delusional parasitosis, delusional infestations, rare, primary psychiatric diagnosis, antipsychotic agents
Procedia PDF Downloads 8327 Automated System: Managing the Production and Distribution of Radiopharmaceuticals
Authors: Shayma Mohammed, Adel Trabelsi
Abstract:
Radiopharmacy is the art of preparing high-quality, radioactive, medicinal products for use in diagnosis and therapy. Unlike normal medicines, radiopharmaceuticals have a dual aspect (radioactive and medicinal) that makes their management highly critical. One of the most convincing applications of modern technologies is the ability to delegate the execution of repetitive tasks to programming scripts. Automation has found its way into the most skilled jobs, improving a company's overall performance by allowing human workers to focus on more important tasks than document filling. This project aims to contribute to the implementation of a comprehensive system that ensures rigorous management of radiopharmaceuticals through a platform linking the Nuclear Medicine Service Management System to the Nuclear Radiopharmacy Management System, in accordance with the recommendations of the World Health Organization (WHO) and the International Atomic Energy Agency (IAEA). In this project, we build a web application that targets radiopharmacies; the platform is built atop the inherently compatible web stack, which allows it to work in virtually any environment. Different technologies are used in this project (PHP, Symfony, MySQL Workbench, Bootstrap, Angular 7, Visual Studio Code and TypeScript). The operating principle of the platform is based on two main parts: a Radiopharmaceutical Backoffice for the radiopharmacist, who is responsible for the realization of radiopharmaceutical preparations and their delivery, and a Medical Backoffice for the doctor, who holds the authorization for the possession and use of radionuclides and is responsible for ordering radioactive products. The application consists of seven modules: Production, Quality Control/Quality Assurance, Release, General Management, References, Transport and Stock Management. It supports eight classes of users: the Production Manager (PM), Quality Control Manager (QCM), Stock Manager (SM), General Manager (GM), Client (Doctor), Parking and Transport Manager (PTM), Qualified Person (QP) and Technical and Production Staff. As a digital platform bringing together all players involved in the use of radiopharmaceuticals and integrating the stages of preparation, production and distribution, web technologies in particular promise to offer all the benefits of automation while requiring no more than a web browser to act as the user client, which is a strength because the web stack is by nature multi-platform. This platform will provide a traceability system for radiopharmaceutical products to ensure the safety and radioprotection of staff and patients. The new integrated platform is an alternative to writing all the boilerplate paperwork manually, which is a tedious and error-prone task. It minimizes manual human manipulation, which has proven to be the main source of error in nuclear medicine. A codified electronic transfer of information from radiopharmaceutical preparation to delivery will further reduce the risk of maladministration.Keywords: automated system, management, radiopharmacy, technical papers
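The abstract does not detail the platform's internal calculations, but any system that schedules radiopharmaceutical production and delivery has to decay-correct activities between dispensing and administration. The sketch below shows that standard correction in Python; the function names and the Tc-99m example are illustrative assumptions, not part of the described system.

```python
# Illustrative decay correction for scheduling radiopharmaceutical delivery.
# The platform described above is PHP/Symfony/Angular; this Python sketch only
# demonstrates the underlying physics: A(t) = A0 * exp(-ln(2) * t / T_half).
import math

def activity_at(a0_mbq: float, elapsed_h: float, half_life_h: float) -> float:
    """Activity remaining after elapsed_h hours, given initial activity a0_mbq."""
    return a0_mbq * math.exp(-math.log(2) * elapsed_h / half_life_h)

def required_dispense(target_mbq: float, transit_h: float, half_life_h: float) -> float:
    """Activity to dispense now so the target activity survives transit to the clinic."""
    return target_mbq * math.exp(math.log(2) * transit_h / half_life_h)

# Example with Tc-99m (half-life approximately 6.01 h): a 740 MBq dose needed after a 2 h transit.
print(round(required_dispense(740, 2.0, 6.01), 1), "MBq to dispense")
print(round(activity_at(740, 2.0, 6.01), 1), "MBq left if only 740 MBq were dispensed")
```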
Procedia PDF Downloads 15826 Revolutionizing Accounting: Unleashing the Power of Artificial Intelligence
Authors: Sogand Barghi
Abstract:
The integration of artificial intelligence (AI) in accounting practices is reshaping the landscape of financial management. This paper explores the innovative applications of AI in the realm of accounting, emphasizing its transformative impact on efficiency, accuracy, decision-making, and financial insights. By harnessing AI's capabilities in data analysis, pattern recognition, and automation, accounting professionals can redefine their roles, elevate strategic decision-making, and unlock unparalleled value for businesses. This paper delves into AI-driven solutions such as automated data entry, fraud detection, predictive analytics, and intelligent financial reporting, highlighting their potential to revolutionize the accounting profession. Artificial intelligence has swiftly emerged as a game-changer across industries, and accounting is no exception. This paper seeks to illuminate the profound ways in which AI is reshaping accounting practices, transcending conventional boundaries, and propelling the profession toward a new era of efficiency and insight-driven decision-making. One of the most impactful applications of AI in accounting is automation. Tasks that were once labor-intensive and time-consuming, such as data entry and reconciliation, can now be streamlined through AI-driven algorithms. This not only reduces the risk of errors but also allows accountants to allocate their valuable time to more strategic and analytical tasks. AI's ability to analyze vast amounts of data in real time enables it to detect irregularities and anomalies that might go unnoticed by traditional methods. Fraud detection algorithms can continuously monitor financial transactions, flagging any suspicious patterns and thereby bolstering financial security. AI-driven predictive analytics can forecast future financial trends based on historical data and market variables. This empowers organizations to make informed decisions, optimize resource allocation, and develop proactive strategies that enhance profitability and sustainability. Traditional financial reporting often involves extensive manual effort and data manipulation. With AI, reporting becomes more intelligent and intuitive. Automated report generation not only saves time but also ensures accuracy and consistency in financial statements. While the potential benefits of AI in accounting are undeniable, there are challenges to address. Data privacy and security concerns, the need for continuous learning to keep up with evolving AI technologies, and potential biases within algorithms demand careful attention. The convergence of AI and accounting marks a pivotal juncture in the evolution of financial management. By harnessing the capabilities of AI, accounting professionals can transcend routine tasks, becoming strategic advisors and data-driven decision-makers. The applications discussed in this paper underline the transformative power of AI, setting the stage for an accounting landscape that is smarter, more efficient, and more insightful than ever before. The future of accounting is here, and it's driven by artificial intelligence.Keywords: artificial intelligence, accounting, automation, predictive analytics, financial reporting
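As a concrete, simplified illustration of the fraud-detection idea described above (not a production system or any specific vendor's tool), the sketch below flags anomalous ledger entries with scikit-learn's IsolationForest; the feature set, contamination rate, and toy data are assumptions made for the example.

```python
# Minimal sketch of AI-based anomaly flagging for accounting transactions.
# Assumes scikit-learn is installed; features and threshold are illustrative only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Toy ledger: [amount, hour_of_day] for mostly routine transactions...
routine = np.column_stack([rng.normal(250, 50, 500), rng.normal(14, 2, 500)])
# ...plus a few suspicious ones (large amounts posted in the middle of the night).
suspicious = np.array([[9800, 3], [12500, 2]])
transactions = np.vstack([routine, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(transactions)
flags = model.predict(transactions)  # -1 marks anomalies
print(transactions[flags == -1])     # the night-time outliers should be among the flagged rows
```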
Procedia PDF Downloads 7225 Partially Aminated Polyacrylamide Hydrogel: A Novel Approach for Temporary Oil and Gas Well Abandonment
Authors: Hamed Movahedi, Nicolas Bovet, Henning Friis Poulsen
Abstract:
Following the advent of the Industrial Revolution, there has been a significant increase in the extraction and utilization of hydrocarbon and fossil fuel resources. However, a new era has emerged, characterized by a shift towards sustainable practices, namely the reduction of carbon emissions and the promotion of renewable energy generation. Given the substantial number of mature oil and gas wells that have been developed within the petroleum reservoir domain, it is imperative to establish an environmental strategy and adopt appropriate measures to effectively seal and decommission these wells. In general, a cement plug serves as the plugging material. Nevertheless, there exist scenarios in which the durability of such a plug is compromised, leading to the potential escape of hydrocarbons via fissures and fractures within the cement plug. Furthermore, cement is often not considered a practical solution for temporary plugging, particularly for well sites that have the potential for future gas storage or CO2 injection. The Danish oil and gas industry has promising potential as a prospective candidate for future carbon dioxide (CO2) injection, thereby contributing to the implementation of carbon capture strategies within Europe. The primary reservoir component is chalk, a rock characterized by limited permeability. This work focuses on the development and characterization of a novel hydrogel variant. The hydrogel is designed to be injected through a low-permeability reservoir and afterward to transform into a high-viscosity gel. The primary objective of this research is to explore the potential of this hydrogel as a new solution for effectively plugging well flow. Initially, the synthesis of polyacrylamide was carried out using radical polymerization in the reaction flask. Subsequently, through the Hofmann rearrangement, the polymer chain undergoes partial amination, facilitating its reaction with the crosslinker and enabling the formation of a hydrogel in the following stage. The organic crosslinker glutaraldehyde was employed to facilitate gel formation, which occurred when the polymeric solution was heated within a specified range of reservoir temperatures. Additionally, a rheological survey and gel-time measurements were conducted on several polymeric solutions to determine the optimal concentration. The findings indicate that the gel time depends on the starting concentration and ranges from 4 to 20 hours, hence allowing for manipulation to accommodate diverse injection strategies. Moreover, the findings indicate that the gel may be generated in acidic and highly saline environments. This property ensures the suitability of this substance for application in challenging reservoir conditions. The rheological investigation indicates that, prior to solidification, the polymeric solution exhibits the characteristics of a Herschel-Bulkley fluid with a somewhat elevated yield stress.Keywords: polyacrylamide, hofmann rearrangement, rheology, gel time
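For context on the rheological classification mentioned at the end of the abstract, the Herschel-Bulkley constitutive relation (a standard model, not a result of this study) is:

```latex
% Herschel-Bulkley model: shear stress \tau versus shear rate \dot{\gamma},
% with yield stress \tau_y, consistency index K and flow-behaviour index n.
\tau = \tau_y + K\,\dot{\gamma}^{\,n} \quad \text{for } \tau > \tau_y
```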
Procedia PDF Downloads 7924 An Early Intervention Framework for Supporting Students’ Mathematical Development in the Transition to University STEM Programmes
Authors: Richard Harrison
Abstract:
Developing competency in mathematics and related critical thinking skills is essential to the education of undergraduate students of Science, Technology, Engineering and Mathematics (STEM). Recently, the HE sector has been impacted by a seemingly widening disconnect between the mathematical competency of incoming first-year STEM students and their entrance qualification tariffs. Despite relatively high grades in A-Level Mathematics, students may initially lack fundamental skills in key areas such as algebraic manipulation and have limited capacity to apply problem-solving strategies. Compounded by the compensatory measures applied to entrance qualifications during the pandemic, there has been an associated decline in student performance on introductory university mathematics modules. In the UK, a number of online resources have been developed to help scaffold the transition to university mathematics. However, in general, these do not offer a structured learning journey focused on individual developmental needs, nor do they offer an experience coherent with the teaching and learning characteristics of the destination institution. In order to address some of these issues, a bespoke framework has been designed and implemented on our VLE in the Faculty of Engineering & Physical Sciences (FEPS) at the University of Surrey. Called the FEPS Maths Support Framework, it was conceived to scaffold the mathematical development of individuals prior to entering the university and during the early stages of their transition to undergraduate studies. More than 90% of our incoming STEM students voluntarily participate in the process. Students complete a set of initial diagnostic questions in the late summer. Based on their performance and feedback on these questions, they are subsequently guided to self-select specific mathematical topic areas for review using our proprietary resources. This further assists students in preparing for discipline-related diagnostic tests. The framework helps to identify students who are mathematically weak and facilitates early intervention to support students according to their specific developmental needs. This paper presents a summary of results from a rich data set captured from the framework over a 3-year period. Quantitative data provide evidence that students have engaged and developed during the process. This is further supported by process evaluation feedback from the students. Ranked performance data associated with seven key mathematical topic areas and eight engineering and science discipline areas reveal interesting patterns which can be used to identify more generic relative capabilities of the discipline area cohorts. In turn, this facilitates evidence-based management of the mathematical development of the new cohort, informing any associated adjustments to teaching and learning at a more holistic level. Evidence is presented establishing our framework as an effective early intervention strategy for addressing the sector-wide issue of supporting the mathematical development of STEM students transitioning to HE.Keywords: competency, development, intervention, scaffolding
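The abstract does not specify how diagnostic performance is mapped to recommended topic areas; purely as a hypothetical sketch of such a mapping (the topic names and the 60% cut-off are invented, not taken from the FEPS framework), one might write:

```python
# Hypothetical sketch of mapping diagnostic scores to recommended review topics.
# Topic names and the 0.6 threshold are invented; the framework's actual rules
# are not described in the abstract.
def recommend_topics(scores: dict[str, float], threshold: float = 0.6) -> list[str]:
    """Return topics (weakest first) scoring below the threshold fraction of marks."""
    weak = [(score, topic) for topic, score in scores.items() if score < threshold]
    return [topic for _, topic in sorted(weak)]

diagnostic = {
    "algebraic manipulation": 0.45,
    "trigonometry": 0.72,
    "differentiation": 0.58,
    "integration": 0.81,
}
print(recommend_topics(diagnostic))  # ['algebraic manipulation', 'differentiation']
```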
Procedia PDF Downloads 6723 Disrupting Traditional Industries: A Scenario-Based Experiment on How Blockchain-Enabled Trust and Transparency Transform Nonprofit Organizations
Authors: Michael Mertel, Lars Friedrich, Kai-Ingo Voigt
Abstract:
Based on principal-agent theory, an information asymmetry exists in the traditional donation process. Consumers cannot verify whether nonprofit organizations (NPOs) use raised funds according to the designated cause after the transaction has taken place (hidden action). Therefore, charity organizations have for decades tried to appear transparent and gain trust by using the same marketing instruments (e.g., releasing project success reports). However, none of these measures can guarantee consumers that charities will use their donations for the stated purpose. With awareness of the misuse of donations rising due to the Ukraine conflict (e.g., funding crime), consumers are increasingly concerned about the destination of their donations. Therefore, innovative charities like the Human Rights Foundation have started to offer donations via blockchain. Blockchain technology has the potential to establish profound trust and transparency in the donation process: consumers can publicly track the progress of their donation at any time after deciding to donate. This ensures that the charity is not using donations against their original intent. Hence, the aim is to investigate the effect of blockchain-enabled transactions on the willingness to donate. Sample and Design: To investigate consumers' behavior, we use a scenario-based experiment. After removing participants (e.g., due to failed attention checks), 3192 potential donors participated (47.9% female, 62.4% bachelor's degree or above). Procedure: We randomly assigned the participants to one of two scenarios. In all conditions, the participants read a scenario about a fictive charity organization called "Helper NPO." Afterward, the participants answered questions regarding their perception of the charity. Manipulation: The first scenario (n = 1405) represents a typical donation process, in which consumers donate money without any option to track and trace. The second scenario (n = 1787) represents a donation process via blockchain, in which consumers can track and trace their donations. Using t-statistics, the findings demonstrate a positive effect of donating via blockchain on participants' willingness to donate (mean difference = 0.667, p < .001, Cohen's d effect size = 0.482). A mediation analysis shows significant effects for the mediation of transparency (Estimate = 0.199, p < .001), trust (Estimate = 0.144, p < .001), and transparency and trust (Estimate = 0.158, p < .001). The total effect of blockchain usage on participants' willingness to donate (Estimate = 0.690, p < .001) consists of the direct effect (Estimate = 0.189, p < .001) and the indirect effects of transparency and trust (Estimate = 0.501, p < .001). Furthermore, consumers' affinity for technology moderates the direct effect of blockchain usage on participants' willingness to donate (Estimate = 0.150, p < .001). Donating via blockchain is a promising way for charities to engage consumers for several reasons: (1) Charities can emphasize trust and transparency in their advertising campaigns. (2) Established charities can target new customer segments by specifically engaging technology-affine consumers. (3) Charities can raise international funds without previous barriers (e.g., setting up bank accounts). Nevertheless, increased transparency can also backfire (e.g., through the disclosure of costs). Such cases require further research.Keywords: blockchain, social sector, transparency, trust
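To make the reported statistics concrete, here is a minimal sketch of how a between-groups mean difference and Cohen's d could be computed with SciPy; the simulated arrays are placeholders, not the study's dataset.

```python
# Minimal sketch: Welch's t-test and Cohen's d for a two-group comparison,
# as used in scenario-based experiments like the one above. The data are
# placeholders, not the study's 3192-participant dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(4.2, 1.4, 1405)     # willingness to donate, traditional scenario
blockchain = rng.normal(4.9, 1.4, 1787)  # willingness to donate, blockchain scenario

t, p = stats.ttest_ind(blockchain, control, equal_var=False)  # Welch's t-test

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation of the two groups."""
    pooled = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                     / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled

print(f"t = {t:.2f}, p = {p:.4f}, d = {cohens_d(blockchain, control):.2f}")
```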
Procedia PDF Downloads 10122 Ankle Fracture Management: A Unique Cross Departmental Quality Improvement Project
Authors: Langhit Kurar, Loren Charles
Abstract:
Introduction: In light of the recently published BOAST 12 guidance (August 2016) on the management of ankle fractures, this project aimed to highlight key discrepancies throughout the care trajectory, from admission to the point of discharge, at a district general hospital. A wide breadth of data covering three key domains (accident and emergency, radiology, and orthopaedic surgery) was subsequently stratified, and recommendations on note documentation and outpatient follow-up were made. Methods: A retrospective twelve-month audit was conducted reviewing the results of ankle fracture management in 37 patients. The inclusion criterion was all patients seen at Darent Valley Hospital (DVH) emergency department with radiographic evidence of an ankle fracture. The exclusion criteria were all patients managed solely by nursing staff or having sustained a purely ligamentous injury. Medical notes, including discharge summaries, and the PACS online radiographic tool were used for data extraction. Results: Cross-examination of the A & E domain revealed limited awareness of the recent BOAST 12 publication, including the requirements to document skin integrity and neurovascular assessment. This had direct implications, as it would have changed the surgical plan for acutely compromised patients. The majority of results obtained from the radiographic domain were satisfactory, with appropriate X-rays taken in over 95% of cases. However, due to time pressures within A & E, patients were often left in a backslab without a post-manipulation X-ray. Poorly reduced fractures were subsequently left for a long period, resulting in swollen ankles and a time-dependent lag to surgical intervention. This had knock-on implications for prolonged inpatient stay, resulting in hospital-acquired co-morbidity, including pressure sores. Discussion: The audit has highlighted several areas for improvement throughout the disease trajectory, from review in the emergency department to follow-up as an outpatient. This has prompted the creation of an algorithm to ensure patients with significant fractures presenting to the emergency department are seen promptly and treatment is expedited as per recent guidance. This includes the timing of X-rays taken in A & E. Re-audit has shown significant improvement in both documentation at the time of presentation and appropriate follow-up strategies. Within the orthopaedic domain, we are in the process of creating an ankle fracture pathway to ensure imaging and weight-bearing status are made clear to the consulting clinicians in an outpatient setting. Significance/Clinical Relevance: Through the ankle fracture algorithm, we have adapted the BOAST 12 guidance to shape an intrinsic pathway that not only improves patient management within the emergency department but also creates a standardised format for follow-up.Keywords: ankle, fracture, BOAST, radiology
Procedia PDF Downloads 18021 How Whatsappization of the Chatbot Affects User Satisfaction, Trust, and Acceptance in a Drive-Sharing Task
Authors: Nirit Gavish, Rotem Halutz, Liad Neta
Abstract:
Nowadays, chatbots are gaining more and more attention due to the advent of large language models. One of the important considerations in chatbot design is how to create an interface that achieves high user satisfaction, trust, and acceptance. Since WhatsApp conversations sometimes substitute for face-to-face communication, we studied whether WhatsAppization of the chatbot (making the conversation resemble a WhatsApp conversation more closely) would improve user satisfaction, trust, and acceptance, or whether the opposite would occur due to the Uncanny Valley (UV) effect. The task was a drive-sharing task, in which participants communicated with a textual chatbot via WhatsApp and could decide whether to participate in a ride to college with a driver suggested by the chatbot. WhatsAppization of the chatbot was done in two ways: by a dialog-style conversation (Dialog versus No Dialog), and by adding the WhatsApp indicators "Last Seen", "Connected", "Read Receipts", and "Typing…" (Indicators versus No Indicators). Our 120 participants were randomly assigned to one of the four 2-by-2 design groups, with 30 participants in each. They interacted with the WhatsApp chatbot and then filled out a questionnaire. The results demonstrated that, as expected from the manipulation, the interaction with the chatbot was longer in the Dialog condition than in the No Dialog condition. This extra interaction, however, did not lead to higher acceptance; quite the opposite, since participants in the Dialog condition were less willing to implement the decision made at the end of the conversation with the chatbot and continue the interaction with the driver they chose. The results are even more striking for the Indicators condition. For both the satisfaction measures and the trust measures, participants' ratings were lower in the Indicators condition than in the No Indicators condition. Participants in the Indicators condition felt that the ride-search process was harder to operate and slower (even though the actual interaction time was similar). They were less convinced that the chatbot suggested real trips, and they placed less trust in the person offering the ride and referred to by the chatbot. These effects were more evident for participants who preferred to share their rides using WhatsApp than for participants who preferred chatbots for that purpose. Considering our findings, we can say that the WhatsAppization of the chatbot was detrimental. This is true for both WhatsAppization methods: making the conversation more of a dialog and adding WhatsApp indicators. For the chosen drive-sharing task, the results were, in addition to lower satisfaction, less trust in the chatbot's suggestion and even in the driver suggested by the chatbot, and lower willingness to actually undertake the suggested ride. In addition, it seems that the most problematic WhatsAppization method was the use of WhatsApp's indicators during the interaction with the chatbot. The current study suggests that a conversation with an artificial agent should not imitate a WhatsApp conversation too closely. With the proliferation of WhatsApp use, the emotional and social aspects of face-to-face communication are moving to WhatsApp communication. Based on the current study's findings, it is possible that the UV effect also occurs in the WhatsAppization, and not only in the humanization, of the chatbot, with a similar feeling of eeriness, and that it is more pronounced for people who prefer to use WhatsApp over chatbots.
The current research can serve as a starting point for studying the very interesting and important topic of chatbot WhatsAppization. More methods of WhatsAppization and other tasks could be the focus of further studies.Keywords: chatbot, WhatsApp, humanization, Uncanny Valley, drive sharing
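Since the study uses a 2-by-2 between-subjects design, a standard way to test the two WhatsAppization factors and their interaction is a two-way ANOVA. The sketch below shows that analysis with statsmodels on placeholder data; the column names, group sizes, and simulated effect are assumptions, not the study's data.

```python
# Sketch of a two-way ANOVA for a 2x2 between-subjects design (Dialog x Indicators),
# on placeholder data; the actual study data and variable names are not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(42)
rows = []
for dialog in (0, 1):
    for indicators in (0, 1):
        # Invented effect: indicators lower satisfaction slightly; 30 participants per cell.
        satisfaction = rng.normal(5.0 - 0.6 * indicators, 1.0, 30)
        rows += [{"dialog": dialog, "indicators": indicators, "satisfaction": s}
                 for s in satisfaction]
df = pd.DataFrame(rows)

model = smf.ols("satisfaction ~ C(dialog) * C(indicators)", data=df).fit()
print(anova_lm(model, typ=2))  # main effects of each factor plus their interaction
```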
Procedia PDF Downloads 5020 Characteristics of Female Offenders: Using Childhood Victimization Model for Treatment
Authors: Jane E. Hill
Abstract:
Sexual, physical, and emotional abuse are behaviors used by one person in a relationship or within a family unit to control the other person. Physical abuse can consist of, but is not limited to, hitting, pushing, and shoving. Sexual abuse is unwanted or forced sexual activity imposed on a person without their consent. Abusive behaviors include intimidation, manipulation, humiliation, isolation, frightening, terrorizing, coercing, threatening, blaming, hurting, injuring, or wounding another individual. Although emotional, psychological, and financial abuse are not criminal behaviors, they are forms of abuse and can leave emotional scars on their victims. The purpose of this literature review was to examine the characteristics of female offenders, their past abuse, and their pathways to offending. The question that guided this research was: does past abuse influence recidivism? The theoretical foundation used was relational theory by Jean Baker Miller. One common feature of female offenders is a history of abuse (sexual, physical, or verbal). Abuse can cause mental illness and substance abuse. The abuse does not directly affect the women's recidivism. However, the results indicated that the psychological symptoms and maladaptive behaviors resulting from the abuse did contribute to indirect pathways to continued offending. The female offenders' ongoing symptoms of depression and anxiety and their engagement in substance abuse (self-medicating) did lead to the women's incarceration. Using the childhood victimization model as the treatment approach for women's mental illness and substance abuse disorders that result from a history of child abuse has shown success. With that in mind, if issues surrounding early victimization are not addressed, then the women offenders may not recover from their mental illness or addiction and are at a higher risk of reoffending. However, if the women are not emotionally ready to engage in the treatment process, then it should not be forced onto them, because it may cause harm (by targeting prior traumatic experiences). Social capital refers to family support and resources that assist the individual with education and employment opportunities that can lead to success. Human capital refers to internal knowledge, skills, and capacities that help the individual act in new and appropriate ways. The lack of human and social capital is common among female offenders, and it leads to extreme poverty and economic marginalization more often and more severely than among men. In addition, changes in welfare reform have exacerbated women's difficulties in gaining adequately paying jobs to support themselves and their children, which has contributed to female offenders reoffending. With that in mind, one way to lower the risk of female offenders reoffending is to provide them with educational and vocational training, enhance their self-efficacy, and teach them appropriate coping skills and life skills. Furthermore, it is important to strengthen family bonds and support. Having a supportive family relationship was a statistically significant protective factor for women offenders.Keywords: characteristics, childhood victimization model, female offenders, treatment
Procedia PDF Downloads 11319 Glucose Measurement in Response to Environmental and Physiological Challenges: Towards a Non-Invasive Approach to Study Stress in Fishes
Authors: Tomas Makaras, Julija Razumienė, Vidutė Gurevičienė, Gintarė Sauliutė, Milda Stankevičiūtė
Abstract:
Stress responses represent an animal's natural reactions to various challenging conditions and can be used as a welfare indicator. Despite the wide use of glucose measurements in stress evaluation, there are some inconsistencies in their acceptance as a stress marker, especially in comparison with non-invasive cortisol measurements in fish undergoing stress. To meet this challenge and test the reliability and applicability of glucose measurement in practice, different environmental/anthropogenic exposure scenarios were simulated in this study to provoke chemical-induced stress in fish (a 14-day exposure to landfill leachate), followed by a 14-day stress recovery period; under the cumulative effect of the leachate, fish were subsequently exposed to the pathogenic oomycete Saprolegnia parasitica to represent a possible infection. This pathogen is endemic to freshwater habitats worldwide and is partly responsible for the decline of natural freshwater fish populations. Brown trout (Salmo trutta fario) and sea trout (Salmo trutta trutta) juveniles were chosen because a large body of literature on physiological stress responses in these species exists. Glucose content in fish was analysed by applying invasive and non-invasive glucose measurement procedures to different test media, namely fish blood, gill tissue and fish-holding water. The results indicated that the quantity of glucose released into the holding water of stressed fish increased considerably (approx. 3.5- to 8-fold) and remained substantially higher than the control level (approx. 2- to 4-fold) throughout the stress recovery period, suggesting that the fish did not recover from chemical-induced stress. The circulating levels of glucose in blood and gills decreased over time in fish exposed to the different stressors. However, the gill glucose level in exposed fish showed a decrease similar to the control levels measured at the same time points, and this difference was found to be insignificant. The data analysis showed that concentrations of β-D-glucose measured in the gills of fish treated with S. parasitica differed significantly from the control recovery group but did not differ from the leachate recovery group, showing that the presence of S. parasitica in the water had no additive effect. In contrast, a positive correlation between blood and gill glucose was determined. Parallel trends in blood and water glucose changes suggest that water glucose measurement has strong potential for predicting stress. This study demonstrated that measuring β-D-glucose in fish-holding water is not stressful, as it involves no handling or manipulation of the organism, and that it has important technical advantages over current (invasive) methods, which mainly use blood samples or specific tissues. The quantification of glucose could be essential for stress physiology and aquaculture studies interested in the assessment or long-term monitoring of fish health.Keywords: brown trout, landfill leachate, sea trout, pathogenic oomycetes, β-D-glucose
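As an illustration of the two quantities the abstract relies on (fold change in water glucose relative to controls, and the blood-gill glucose correlation), the snippet below computes both with NumPy and SciPy on invented measurements; none of the values are the study's data.

```python
# Illustrative computation of fold change and Pearson correlation for glucose data.
# The measurement values below are invented; they are not the study's data.
import numpy as np
from scipy.stats import pearsonr

water_control = np.array([0.8, 0.9, 1.0, 0.85])   # glucose in holding water, control fish
water_stressed = np.array([3.1, 4.0, 3.6, 3.3])   # glucose in holding water, exposed fish
fold_change = water_stressed.mean() / water_control.mean()
print(f"fold change in water glucose: {fold_change:.1f}x")

blood = np.array([4.1, 5.2, 3.8, 6.0, 4.7])        # blood glucose per fish
gill = np.array([1.0, 1.3, 0.9, 1.5, 1.1])         # gill glucose, same fish
r, p = pearsonr(blood, gill)
print(f"blood-gill correlation: r = {r:.2f}, p = {p:.3f}")
```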
Procedia PDF Downloads 175