Search results for: cover image
548 Electronic Physical Activity Record (EPAR): Key for Data Driven Physical Activity Healthcare Services
Authors: Rishi Kanth Saripalle
Abstract:
Medical experts highly recommend including physical activity in everyone's daily routine, irrespective of gender or age, as it helps to improve various medical issues or curb potential ones. Simultaneously, experts are diligently trying to provide various healthcare services (interventions, plans, exercise routines, etc.) to promote healthy living and increase physical activity within ever more hectic schedules. With the introduction of wearables, individuals can track, analyze, and visualize their daily physical activity. However, there is no commonly agreed standard for representing, gathering, aggregating, and analyzing an individual's physical activity data from multiple disparate sources (exercise plans, multiple wearables, etc.). This makes it highly impractical to develop data-driven physical activity applications and healthcare programs. Further, the inability to integrate physical activity data into an individual's Electronic Health Record, to provide a holistic image of that individual's health, still eludes the experts. This article identifies three primary reasons for this issue. First, there is no agreed standard, both structural and semantic, for representing and sharing physical activity data across disparate systems. Second, various organizations (e.g., LA Fitness, Gold's Gym, etc.) and research-backed interventions and programs still primarily rely on paper or unstructured formats (such as text or notes) to keep track of the data generated by physical activities. Finally, most wearable devices operate in silos. This article identifies the underlying problem, explores the idea of reusing existing standards, and identifies the essential modules required to move forward.
Keywords: electronic physical activity record, physical activity in EHR EIM, tracking physical activity data, physical activity data standards
Procedia PDF Downloads 284
547 A Multimodal Measurement Approach Using Narratives and Eye Tracking to Investigate Visual Behaviour in Perceiving Naturalistic and Urban Environments
Authors: Khizar Z. Choudhrya, Richard Coles, Salman Qureshi, Robert Ashford, Salim Khan, Rabia R. Mir
Abstract:
The majority of existing landscape research has been derived from heuristic evaluations, without empirical insight into real participants' visual responses. In this research, a modern multimodal measurement approach (using narratives and eye tracking) was applied to investigate visual behaviour in perceiving naturalistic and urban environments. This research is unique in exploring gaze behaviour on environmental images possessing different levels of saliency, since eye behaviour is predominantly attracted to salient locations. The methodology applied to naturalistic and urban environments is drawn from approaches in market research that examine visual responses and qualities; borrowing these methodologies provided a critical and hitherto unexplored approach. The research combined quantitative and qualitative methods. On the whole, the results corroborated existing landscape research findings, but they also identified potential refinements. The research contributes both methodologically and empirically to human-environment interaction (HEI). This study focused on initial impressions of environmental images with the help of eye tracking, exploring the factors that influence initial fixations in relation to expectations and preferences. Among the key findings, each participant exhibited a unique navigation style while moving through different elements of the landscape images; this individual navigation style is termed a 'visual signature'. This study adds the clarity needed to complete the picture and offers insight for future landscape researchers.
Keywords: human-environment interaction (HEI), multimodal measurement, narratives, eye tracking
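The kind of gaze analysis described here, relating fixations to salient image regions, can be sketched as counting fixations per area of interest (AOI). The fixation coordinates, AOI names, and bounding boxes below are purely illustrative assumptions, not data from the study:

```python
# Count eye-tracking fixations falling inside labelled areas of interest (AOIs).
# All coordinates and AOI boxes below are hypothetical illustrations.

def count_fixations_per_aoi(fixations, aois):
    """fixations: list of (x, y) gaze points; aois: dict name -> (x0, y0, x1, y1)."""
    counts = {name: 0 for name in aois}
    for x, y in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
    return counts

aois = {"water_feature": (0, 0, 100, 100), "building": (150, 0, 300, 120)}
fixations = [(20, 30), (80, 90), (200, 50), (400, 300)]
print(count_fixations_per_aoi(fixations, aois))  # {'water_feature': 2, 'building': 1}
```

In a real analysis the AOIs would be derived from a saliency map rather than hand-drawn boxes.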
Procedia PDF Downloads 339
546 Use of Extended Conversation to Boost Vocabulary Knowledge and Soft Skills in English for Employment Classes
Authors: James G. Matthew, Seonmin Huh, Frank X. Bennett
Abstract:
English for Specific Purposes, ESP, aims to equip learners with the necessary English language skills. Many ESP programs address language skills for job performance, including reading job-related documents and oral proficiency. Within ESP is English for Occupational Purposes, EOP, which centers on developing communicative competence for the globalized workplace. Many ESP and EOP courses lack the content needed to help students progress at work, resulting in the need to create lexical compilations for different professions. It is important to teach communicative competence and soft skills for real job-related problem situations and to address the complexities of the real world, so as to help students succeed in their professions. ESP and EOP research therefore tries to balance profession-specific educational content with international, multi-disciplinary language skills for the globalized workforce. The current study builds upon the existing discussion by developing pedagogy to assist students in their careers through a strong practical command of relevant English vocabulary. Our research question focuses on the pedagogy two professors incorporated into their English for employment courses. The current study is a qualitative case study of the modes of teaching delivery for EOP in South Korea. Two foreign professors teaching at two different universities in South Korea volunteered for the study to explore their teaching practices. Both professors' curriculums included employment-related concept vocabulary, business presentations, CV/resume and cover letter preparation, and job interview preparation. All the pre-made recorded video lectures, live online class sessions with students, teachers' lesson plans, teachers' class materials, students' assignments, and midterm and final video conferences were collected for data analysis. The study then focused on unpacking representative patterns in the professors' teaching methods.
The professors used their strengths as native speakers to extend class discussion from narrow, restricted conversations to broader opportunities for students to practice authentic English conversation. The teaching method used three main steps to extend the conversation. Firstly, students were taught concept vocabulary. Secondly, the vocabulary was combined in speaking activities where students had to solve scenarios and were required to expand on the given word forms and language expressions. Lastly, the students held conversations in English using the language learnt. The conversations observed in both classes were authentic, expanded English communication, and this way of developing concept vocabulary lessons into extended conversation was a representative pedagogical approach shared by both professors. Extended English conversation, therefore, is crucial for EOP education.
Keywords: concept vocabulary, english as a foreign language, english for employment, extended conversation
Procedia PDF Downloads 92
545 Multi-source Question Answering Framework Using Transformers for Attribute Extraction
Authors: Prashanth Pillai, Purnaprajna Mangsuli
Abstract:
Oil exploration and production companies invest considerable time and effort to extract essential well attributes (like well status, surface and target coordinates, wellbore depths, event timelines, etc.) from unstructured data sources like technical reports, which are often non-standardized, multimodal, and highly domain-specific by nature. It is also important to consider the context when extracting attribute values from reports that contain information on multiple wells/wellbores. Moreover, semantically similar information may often be depicted in different data syntax representations across multiple pages and document sources. We propose a hierarchical multi-source fact extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture was used to rank relevant pages in a document source, utilizing page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer was used to extract attribute-value pairs, incorporating text semantics and layout information from the top relevant pages in a document. To better handle context while dealing with multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The extracted attribute information from various pages and documents is standardized to a common representation using a parser module to facilitate information comparison and aggregation. Finally, we use a probabilistic approach to fuse information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and related performance was studied on several real-life well technical reports.
Keywords: natural language processing, deep learning, transformers, information retrieval
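The retrieval step described above, ranking pages by embedding similarity to a query, can be sketched with cosine similarity. The tiny 3-dimensional vectors below stand in for the transformer page/query embeddings and are purely illustrative:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def rank_pages(query_vec, page_vecs):
    """Return (page_id, score) pairs sorted by descending similarity to the query."""
    scored = [(pid, cosine(query_vec, v)) for pid, v in page_vecs.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)

# Hypothetical embeddings; real ones would come from a transformer encoder.
pages = {"p1": [0.9, 0.1, 0.0], "p2": [0.1, 0.9, 0.2], "p3": [0.8, 0.2, 0.1]}
query = [1.0, 0.0, 0.0]
print(rank_pages(query, pages))  # p1 ranks first, then p3, then p2
```

In the actual workflow the scores from image and text embeddings would be combined before ranking; this sketch shows only the single-embedding case.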
Procedia PDF Downloads 193
544 Averting a Financial Crisis through Regulation, Including Legislation
Authors: Maria Krambia-Kapardis, Andreas Kapardis
Abstract:
The paper discusses regulatory and legislative measures implemented by various nations in an effort to avert another financial crisis. More specifically, to address the financial crisis, the European Commission followed the practice of other developed countries and implemented a European Economic Recovery Plan in an attempt to overhaul the regulatory and supervisory framework of the financial sector. In 2010 the Commission introduced the European Systemic Risk Board and in 2011 the European System of Financial Supervision. Some experts have argued that the type and extent of financial regulation introduced in Europe in the wake of the 2008 crisis has been excessive and counterproductive. In considering how different countries responded to the financial crisis, global regulators have shown a more focused commitment to combat industry misconduct and to pre-empt abusive behavior. Regulators have also increased the funding and resources at their disposal; have increased regulatory fines, with an increasing trend towards action against individuals; and, finally, have focused on market abuse and market conduct issues. Financial regulation can be effected, first of all, through legislation. However, neither ex ante nor ex post regulation is by itself effective in reducing systemic risk. Consequently, to avert a financial crisis, in their endeavor to achieve both economic efficiency and financial stability, governments need to balance the two approaches to financial regulation. Fiduciary duty is another means by which the behavior of actors in the financial world is constrained and, thus, regulated. Furthermore, fiduciary duties extend over and above other existing requirements set out by statute and/or common law and cover allegations of breach of fiduciary duty, negligence, or fraud. Careful analysis of the etiology of the 2008 financial crisis demonstrates the great importance of corporate governance as a way of regulating boardroom behavior.
In addition, the regulation of professions, including accountants and auditors, plays a crucial role as far as the financial management of companies is concerned. In the US, the Sarbanes-Oxley Act of 2002 established the Public Company Accounting Oversight Board in order to protect investors from financial accounting fraud. In most countries around the world, however, accounting regulation consists of a legal framework, international standards, education, and licensure. Accounting regulation is necessary because of the information asymmetry and the conflict of interest that exist between managers and users of financial information. If a holistic approach is to be taken, one cannot ignore the regulation of legislators themselves, which can take the form of hard or soft legislation. The science of averting a financial crisis is yet to be perfected and, as the preceding discussion shows, this is unlikely to be achieved in the foreseeable future, as 'disaster myopia' may be reduced but will not be eliminated. It is easier, of course, to be wise in hindsight, and regulating unreasonably risky decisions and unethical or outright criminal behavior in the financial world remains a major challenge for governments, corporations, and professions alike.
Keywords: financial crisis, legislation, regulation, financial regulation
Procedia PDF Downloads 400
543 Social Networks in a Communication Strategy of a Large Company
Authors: Kherbache Mehdi
Abstract:
Within the framework of validating the Master in Business Administration (marketing and sales) at the INSIM international institute in management, Blida, we had the opportunity to complete a professional internship at the Sonelgaz enterprise and a thesis. The thesis deals with the integration of social networking into the communication strategy of a company. The research question is: how can communicating through social networks be a solution for companies? The challenge addressed by this thesis was to suggest limits and recommendations to Sonelgaz concerning social networks. Social networks collectively represent more than a billion people as a potential target for companies. Through research and a qualitative approach, we identified three valid hypotheses. The first hypothesis confirms that no company can ignore social networks in its communication strategy. The second hypothesis demonstrates that it is necessary to prepare a strategy that integrates social networks into the company's communication plan. The risk of this strategy is very limited: failure on social networks is not a major setback for the enterprise, social networking is not expensive, and any resulting damage to the company's image matters little in the long term; the return on investment, however, is difficult to evaluate. Finally, the last hypothesis shows that firms establish a new relationship between consumers and brands thanks to the proximity allowed by social networks. After validating the hypotheses, we suggested some recommendations to Sonelgaz regarding communication through social networks. Firstly, the company must use the interactivity of social networks in order to have fruitful exchanges with the community. We also recommended having a strategy for handling negative comments. The company should also deliver resources to the community through a community manager, in order to maintain a good relationship with the community.
Furthermore, we advised using social networks for business intelligence. Sonelgaz can create creative and interactive content, for example through engaging applications on Facebook. Finally, we recommended that the company not be intrusive with "fans" or "followers" and remain open to all platforms: Twitter, Facebook, and LinkedIn, for example.
Keywords: social network, buzz, communication, consumer, return on investment, internet users, web 2.0, Facebook, Twitter, interaction
Procedia PDF Downloads 424
542 An Audit on the Quality of Pre-Operative Intra-Oral Digital Radiographs Taken for Dental Extractions in a General Practice Setting
Authors: Gabrielle O'Donoghue
Abstract:
Background: Pre-operative radiographs facilitate assessment and treatment planning in minor oral surgery. Quality assurance for dental radiography advocates the As Low As Reasonably Achievable (ALARA) principle in collecting accurate diagnostic information. Aims: To audit the quality of digital intraoral periapicals (IOPAs) taken prior to dental extractions in a metropolitan general dental practice setting. Standards: The National Radiological Protection Board (NRPB) guidance outlines three grades of radiograph quality: excellent (Grade 1, > 70% of total exposures), diagnostically acceptable (Grade 2, < 20%), and unacceptable (Grade 3, < 10%). Methodology: A study of pre-operative radiographs taken prior to dental extractions across 12 private general dental practices in a large metropolitan area, by 44 practitioners. A total of 725 extractions were assessed, allowing 258 IOPAs to be reviewed in one audit cycle. Results: First cycle: of 258 IOPAs, 223 (86.4%) scored Grade 1, 27 (10.5%) Grade 2, and 8 (3.1%) Grade 3. The standard was met. 35 dental extractions were performed without an available pre-operative radiograph. Action Plan & Recommendations: Results were distributed to all staff, and a continuing professional development evening was organized to outline recommendations for improving image quality. A second audit cycle is proposed at a six-month interval to review the recommendations and appraise the results. Conclusion: The overall standard of radiographs met the published guidelines. A significant reduction in the number of procedures undertaken without pre-operative imaging is expected at the six-month interval. An investigation into non-diagnostic imaging and associated adverse patient outcomes is being considered. Maintenance of the standards achieved is anticipated in the second audit cycle to ensure consistently high-quality imaging.
Keywords: audit, oral radiology, oral surgery, periapical radiographs, quality assurance
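The first-cycle figures can be checked against the NRPB targets with simple arithmetic; the counts below are those reported in the abstract:

```python
# Grade counts from the first audit cycle (258 IOPAs total).
counts = {"grade1": 223, "grade2": 27, "grade3": 8}
total = sum(counts.values())

# Percentage of exposures at each grade, to one decimal place.
pct = {g: round(100 * n / total, 1) for g, n in counts.items()}
print(pct)  # {'grade1': 86.4, 'grade2': 10.5, 'grade3': 3.1}

# NRPB targets: Grade 1 > 70%, Grade 2 < 20%, Grade 3 < 10%.
meets = pct["grade1"] > 70 and pct["grade2"] < 20 and pct["grade3"] < 10
print("Standard met:", meets)  # Standard met: True
```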
Procedia PDF Downloads 166
541 Relationship between Gully Development and Characteristics of Drainage Area in Semi-Arid Region, NW Iran
Authors: Ali Reza Vaezi, Ouldouz Bakhshi Rad
Abstract:
Gully erosion is a widespread and often dramatic form of soil erosion caused by water during and immediately after heavy rainfall. It occurs when flowing surface water is channelled across unprotected land and washes away the soil along the drainage lines. The formation of gullies is influenced by various factors, including climate, drainage surface area, slope gradient, vegetation cover, land use, and soil properties. Gully erosion is a very important problem in semi-arid regions, where soils are low in organic matter and weakly aggregated, and where intensive agriculture and tillage along the slope can accelerate soil erosion by water. There is little information on the development of gully erosion in rainfed agricultural areas. Therefore, this study was carried out to investigate the relationship between gully erosion and the morphometric characteristics of the drainage area, and the effects of soil properties and soil management factors (land use and tillage method) on gully development. A field study was done in a 900 km² agricultural area in Hshtroud township, located in the south of East Azarbijan province, NW Iran. In total, 222 gullies formed in rainfed lands were found in the area. Properties of the gullies, consisting of length, width, depth, height difference, cross-section area, and volume, were determined. The drainage area of each gully, or group of gullies, was determined and its boundary drawn. Additionally, the surface area of each drainage, its land use, tillage direction, and the soil properties that may affect gully formation were determined. The soil erodibility factor (K) defined in the Universal Soil Loss Equation (USLE) was estimated from five soil properties (silt and very fine sand, coarse sand, organic matter, soil structure code, and soil permeability). Gully development in each drainage area was quantified using its volume and soil loss.
The dependency of gully development on drainage area characteristics (surface area, land use, tillage direction, and soil properties) was determined using correlation matrix analysis. Based on the results, gully length was the most important morphometric characteristic indicating the development of gully erosion. Gully development in the area was related to slope gradient (r = -0.26), surface area (r = 0.71), the area of rainfed lands (r = 0.23), and the area of rainfed land tilled along the slope (r = 0.24). Nevertheless, its correlation with the area of pasture and with the soil erodibility factor (K) was not significant. Among the characteristics of the drainage area, surface area is the major factor controlling gully volume in the agricultural land. No significant correlation was found between gully erosion and the soil erodibility factor (K) estimated by the USLE; it seems the estimated soil erodibility cannot describe the susceptibility of the study soils to the gully erosion process. In these soils, aggregate stability and soil permeability are the two physical properties that affect the actual soil erodibility, and in consequence these properties can control gully erosion in the rainfed lands.
Keywords: agricultural area, gully properties, soil structure, USLE
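The correlation matrix analysis above rests on the Pearson coefficient relating gully development to each drainage characteristic. A minimal sketch follows; the toy area and volume values are invented for illustration and are not the study's measurements:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical drainage surface areas (ha) and gully volumes (m^3).
area = [1.2, 2.5, 3.1, 4.8, 6.0]
volume = [15.0, 30.0, 34.0, 55.0, 70.0]
print(round(pearson_r(area, volume), 3))
```

A full correlation matrix would apply this pairwise to every drainage characteristic against gully volume and soil loss.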
Procedia PDF Downloads 78
540 Metamaterial Lenses for Microwave Cancer Hyperthermia Treatment
Authors: Akram Boubakri, Fethi Choubani, Tan Hoa Vuong, Jacques David
Abstract:
Nowadays, microwave hyperthermia is considered an effective treatment for malignant tumors. This microwave treatment, which can substitute for chemotherapy or surgical intervention, enables in-depth tumor heating without harming the healthy tissue. The technique requires a high-precision system in order to concentrate the heating in the tumor alone, without heating any surrounding healthy tissue. In hyperthermia treatment, the temperature in the cancerous area is typically raised above 42 °C and maintained for one hour in order to destroy the tumor, whilst in the surrounding healthy tissues the temperature is kept below 42 °C to avoid damage. Metamaterial lenses are widely used in medical applications such as microwave hyperthermia. They enable sub-diffraction resolution thanks to the amplification of evanescent waves, and they can focus electromagnetic waves from a point source to a point image. Metasurfaces have been used to build metamaterial lenses; their main mechanical advantages over three-dimensional material structures are ease of fabrication and a smaller required volume. In this work, we propose a metasurface-based lens operating at 6 GHz and designed for microwave hyperthermia. The lens was applied and showed good results in focusing and heating a tumor inside breast tissue, with the temperature raised and maintained above 42 °C. The tumor was placed at the focal distance of the lens so that only the tumor tissue would be heated. Finally, it has been shown that the hyperthermia area within the tissue can be carefully adjusted by moving the antennas or by changing the thickness of the metamaterial lenses based on the tumor position.
Even though the simulations performed in this work assume an ideal case, realistic tissue characteristics can be incorporated to improve the results in a realistic model.
Keywords: focusing, hyperthermia, metamaterial lenses, metasurface, microwave treatment
Procedia PDF Downloads 227
539 Study on Changes of Land Use impacting the Process of Urbanization, by Using Landsat Data in African Regions: A Case Study in Kigali, Rwanda
Authors: Delphine Mukaneza, Lin Qiao, Wang Pengxin, Li Yan, Chen Yingyi
Abstract:
Human activities on land make land cover gradually change or transition. In this study, we examined the use of Landsat TM data to detect land use change in Kigali between 1987 and 2009, using remote sensing techniques and analysis in ENVI and the GIS software ArcGIS. Six categories of land use were distinguished: bare soil, built-up land, wetland, water, vegetation, and others. With remote sensing techniques, we analyzed land use data for 1987, 1999, and 2009; changed areas were identified, revealing a dynamic land use situation in Kigali city over the 22 years studied. Based on the relevant Landsat data, the research focused on land use change in accordance with the role of remote sensing in the process of urbanization. The results show a rapid increase of built-up land between 1987 and 1999 and a large decrease in vegetation caused by the rebuilding of the city after the 1994 genocide. In the period from 1999 to 2009, there was a reduction in both built-up land and vegetation: after the Kigali city authority established a Master Plan, all constructions outside its scope were demolished. Through the expansion of its urban area, Rwanda's capital, Kigali City, is increasing internal employment and attracting business investors and the service sector to improve the economy, which will accommodate population growth and provide a better life. The overall planning of the city of Kigali considers the environment, land use, infrastructure, cultural and socio-economic factors, economic development and population forecasts, urban development, and constraint specification. To achieve this purpose, the Government has set out, within the overall plan for Kigali, staged descriptions of the design, strategy, and action plan to guide Kigali planners and members of the public toward more detailed regional plans and practical measures.
Thus, land use change significantly reflects human activity in Kigali, and it plays an important role in informing national decisions. Another aspect to take into account is the natural situation of Kigali city: agriculture in the region does not occupy a dominant position, and with population growth and socio-economic development, the built-up area will gradually expand and speed up the process of urbanization. As a developing country, Rwanda has a continuously growing population, a low rate of land utilization, and still-low urbanization. As mentioned earlier, the 1994 genocide massacres, population growth, and urbanization processes have been the factors driving the dramatic changes in land use. Further research would focus on analyzing Rwanda's natural resources and the social and economic factors that could be the driving forces of land use change.
Keywords: land use change, urbanization, Kigali City, Landsat
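Change detection between two classified scenes, as described in this study, amounts to cross-tabulating per-pixel class labels between dates. A minimal sketch on a tiny hypothetical map follows (the study itself used ENVI and ArcGIS on full Landsat scenes; the maps and class codes below are illustrative only):

```python
from collections import Counter

def transition_matrix(before, after):
    """Count per-pixel class transitions between two classified maps (flat lists)."""
    return Counter(zip(before, after))

# Hypothetical 3x3 classified maps; classes abbreviated as in the study's
# categories: vegetation (V), built-up land (B), bare soil (S).
map_1987 = ["V", "V", "S", "V", "S", "B", "V", "V", "B"]
map_1999 = ["B", "V", "B", "B", "S", "B", "V", "B", "B"]

changes = transition_matrix(map_1987, map_1999)
print(changes[("V", "B")])  # pixels converted from vegetation to built-up land
```

Summing each row or column of this matrix gives per-class area totals for each date, which is how change statistics between 1987, 1999, and 2009 would be tabulated.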
Procedia PDF Downloads 309
538 Uncanny Orania: White Complicity as the Abject of the Discursive Construction of Racism
Authors: Daphne Fietz
Abstract:
This paper builds on a reflection on an autobiographical experience of uncanniness during fieldwork in the white Afrikaner settlement Orania in South Africa. Drawing on Kristeva's theory of abjection to establish a theory of Whiteness based on boundary threats, it is argued that the uncanny experience, as the emergence of the abject, points to a moment of crisis of the author's Whiteness. The emanating abject directs the author to her closeness or convergence with Orania's inhabitants, that is, a reciprocity based on mutual Whiteness. The experienced confluence appeals to the author's White complicity in racism. With recourse to Butler's theory of subjectivation, the abject, White complicity, inhabits both the outside of a discourse on racism and of the 'self', as 'I' establish myself in relation to discourse. In this view, the qualities of the experienced abject are linked to the abject of discourse on racism, or, in other words, its frames of intelligibility. It then becomes clear that discourse on (overt) racism functions as a necessary counter-image through which White morality is established instead of questioned, because here, by White reasoning, the abject of complicity in racism is successfully repressed, curbed, as completely impossible in the binary construction. Hence, such discourse enables the preservation of racism in its pre-discursive and structural forms as long as its critique does not encompass its own location and performance in discourse. Discourse on overt racism is indispensable to White ignorance as it covers underlying racism and pre-empts further critique. This understanding directs us towards a form of critique which necessitates self-reflection, uncertainty, and vigilance, referred to here as a discourse of relationality.
Such a discourse diverges from the presumption of a detached author as a point of reference and instead departs from attachment, dependence, and mutuality, embracing the visceral as a resource for knowledge of relationality. A discourse of relationality points to another possibility of White engagement with Whiteness and racism, and further promotes a conception of responsibility that allows for and highlights dispossession and relationality, in contrast to single agency and guilt.
Keywords: abjection, discourse, relationality, the visceral, whiteness
Procedia PDF Downloads 158
537 Tool for Maxillary Sinus Quantification in Computed Tomography Exams
Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina
Abstract:
The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, to heat or humidify inspired air, for thermoregulation, to impart resonance to the voice, among other roles; thus, the real function of the MS is still uncertain. Furthermore, MS anatomy is complex and varies from person to person, and many diseases may affect the development of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, enabling quantitative analysis. However, this is not always possible in the clinical routine, and when possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic, and manual methods present inter- and intra-individual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors' institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) with features such as pixel value, spatial distribution, and shape.
The detected pixels are used as seed points for region growing (RG) segmentation. Morphological operators are then applied to reduce false-positive pixels, improving segmentation accuracy. These steps are applied to all slices of the CT exam to obtain the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist, using Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. The linear regression showed a strong association and low dispersion between variables, the Bland-Altman analysis showed no significant differences between the methods, and the Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved robust, fast, and efficient compared with manual segmentation, and it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice, where it may be useful in the diagnosis and treatment of MS diseases.
Keywords: maxillary sinus, support vector machine, region growing, volume quantification
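The agreement statistics used in this comparison can be sketched in a few lines. This is a minimal illustration assuming binary masks and volume lists as NumPy arrays, not the authors' Matlab® implementation:

```python
import numpy as np

def jaccard(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Jaccard similarity coefficient between two binary segmentation masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return intersection / union if union else 1.0

def bland_altman(auto_vols, manual_vols):
    """Mean difference (bias) and 95% limits of agreement between two methods."""
    diffs = np.asarray(auto_vols, float) - np.asarray(manual_vols, float)
    bias = diffs.mean()
    sd = diffs.std(ddof=1)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Toy example: two nearly identical 4x4 masks
a = np.zeros((4, 4), bool); a[1:3, 1:3] = True   # 4 foreground pixels
b = a.copy(); b[3, 3] = True                     # one extra pixel
print(jaccard(a, b))  # 4/5 = 0.8
```

A Jaccard value above 0.90, as reported here, means the automatic and manual masks overlap on at least 90% of their combined foreground.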
Procedia PDF Downloads 504

536 Lignin Valorization: Techno-Economic Analysis of Three Lignin Conversion Routes
Authors: Iris Vural Gursel, Andrea Ramirez
Abstract:
Effective utilization of lignin is an important means of developing economically profitable biorefineries. The current literature suggests that large amounts of lignin will become available in second-generation biorefineries. New conversion technologies will therefore be needed to carry lignin transformation well beyond combustion for energy, towards high-value products such as chemicals and transportation fuels. In recent years, significant progress in catalysis has been made to improve the transformation of lignin, and new catalytic processes are emerging. In this work, a techno-economic assessment of two of these novel conversion routes was made, together with a comparison against the more established lignin pyrolysis route. The aim is to provide insights into potential performance and potential hotspots, in order to guide experimental research and ease commercialization by identifying cost drivers, strengths, and challenges early. The lignin conversion routes selected for detailed assessment were: (non-catalytic) lignin pyrolysis as the benchmark, direct hydrodeoxygenation (HDO) of lignin, and hydrothermal lignin depolymerisation. The products generated were mixed oxygenated aromatic monomers (MOAMON), light organics, heavy organics, and char. For the technical assessment, a basic design was developed, followed by process modelling in Aspen using experimental yields. A design capacity of 200 kt/year lignin feed was chosen, equivalent to a 1 Mt/year lignocellulosic biorefinery. The downstream equipment was modelled to achieve the separation of the defined product streams. For determining the external utility requirement, heat integration was considered, and where possible, gases were combusted to cover the heating demand. The models were used to generate the necessary data on material and energy flows. Next, an economic assessment was carried out by estimating operating and capital costs. Return on investment (ROI) and payback period (PBP) were used as indicators.
The results of the process modelling indicate that a series of separation steps is required. The downstream processing was found to be especially demanding in the hydrothermal upgrading process due to the presence of a significant amount of unconverted lignin (34%) and water; external utility requirements were also found to be high. Due to the complex separations, the hydrothermal upgrading process showed the highest capital cost (50 M€ more than the benchmark), whereas operating costs were highest for the direct HDO process (20 M€/year more than the benchmark) due to the use of hydrogen. Because of high yields of valuable heavy organics (32%) and MOAMON (24%), the direct HDO process showed the highest ROI (12%) and the shortest PBP (5 years). This process is found to be feasible, with a positive net present value; however, it is very sensitive to the prices used in the calculation. The assessments at this stage are associated with large uncertainties. Nevertheless, they are useful for comparing alternatives and identifying whether a certain process should be given further consideration. Among the three processes investigated here, the direct HDO process was seen to be the most promising.
Keywords: biorefinery, economic assessment, lignin conversion, process design
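The two economic indicators used in this assessment can be sketched in their simplest undiscounted form. The numbers below are illustrative assumptions only, not the study's actual cost data (the study's ROI and PBP come from a fuller cash-flow analysis):

```python
def roi_and_payback(capital_cost, annual_revenue, annual_opex):
    """Simple (undiscounted) return on investment and payback period."""
    annual_profit = annual_revenue - annual_opex
    roi = annual_profit / capital_cost          # fraction per year
    payback_years = capital_cost / annual_profit
    return roi, payback_years

# Illustrative numbers only (arbitrary monetary units):
roi, pbp = roi_and_payback(capital_cost=100.0, annual_revenue=32.0, annual_opex=20.0)
print(f"ROI = {roi:.0%}, payback = {pbp:.1f} years")
```

Note that in this simple form PBP is just the reciprocal of ROI; a discounted cash-flow treatment, as implied by the study's positive net present value, separates the two.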
Procedia PDF Downloads 262

535 On Elastic Anisotropy of Fused Filament Fabricated Acrylonitrile Butadiene Styrene Structures
Authors: Joseph Marae Djouda, Ashraf Kasmi, François Hild
Abstract:
Fused filament fabrication is one of the most widespread additive manufacturing techniques because of its low-cost implementation. Its initial development was based on part fabrication with thermoplastic materials. The influence of manufacturing parameters such as the filament orientation through the nozzle, the deposited layer thickness, or the deposition speed on the mechanical properties of the parts has been widely investigated experimentally, and remarkable variations in anisotropy as a function of the filament path during fabrication have been recorded. However, constitutive models describing the resulting mechanical properties are lacking. In this study, integrated digital image correlation (I-DIC) is used to identify the mechanical constitutive parameters of two configurations of ABS samples: +/-45° and a so-called “oriented deposition,” in which the filament was deposited so as to follow the principal strain of the sample. An identification scheme is developed that reduces the gap between simulation and experiment directly from images recorded on a single sample (a single-edge notched tension specimen). Macroscopic and mesoscopic analyses are conducted from images recorded on both sample surfaces during the tensile test. Elastic and elastoplastic models in isotropic and orthotropic frameworks have been established. It appears that, independently of the sample configuration (filament orientation during fabrication), the elastoplastic isotropic model gives a correct description of the samples' behavior. It is worth noting that this model involves fewer constitutive parameters than the elastoplastic orthotropic model.
This means that the anisotropy of these architectured 3D-printed ABS parts can be neglected when establishing a macroscopic description of their behavior.
Keywords: elastic anisotropy, fused filament fabrication, acrylonitrile butadiene styrene, I-DIC identification
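The gap-reduction principle behind the identification can be illustrated at the constitutive level with a hedged sketch: a grid-search least-squares fit of an elastic-perfectly-plastic law to synthetic stress-strain data. The material parameters, grids, and data below are hypothetical, not the study's (the real I-DIC scheme minimizes the gap in the image domain):

```python
import numpy as np

def stress(strain, E, sigma_y):
    """Elastic-perfectly-plastic response: linear up to yield, flat after."""
    return np.minimum(E * strain, sigma_y)

def identify(strain, measured, E_grid, sy_grid):
    """Grid-search least squares: minimize the gap between model response and
    measurement, in the spirit of I-DIC's gap-reduction scheme."""
    best, best_cost = None, np.inf
    for E in E_grid:
        for sy in sy_grid:
            cost = np.sum((stress(strain, E, sy) - measured) ** 2)
            if cost < best_cost:
                best, best_cost = (E, sy), cost
    return best

# Synthetic "experiment": E = 2000 MPa, yield stress = 40 MPa (typical order for ABS)
strain = np.linspace(0, 0.04, 50)
truth = stress(strain, 2000.0, 40.0)            # stress in MPa, strain dimensionless
E, sy = identify(strain, truth,
                 np.arange(1500.0, 2501.0, 100.0), np.arange(30.0, 51.0, 5.0))
print(E, sy)  # recovers 2000.0 40.0
```

In practice a gradient-based minimizer replaces the grid search, and the residual is computed on gray-level images rather than on a stress-strain curve.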
Procedia PDF Downloads 128

534 “Who Will Marry Me?”: The Marital Status of Disabled Women in India
Authors: Sankalpa Satapathy
Abstract:
The stigma attached to disability is very high in India, and given its patriarchal society, women and their interests have always been pushed to the background. The identity of disabled women is compromised under the social construction of disability, which lowers their self-esteem and hampers their development. Disability policies in India have focused on the provision of educational and employment opportunities to make disabled people economically productive members of society. This preoccupation with the materialistic spheres of their lives has led to a neglect of the private sphere of intimate social relationships and motherhood. This paper seeks to bring the private lives of disabled women to the forefront. Semi-structured in-depth interviews were conducted with twenty-seven women with physical disability (congenital or acquired) from Odisha, a state in India. Sampling was done so as to include women from various strata of society to allow meaningful analysis. In a society where paramount importance is attached to wifehood and motherhood, the chances of marriage for disabled women were very low compared to disabled men. The majority believed that marriage and having a family were meant for non-disabled women and had decided against getting married. The socialization process was found to be a major factor in determining the ideas and aspirations of disabled women, who were clearly sidelined by their families on the issue of marriage. Education and employment levels did not seem to increase the appeal of disabled women to prospective suitors. But not all the women interviewed were closed to the idea of intimate relationships and marriage. Disabled women who were married or hoped to get married in the future were found to have a better body image and greater self-motivation.
It is interesting to understand how these women, who have been brought up to internalize ideas of their unattractiveness, undesirability, asexuality, and inability to care, established identities that have so long been denied to them. With these stories of personal triumph, an attempt is made to reclaim the private spheres that disability policies have abandoned and to make those policies gender-sensitive.
Keywords: disability, gender, marriage, relationships
Procedia PDF Downloads 358

533 Development of a Humanized Anti-CEA Antibody for the Near Infrared Optical Imaging of Cancer
Authors: Paul J Yazaki, Michael Bouvet, John Shively
Abstract:
Surgery for solid gastrointestinal (GI) cancers such as pancreatic, colorectal, and gastric adenocarcinoma remains the mainstay of curative therapy. Complete resection of the primary tumor with negative margins (R0 resection), its draining lymph nodes, and distant metastases offers the optimal surgical benefit. Real-time fluorescence-guided surgery (FGS) promises to improve GI cancer outcomes and is rapidly advancing with tumor-specific antibody-conjugated fluorophores that can be imaged using near-infrared (NIR) technology. Carcinoembryonic antigen (CEA) is a non-internalizing tumor antigen validated as a surface tumor marker expressed in >95% of colorectal, 80% of gastric, and 60% of pancreatic adenocarcinomas. Our humanized anti-CEA hT84.66-M5A (M5A) monoclonal antibody (mAb) was conjugated with the NHS-IRDye800CW fluorophore and shown to rapidly and effectively produce NIR optical images of orthotopically implanted human colon and pancreatic cancer in mouse models. A limitation observed is that these NIR-800 dye-conjugated mAbs clear rapidly from the blood, leading to a narrow timeframe for FGS and requiring high doses for effective optical imaging. We developed a novel antibody-fluorophore conjugate by incorporating a PEGylated sidearm linker to shield or mask the IR800 dye’s hydrophobicity, which effectively extended the agent’s blood circulation half-life, leading to increased tumor sensitivity and lowered normal hepatic uptake. We hypothesize that our unique anti-CEA mAb linked to the IR800 fluorophore by this PEGylated sidearm, M5A-SW-IR800, will become the next-generation optical imaging agent: safe, effective, and widely applicable for intraoperative image-guided surgery in CEA-expressing GI cancers.
Keywords: optical imaging, anti-CEA, cancer, fluorescence-guided surgery
Procedia PDF Downloads 147

532 The Greek Revolution Through the Foreign Press: The Case of Newspaper the London Times in the Period 1821-1828
Authors: Euripides Antoniades
Abstract:
In 1821, the Greek revolutionary movement, under the political influence of the French Revolution and the corresponding movements in Italy, Germany, and America, demanded the liberation of the nation and the establishment of an independent national state. Published topics in the British press regarding the Greek Revolution focused on: a) the right of the Greeks to claim their freedom from Turkish domination in order to establish an independent state based on the principle of national autonomy, b) criticism of Turkish rule as illegal and of the power of the Ottoman Sultan as arbitrary, c) the recognition of the Greek identity and its distinction from the Turkish one, and d) the endorsement of the Greeks as descendants of the ancient Greeks. The London Times, as a newspaper, presented in chronological or thematic order the news, opinions, and announcements about the most important events occurring in a place during a specified period of time. A combination of qualitative and quantitative content analysis was applied: an attempt was made to record references to the Greek Revolution, along with the usage of specific words and expressions contributing to the representation of the historical events and their exposure to the reading public. Key findings of this research reveal that a) The Times of London carried frequent, impassioned daily articles concerning the events in Greece, notable for their length and context, b) British public opinion was influenced by this particular newspaper, and c) the newspaper published various news items about the revolution, adopting the role of animator of the Greek struggle. Indeed, this type of news was a main element of The London Times’ structure, establishing a positive image of the Greek Revolution and contributing to European diplomatic developments.
These factors brought about a change in the attitudes of the British and the Russians, each of whom came to assume a more positive approach towards Greece.
Keywords: Greece, revolution, press, The London Times, Great Britain, mass media
Procedia PDF Downloads 88

531 The Influence of the State on the Internal Governance of Universities: A Comparative Study of Quebec (Canada) and Western Systems
Authors: Alexandre Beaupré-Lavallée, Pier-André Bouchard St-Amant, Nathalie Beaulac
Abstract:
The question of the internal governance of universities is a political and scientific debate in the province of Quebec (Canada). Governments have called or set up inquiries on the subject on three separate occasions since the complete overhaul of the educational system in the 1960s: the Parent Commission (1967), the Angers Commission (1979), and the Summit on Higher Education (2013). All three produced reports that highlight the constant tug-of-war for authority and legitimacy within universities. Past and current research covering Quebec universities has studied several aspects of internal governance: the structure as a whole or only some parts of it, the importance of certain key aspects such as collegiality or strategic planning, or of stakeholders such as students or administrators. External governance has also been studied, though, as with internal governance, research has so far only covered well-delineated topics like financing policies or the overall impact of wider societal changes such as New Public Management (NPM). The latter is often brought up as a factor that influenced overall State policies like “steering-at-a-distance” or internal shifts towards “managerialism.” Yet, to the authors’ knowledge, there is no study that specifically maps how the Quebec State formally influences internal governance. In addition, most studies of the Quebec university system are not comparative in nature. This paper presents a portion of the results produced by a 2022-2023 study that aims at filling these last two gaps in knowledge. Building on existing governmental, institutional, and scientific papers, we documented the legal and regulatory framework of the Quebec university system and of twenty-one other university systems in North America and Europe (2 in Canada, 2 in the USA, 16 in Europe, with the addition of the European Union as a distinct case).
This allowed us to map the presence (or absence) of mandatory governance structures enforced by States, as well as their composition. Then, using Clark’s “triangle of coordination,” we analyzed each system to assess the relative influence of the market, the State, and the collegium upon the governance model put in place. Finally, we compared all 21 non-Quebec systems to characterize the province’s policies from an internal perspective. Preliminary findings are twofold. First, when all systems are placed on a continuum ranging from “no State interference in internal governance” to “State-run universities,” Quebec comes in the middle of the pack, albeit with a slight lean towards institutional freedom. When it comes to overall governance (such as Boards and Senates), the dual nature of the Quebec system, with its public universities and its co-opted yet historically private (or ecclesiastic) institutions, in fact mimics the duality of all university systems. Second, however, is the sheer abundance of legal and regulatory mandates from the State that, while not expressly addressing internal governance, seem to require de facto modification of internal governance structures and dynamics to ensure institutional conformity with those mandates. This study is only a fraction of the research needed to better understand State-university interactions regarding governance; we hope it will set the stage for future studies.
Keywords: internal governance, legislation, Quebec, universities
Procedia PDF Downloads 85

530 Optimization of the Jatropha curcas Supply Chain as a Criteria for the Implementation of Future Collection Points in Rural Areas of Manabi-Ecuador
Authors: Boris G. German, Edward Jiménez, Sebastián Espinoza, Andrés G. Chico, Ricardo A. Narváez
Abstract:
The unique flora and fauna of the Galapagos Islands have leveraged tourism-driven growth in the islands. Nonetheless, such development is energy-intensive and requires thousands of gallons of diesel each year for thermoelectric electricity generation. The required transport of fossil fuels from the continent has generated oil spillages and harm to the fragile ecosystem of the islands. The Zero Fossil Fuels initiative for the Galapagos, proposed by the Ecuadorian government as an alternative to reduce the use of fossil fuels in the islands, considers the replacement of diesel in thermoelectric generators by Jatropha curcas vegetable oil. However, the Jatropha oil supply cannot yet entirely cover the demand for electricity generation in the Galapagos. Within this context, the present work aims to provide an optimization model that can be used as a selection criterion for approving new Jatropha curcas collection points in rural areas of Manabi, Ecuador. For this purpose, existing Jatropha collection points in Manabi were grouped into three regions: north (7 collection points), center (4 collection points), and south (9 collection points). Field work was carried out in every region in order to characterize the collection points, establish the local Jatropha supply, and determine transportation costs. Data collection was complemented using GIS software, and an objective function was defined in order to determine the profit associated with Jatropha oil production. The market prices of both Jatropha oil and residual cake were considered for the total revenue, whereas the Jatropha price, transportation, and oil extraction costs were considered for the total cost. The tonnes of Jatropha fruit and seed transported from collection points to the extraction plant were considered as variables. The maximum and minimum amounts of Jatropha collected from each region constrained the optimization problem.
The supply chain was optimized using linear programming in order to maximize profit. Finally, a sensitivity analysis was performed in order to find a profit-based criterion for the acceptance of future collection points in Manabi. The maximum profit reached a value of $4,616.93 per year, which represents a total collection of 62.3 tonnes of Jatropha per year. The northern region of Manabi had the biggest collection share (69%), followed by the southern region (17%). The criterion for accepting new Jatropha collection points in the rural areas of Manabi can be defined by the current maximum profit of the zone and by the variation in profit when collection points are removed one at a time. The definition of new feasible collection points plays a key role in the supply chain associated with Jatropha oil production. Therefore, a mathematical model that assists decision makers in establishing new collection points while assuring profitability contributes to guaranteeing a continued Jatropha oil supply for the Galapagos and sustained economic growth in the rural areas of Ecuador.
Keywords: collection points, Jatropha curcas, linear programming, supply chain
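A linear program of this shape can be sketched in miniature. With a single plant-capacity constraint, per-region supply bounds, and a linear profit objective, a greedy allocation (fill the most profitable region first after meeting the minimums) attains the LP optimum; all region names, prices, and bounds below are illustrative assumptions, not the study's data:

```python
def optimise_collection(profit_per_tonne, min_supply, max_supply, plant_capacity):
    """Maximise total profit subject to a single capacity constraint and
    per-region supply bounds. For this LP structure, greedy allocation in
    decreasing order of unit profit is exact."""
    regions = sorted(profit_per_tonne, key=profit_per_tonne.get, reverse=True)
    alloc = dict(min_supply)                       # minimum supplies are mandatory
    remaining = plant_capacity - sum(alloc.values())
    for r in regions:
        extra = min(max_supply[r] - alloc[r], remaining)
        alloc[r] += extra
        remaining -= extra
    profit = sum(profit_per_tonne[r] * alloc[r] for r in alloc)
    return alloc, profit

# Hypothetical per-region data (illustrative only):
profit = {"north": 80.0, "center": 50.0, "south": 65.0}   # net $/tonne
lo = {"north": 10.0, "center": 2.0, "south": 5.0}         # tonnes/yr minimum
hi = {"north": 45.0, "center": 10.0, "south": 15.0}       # tonnes/yr maximum
alloc, total = optimise_collection(profit, lo, hi, plant_capacity=62.3)
print(alloc, total)
```

With more coupling constraints (e.g. shared transport capacity), a general LP solver would replace the greedy rule, but the objective and bound structure stay the same.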
Procedia PDF Downloads 434

529 Effect of Silica Fume at Cellular Sprayed Concrete
Authors: Kyong-Ku Yun, Seung-Yeon Han, Kyeo-Re Lee
Abstract:
Silica fume, a super-fine byproduct of ferrosilicon or silicon metal production, has a filling effect on micro air voids and on the transition zone in a hardened cement paste given appropriate mixing, placement, and curing. It also exhibits a pozzolanic reaction, which enhances the interior density of the hydrated cement paste through the formation of calcium silicate hydrate. Substituting cement with silica fume therefore improves watertightness and durability through the filling effect and the pozzolanic reaction. However, because of its fineness and high specific surface area, a high-range water reducer or superplasticizer is needed to distribute silica fume through the concrete. To distribute it evenly, cement manufacturers produce a pre-blended silica fume cement and supply it to the market; the special mixing procedures and additional transportation add cost, resulting in a high price for pre-blended silica fume cement. The purpose of this study was to investigate the dispersion of silica fume by air slurry and its effect on the mechanical properties of ready-mixed concrete. The results are as follows. The dispersion of silica fume was measured through an analysis of the standard deviation of compressive strength test results: the standard deviation decreased as the air bubble content increased, meaning that dispersion improved with increasing air bubble content. The rapid chloride permeability test showed that permeability resistance increased as the percentage of silica fume increased, but decreased as the quantity of mixed-in air bubbles increased. Image analysis showed that the spacing factor decreased and the specific surface area increased as the quantity of mixed-in air bubbles increased.
Keywords: cellular sprayed concrete, silica fume, deviation, permeability
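The dispersion metric used here is simply the sample standard deviation of repeated strength tests; a lower value means more uniform distribution of silica fume through the mix. A minimal sketch (the strength values are made-up illustrations, not the study's measurements):

```python
import statistics

def dispersion(strengths):
    """Sample standard deviation of compressive-strength results (MPa);
    a lower value indicates better dispersion of silica fume in the mix."""
    return statistics.stdev(strengths)

# Hypothetical strength sets (MPa) at low vs. high air-bubble content:
low_air  = [41.2, 44.8, 39.5, 45.1]
high_air = [42.0, 42.6, 41.8, 42.4]
print(dispersion(low_air) > dispersion(high_air))  # True: more air, tighter spread
```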
Procedia PDF Downloads 132

528 Arthroscopic Assisted Fibertape Technique For Recurrent MPFL Reconstruction - Case Series Done In The UK Population
Authors: Naufal Ahmed, Michael Lwin
Abstract:
Background: MPFL reconstructions are ideally performed with autografts such as the gracilis or semitendinosus tendon, which may be associated with donor-site morbidity and complications. In this case series, we instead used fiber tape, which avoids these complications and keeps the graft virgin. This kind of synthetic graft has been used successfully in rotator cuff and ACJ reconstructions with good results. Materials and methods: This was a retrospective data analysis of 45 patients who underwent the procedure from 2014-2020 under a single consultant in a DGH. These patients were followed up at 6 weeks, 6 months, 1 year, and 1½ years with clinical assessment and KOOS scores. We compared the results with the NJR and with the Belgium report and found them to be satisfactory and comparable. Surgical technique: We used Arthrex fiber tape for the reconstruction of the MPFL. Initially, two parallel holes were drilled over the superior aspect of the patella with the help of an image intensifier, and the fiber wire was then passed through them from the medial to the lateral side and back to the medial side. The fiber wire was attached at the Schottle point on the femoral side, giving good extra-articular internal bracing to the MPFL. All patients were scoped before the procedure, and the final tightening on the femoral side was done directly under vision to check the position of the patella. Results: We performed 45 MPFL reconstructions along with 4 additional procedures: 1 ACL reconstruction, 2 ACL repairs, and 1 TTT advancement (revision MPFL). There were 14 males and 31 females, with an average age of 25 (13-55). We have had no donor-site morbidity, no infections, no fractures, no recurrent dislocations, and no reoperations yet. Conclusion: Fiber tape is a feasible and appropriate option for MPFL reconstruction. We have not seen any reoperation in our 5-year follow-up. This technique avoids the use of autograft, which can be preserved for future revision surgeries if needed.
We lose nothing by following this simple, novel technique.
Keywords: arthroscopy, fibertape, MPFL reconstruction, recurrent patella dislocation
Procedia PDF Downloads 140

527 Vision and Challenges of Developing VR-Based Digital Anatomy Learning Platforms and a Solution Set for 3D Model Marking
Authors: Gizem Kayar, Ramazan Bakir, M. Ilkay Koşar, Ceren U. Gencer, Alperen Ayyildiz
Abstract:
Anatomy classes are crucial to the general education of medical students, yet learning anatomy is quite challenging and requires memorization of thousands of structures. In traditional teaching methods, learning materials are still based on books, anatomy mannequins, or videos, and many important structures are forgotten after a few years. More interactive teaching methods such as virtual reality, augmented reality, gamification, and motion sensors are becoming popular, since they ease learning and keep the material in mind for longer. In this study, we designed a virtual-reality-based digital head anatomy platform to investigate whether a fully interactive anatomy platform is effective for learning anatomy and to understand the level of teaching and learning optimization. The head is one of the most complicated structures in human anatomy, with thousands of tiny, unique substructures, which makes head anatomy one of the most difficult parts to understand during class sessions. We therefore developed a fully interactive digital tool with 3D model marking, quiz structures, 2D/3D puzzle structures, and VR support, integrating the power of VR and gamification. The project was developed in the Unity game engine with an HTC Vive Cosmos VR headset. The head anatomy 3D model was selected with full skeletal, muscular, integumentary, head, teeth, lymph, and vein systems. The biggest issue during development was the complexity of the model and marking it in the 3D world coordinate system: 3D model marking requires access to each unique structure in the listed subsystems, which means hundreds of markings need to be made. Some parts of our 3D head model were monolithic, so we worked on dividing such parts into subparts, which is very time-consuming. To subdivide monolithic parts, one must use an external modeling tool.
However, such tools generally come with steep learning curves, and seamless division is not ensured. The second option was to attach tiny colliders to all unique items for mouse interaction; however, outer colliders that cover inner trigger colliders cause overlapping, and these colliders repel each other. The third option was raycasting; however, due to its view-based nature, raycasting has inherent problems: as the model rotates, the view direction changes frequently, and directional computations become even harder. This is why we finally settled on the local coordinate system. Taking the pivot point of the model (the back of the nose) as reference, each substructure is marked with its own local coordinate with respect to the pivot. After converting the mouse position to a world position and checking its relation to the corresponding structure’s local coordinate, we were able to mark all points correctly. The advantage of this method is its applicability and accuracy for all types of monolithic anatomical structures.
Keywords: anatomy, e-learning, virtual reality, 3D model marking
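The local-coordinate marking idea can be sketched outside Unity. This is a minimal NumPy illustration; the rotation matrix, pivot, tolerance, and helper names are assumptions for the sketch, not the project's actual code:

```python
import numpy as np

def world_to_local(world_point, pivot_world, rotation):
    """Convert a world-space point (e.g. a mouse hit position) into the model's
    local frame anchored at the pivot, undoing the model's current rotation so
    stored marker coordinates stay valid as the model turns."""
    R = np.asarray(rotation)                    # 3x3 rotation of the model
    return R.T @ (np.asarray(world_point) - np.asarray(pivot_world))

def is_hit(world_point, pivot_world, rotation, marker_local, tol=0.01):
    """A structure counts as 'marked' when the converted point lies within
    tol of its stored local coordinate."""
    local = world_to_local(world_point, pivot_world, rotation)
    return bool(np.linalg.norm(local - np.asarray(marker_local)) <= tol)

# Example: model rotated 90 degrees about the vertical axis, pivot moved to (5,5,5)
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
marker = np.array([1.0, 0.0, 0.0])              # local coords relative to the pivot
world = Rz @ marker + np.array([5.0, 5.0, 5.0])
print(is_hit(world, [5.0, 5.0, 5.0], Rz, marker))  # True
```

Because the comparison happens in the model's local frame, the stored marker coordinates are independent of how the head is rotated or positioned in the scene.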
Procedia PDF Downloads 100

526 Mechanical Properties of Lithium-Ion Battery at Different Packing Angles Under Impact Loading
Authors: Wei Zhao, Yuxuan Yao, Hao Chen
Abstract:
In order to determine the mechanical properties and failure behavior of lithium-ion batteries, drop-hammer impact experiments and finite element simulations were carried out on batteries with different packing angles. First, a drop-hammer impact experiment system based on the DHR-1808 drop hammer and an oscilloscope was established, and drop tests of individual batteries and of modules with packing angles of 180° and 120° were carried out. Images of battery deformation, force-time curves, and voltage-time curves were recorded. Second, finite element models of individual batteries and of the two packing angles were established, and test and simulation results were compared. Finally, the mechanical characteristics and failure behavior of lithium-ion battery modules with a 6 × 6 packed arrangement and packing angles of 180°, 120°, 90°, and 60° were analyzed at the same velocity with different packing angles, and at the same impact energy with different impact velocities and packing angles. The results show that an individual battery is destroyed completely in the drop-hammer impact test with an initial impact velocity of 3 m/s and a drop height of 459 mm, with the voltage dropping close to 0 V by the end of the test. The voltage drops to 12 V for a packing angle of 180° and to 3.6 V for 120°. The trend of the force-time curves from simulation and experiment is generally consistent; the difference in maximum peak value is 3.9 kN for a packing angle of 180° and 1.3 kN for 120°. Under the same impact velocity and impact energy, the strain rate of the battery module with a packing angle of 180° is the lowest, and the maximum stress reaches 26.7 MPa with no battery short-circuited.
Our experiments and simulations show that the lithium-ion battery module with a packing angle of 180° is the least likely to be damaged, as it sustains the maximum stress under the same impact load.
Keywords: battery module, finite element simulation, power battery, packing angle
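As a quick consistency check on the test setup above, the free-fall relation v = √(2gh) links the reported 459 mm drop height to the 3 m/s initial impact velocity:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def impact_velocity(drop_height_m):
    """Free-fall impact speed from drop height: v = sqrt(2 g h)."""
    return math.sqrt(2 * G * drop_height_m)

def drop_height(velocity_ms):
    """Drop height needed for a target impact speed: h = v^2 / (2 g)."""
    return velocity_ms ** 2 / (2 * G)

# A 459 mm drop height is consistent with the 3 m/s initial impact speed:
print(round(impact_velocity(0.459), 2))  # 3.0
```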
Procedia PDF Downloads 73

525 Prediction of Positive Cloud-to-Ground Lightning Striking Zones for Charged Thundercloud Based on Line Charge Model
Authors: Surajit Das Barman, Rakibuzzaman Shah, Apurv Kumar
Abstract:
Bushfires are known as one of the leading factors in creating pyrocumulus thunderclouds, which cause the ignition of new fires through pyrocumulonimbus (pyroCb) lightning strikes and create major losses of lives and property worldwide. Conceptual model-based risk planning would be beneficial for predicting the lightning striking zones on the surface of the earth underneath a pyroCb thundercloud. A pyroCb thundercloud can generate both positive cloud-to-ground (+CG) and negative cloud-to-ground (-CG) lightning, of which +CG tends to ignite more bushfires and causes massive damage to nature and infrastructure. In this paper, a simple line-charge-structured thundercloud model is constructed in 2-D coordinates using the method of image charges to predict the probable +CG lightning striking zones on the earth’s surface for two conceptual thundercloud charge configurations: a tilted dipole structure and a conventional tripole structure with an excessive lower positive charge region, both of which lead to +CG lightning. The electric potential and surface charge density along the earth’s surface are investigated for both structures by continuously adjusting the position and charge density of their charge regions. Simulation results for the tilted dipole structure confirm the down-shear extension of the upper positive charge region in the direction of the cloud’s forward flank by 4 to 8 km, resulting in negative surface charge density, with +CG lightning expected to strike within 7.8 km to 20 km around the earth's periphery in the direction of the cloud’s forward flank. On the other hand, the conceptual tripole charge structure with an enhanced lower positive charge region develops negative surface charge density on the earth’s surface in the range |x| < 6.5 km beneath the thundercloud and highly favors +CG lightning strikes.
Keywords: pyrocumulonimbus, cloud-to-ground lightning, charge structure, surface charge density, forward flank
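The method of image charges has a closed form for an infinite line charge above a grounded plane: the induced surface charge density is σ(x) = -λh / (π(x² + h²)), obtained by superposing the line charge λ at height h with its mirror image -λ at -h. A hedged sketch follows; the charge magnitudes and altitudes below are illustrative assumptions, not the paper's model parameters:

```python
import math

def surface_charge_density(x, line_charge, height):
    """Induced surface charge density (C/m^2) on a grounded plane beneath an
    infinite horizontal line charge, via its mirror image:
    sigma(x) = -lambda * h / (pi * (x^2 + h^2))."""
    return -line_charge * height / (math.pi * (x ** 2 + height ** 2))

# Superpose the cloud's charge regions; where the total is negative, +CG
# strikes are favoured. Illustrative values (C/m, m): an upper negative
# region at 7 km and a lower positive region at 3 km.
regions = [(-0.5e-3, 7000.0), (+0.3e-3, 3000.0)]

def total_sigma(x):
    return sum(surface_charge_density(x, lam, h) for lam, h in regions)

print(total_sigma(0.0) < 0)  # True: net negative directly below this configuration
```

The sign convention matches the abstract: a lower positive charge region induces negative surface charge density beneath the cloud, marking the zone where +CG strikes are favoured.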
Procedia PDF Downloads 113
524 The Meaningful Pixel and Texture: Exploring Digital Vision and Art Practice Based on Chinese Cosmotechnics
Authors: Xingdu Wang, Charlie Gere, Emma Rose, Yuxuan Zhao
Abstract:
The study introduces a fresh perspective on the digital realm through an examination of the Chinese concept of Xiang, elucidating how it can support an understanding of pixels and textures on screens as digital trigrams. This concept offers an outlook on the intersection of digital technology and the natural world, thereby contributing to discussions about the harmonious relationship between humans and technology. The study turns to the ancient Chinese theory of Xiang as a key to establishing theories and practices that respond to the problem of contemporary Chinese technics. Xiang is a Chinese method of understanding the essentials of things through appearances, which differs from the scientific method of the West. Xiang, the foundation of Chinese visual art, is rooted in ancient Chinese philosophy and connected to the eight trigrams, and its discussion connects art, philosophy, and technology. This paper connects the meaning of Xiang with 'truth appearing' philosophically, through an analysis of the concepts of phenomenon and noumenon and the unique Chinese way of observing. The paper then traces the historical interconnection between ancient painting and writing in China, emphasizing the relationship between technical craftsmanship and artistic expression. In the digital realm, the paper theoretically blurs the traditional boundary between images and text on screens. Lastly, this study identifies an ensemble concept relating pixels and textures in computer vision, drawing inspiration from AI image recognition of Chinese paintings. In art practice, by presenting a fluid visual experience in the form of pixels that mimics the flow of lines in traditional calligraphy and painting, it is hoped that the viewer will be brought back to the process of truth appearing as defined by Xiang.
Keywords: Chinese cosmotechnics, computer vision, contemporary Neo-Confucianism, texture and pixel, Xiang
Procedia PDF Downloads 66
523 Analysis of Electric Mobility in the European Union: Forecasting 2035
Authors: Domenico Carmelo Mongelli
Abstract:
The context is one of great uncertainty in the 27 countries of the European Union, which has adopted an epochal measure: the elimination of internal combustion engines for road vehicle traction from 2035, with complete replacement by electric vehicles. While there is great concern at many levels about unpreparedness for this change, the scientific community has yet to produce comprehensive studies of the problem: the literature deals with single aspects of the issue, and often at the level of individual countries, losing sight of its global implications for the entire EU. The aim of this research is to fill these gaps: the technological, plant-engineering, environmental, economic, and employment aspects of the energy transition are addressed and interconnected, comparing the current situation with the scenarios that could exist in 2035 and in the following years, until the complete disposal of the internal combustion vehicle fleet across the EU. The methodology consists of analyzing the entire life cycle of electric vehicles and batteries, using specific databases, and dynamically simulating, with dedicated calculation codes, the application of the results of this analysis to the entire EU electric vehicle fleet from 2035 onwards.
Energy balances are drawn up to evaluate the net energy saved; plant balances to determine the additional power and electricity demand and to size the new renewable-source plants required to cover it; economic balances to determine the investment costs of the transition, the savings during the operation phase, and the payback times of the initial investments; environmental balances, under the different energy-mix scenarios expected for 2035, to determine the reductions in CO2eq and the environmental effects resulting from increased lithium production for batteries; and employment balances, estimating how many jobs will be lost and recovered in the reconversion of the automotive industry, related industries, and the refining, distribution, and sale of petroleum products, and how many will be created by technological innovation, the increased demand for electricity, and the construction and management of street charging columns. New forecast-optimization algorithms are developed, tested, and validated. Compared with previously published work, the research adds an overall picture of the energy transition, capturing the advantages and disadvantages of its different aspects and evaluating magnitudes and improvement measures within a single organic framework. The results identify the strengths and weaknesses of the energy transition, determine possible solutions to mitigate the weaknesses, and simulate and evaluate their effects, establishing the most suitable solutions to make this transition feasible.
Keywords: engines, Europe, mobility, transition
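As a toy illustration of the kind of balance arithmetic the study performs at EU scale, an annual net-energy saving and a simple undiscounted payback time might be computed as below. All figures are hypothetical round numbers, not the study's database-derived values:

```python
def net_energy_saved_twh(fleet_size, km_per_year, ice_kwh_per_km, ev_kwh_per_km):
    """Annual final-energy saving (TWh) from replacing an ICE fleet with EVs.
    Inputs are illustrative assumptions, not the study's LCA figures."""
    return fleet_size * km_per_year * (ice_kwh_per_km - ev_kwh_per_km) / 1e9

def payback_years(extra_investment_eur, annual_savings_eur):
    """Simple (undiscounted) payback time of the transition investment."""
    if annual_savings_eur <= 0:
        raise ValueError("no payback without positive annual savings")
    return extra_investment_eur / annual_savings_eur

# e.g. 250 million vehicles, 12,000 km/year each, 0.6 kWh/km of fuel for ICE
# versus 0.2 kWh/km of electricity for EVs -> roughly 1200 TWh saved per year
saving = net_energy_saved_twh(250e6, 12_000, 0.6, 0.2)
```

The study's actual balances additionally account for upstream energy in fuel refining, battery manufacturing, and the 2035 electricity mix, which this sketch omits.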
Procedia PDF Downloads 63
522 Improvements in Transient Testing in The Transient REActor Test (TREAT) with a Choice of Filter
Authors: Harish Aryal
Abstract:
The safe and reliable operation of nuclear reactors has always been one of the topmost priorities in the nuclear industry. Transient testing allows us to understand the time-dependent behavior of the neutron population in response to either a planned change in reactor conditions or unplanned circumstances. These unforeseen conditions might occur due to sudden reactivity insertions, feedback, power excursions, instabilities, or accidents. To study such behavior, we need transient testing, which is like car crash testing used to estimate the durability and strength of a car design. In nuclear design, such transient testing can simulate a wide range of accidents due to sudden reactivity insertions and helps to study the feasibility and integrity of the fuel to be used in certain reactor types. This testing involves a high neutron flux environment and real-time imaging technology, with advanced instrumentation of appropriate accuracy and resolution to study the fuel slumping behavior. With the aid of transient testing and adequate imaging tools, it is possible to test the safety basis for reactor and fuel designs, serving as a gateway to licensing advanced reactors in the future. To that end, it is crucial to fully understand advanced imaging techniques both analytically and via simulations. This paper presents an innovative method of supporting real-time imaging of fuel pins and other structures during transient testing. The major fuel-motion detection device studied in this work is the hodoscope, which requires collimators. This paper provides 1) an MCNP model and simulation of a Transient Reactor Test (TREAT) core with a central fuel element replaced by a slotted fuel element that provides an open path between test samples and a hodoscope detector, and 2) a choice of filter that improves image resolution.
Keywords: hodoscope, transient testing, collimators, MCNP, TREAT, hodogram, filters
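The role of a filter in improving image contrast can be illustrated with a simple Beer-Lambert attenuation sketch: a good filter transmits the signal of interest while suppressing background radiation. The attenuation coefficients below are hypothetical placeholders; the actual filter evaluation in the paper relies on MCNP transport simulation, not this closed-form approximation:

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Beer-Lambert transmission I/I0 = exp(-mu * t) through a filter slab."""
    return math.exp(-mu_per_cm * thickness_cm)

def contrast_gain(mu_signal, mu_background, thickness_cm):
    """Factor by which a filter improves signal-to-background ratio:
    exp((mu_background - mu_signal) * t), i.e. the ratio of the two
    transmitted fractions."""
    return math.exp((mu_background - mu_signal) * thickness_cm)

# Hypothetical filter that weakly attenuates the signal (mu = 0.05 /cm)
# and strongly attenuates the background (mu = 0.40 /cm), 2 cm thick.
gain = contrast_gain(mu_signal=0.05, mu_background=0.40, thickness_cm=2.0)
```

The trade-off this exposes is the one driving filter choice: a thicker filter raises contrast exponentially but also cuts the absolute signal, degrading counting statistics.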
Procedia PDF Downloads 77
521 Numerical Study of a Ventilation Principle Based on Flow Pulsations
Authors: Amir Sattari, Mac Panah, Naeim Rashidfarokhi
Abstract:
To enhance the mixing of fluid in a rectangular enclosure with a circular inlet and outlet, an energy-efficient approach is investigated through computational fluid dynamics (CFD). Particle image velocimetry (PIV) measurements confirm that pulsation of the inflow velocity considerably improves the mixing performance inside the enclosure without increasing energy consumption. In this study, multiple CFD simulations with different turbulence models were performed, and the results were compared with experimental PIV results. The study investigates small-scale representations of flow patterns in a ventilated rectangular room, with the objective of validating the concept of an energy-efficient ventilation strategy that improves thermal comfort and reduces stagnant air inside the room. Experimental and simulated results confirm that, through pulsation of the inflow velocity, strong secondary vortices are generated downstream of the entrance wall-jet. The pulsatile inflow profile promotes a periodic generation of vortices with stronger eddies despite a relatively low inlet velocity, which leads to a larger boundary layer with increased kinetic energy in the occupied zone. A real-scale study was not conducted; however, it can be concluded that a constant-velocity inflow profile can be replaced with a lower pulsated flow-rate profile while preserving the mixing efficiency. Among the turbulence models demonstrated in this study, SST-kω is the most advantageous, exhibiting a global airflow pattern similar to that of the experiments. The detailed near-wall velocity profile is used to identify the wall-jet instabilities, which consist of mixing and boundary layers. The SAS method was later applied to predict the turbulent parameters in the center of the domain. In both cases, the predictions are in good agreement with the measured results.
Keywords: CFD, PIV, pulsatile inflow, ventilation, wall-jet
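A pulsatile inflow boundary condition of the kind described can be sketched as a sinusoidal modulation around the mean velocity that preserves the time-averaged flow rate. The waveform, amplitude ratio, and frequency below are illustrative assumptions, since the abstract does not specify them:

```python
import numpy as np

def pulsatile_inlet(t, u_mean, amplitude_ratio, freq_hz):
    """Inlet velocity u(t) = u_mean * (1 + a * sin(2*pi*f*t)).
    Averaged over whole periods the mean equals u_mean, so the pulsed
    profile delivers the same flow rate as a constant inflow."""
    return u_mean * (1.0 + amplitude_ratio * np.sin(2.0 * np.pi * freq_hz * t))

# Two seconds of a 2 Hz pulsation around a 0.5 m/s mean inlet velocity.
t = np.linspace(0.0, 2.0, 20001)
u = pulsatile_inlet(t, u_mean=0.5, amplitude_ratio=0.8, freq_hz=2.0)
```

In a CFD solver this function would be evaluated per time step as the inlet Dirichlet condition; the equal-mean-flow property is what allows a like-for-like comparison of mixing against the constant-inflow case.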
Procedia PDF Downloads 174
520 Monitoring and Improving Performance of Soil Aquifer Treatment System and Infiltration Basins Performance: North Gaza Emergency Sewage Treatment Plant as Case Study
Authors: Sadi Ali, Yaser Kishawi
Abstract:
As part of Palestine, the Gaza Strip (365 km² and 1.8 million inhabitants) is a semi-arid zone that relies solely on the coastal aquifer, its only water source, of which only 5-10% is suitable for human use. This barely covers the domestic and agricultural needs of the Gaza Strip. The Palestinian Water Authority's strategy is to develop a non-conventional water resource from treated wastewater to irrigate 1,500 hectares and serve over 100,000 inhabitants. A new WWTP project is to replace the old, overloaded Biet Lahia WWTP. The project consists of three parts: phase A (a pressure line and 9 infiltration basins, IBs), phase B (a new WWTP), and phase C (a recovery and reuse scheme, RRS, to capture the spreading plume). Phase A has been functioning since April 2009, and since then a monitoring plan has tracked the infiltration rate (I.R.) of the 9 basins. Nearly 23 million m³ of partially treated wastewater were infiltrated up to June 2014. It is important to maintain an acceptable rate so the basins can handle the incoming quantities (currently 10,000 m³ are pumped and infiltrated daily). The methodology applied was to review and analyze the collected data, including the I.R.s, the wastewater quality, and the drying-wetting schedule of the basins. One of the main findings is the relation between total suspended solids (TSS) at BLWWTP and the I.R. at the basins. Since April 2009, the basins scored an average I.R. of about 2.5 m/day; the records then showed a decreasing pattern until the rate reached a low of 0.42 m/day in June 2013, accompanied by an increase of the TSS concentration (mg/L) at the source to above 200 mg/L. Reducing the TSS concentration (by cleaning the wastewater source ponds at the Biet Lahia WWTP site) directly improved the I.R., which rose over the following 6 months from 0.42 m/day to 0.66 m/day and then to nearly 1.0 m/day on average over the last 3 months of 2013.
The wetting-drying scheme of the basins (3 days wetting and 7 days drying) was observed, alongside the rainfall rates. Despite the difficulty of applying this scheme accurately, the flow to each basin was controlled to improve the I.R. The drying-wetting system affected the I.R. of individual basins and thus the overall system rate, which was recorded and assessed. Ploughing activities at the infiltration basins were also recommended at certain times to retain a certain infiltration level, since ploughing breaks the confined clogging layer that prevents infiltration. It is recommended to maintain a proper quality of infiltrated wastewater to ensure an acceptable performance of the IBs. Continual maintenance of the settling ponds at BLWWTP, continual ploughing of the basins, and soil-treatment techniques at the IBs will improve the I.R.s. When the new WWTP starts operating, a high-standard effluent (TSS 20 mg/l, BOD 20 mg/l, and TN 15 mg/l) will be infiltrated, which will enhance the I.R.s of the IBs due to the lower organic load.
Keywords: SAT, wastewater quality, soil remediation, North Gaza
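The monitoring arithmetic behind the reported I.R. figures can be sketched as infiltrated volume per wetted area per day, and the TSS-I.R. relation as a correlation over monitoring records. The sample series below mimics the trend reported above (rising TSS, falling I.R.) but is invented for illustration, not the project's measured data:

```python
import numpy as np

def infiltration_rate(volume_m3, basin_area_m2, wetting_days):
    """Average infiltration rate (m/day) = volume / (wetted area * days)."""
    return volume_m3 / (basin_area_m2 * wetting_days)

# Illustrative monitoring records (not the project's actual series):
tss_mg_per_l = np.array([80.0, 120.0, 160.0, 200.0, 220.0])   # source TSS
ir_m_per_day = np.array([2.5, 1.8, 1.1, 0.60, 0.42])          # basin I.R.
r = np.corrcoef(tss_mg_per_l, ir_m_per_day)[0, 1]             # strongly negative
```

A strongly negative correlation coefficient of this kind is what supports the operational conclusion that reducing TSS at the source (cleaning the settling ponds) recovers infiltration capacity.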
Procedia PDF Downloads 234
519 Development of a Model for Predicting Radiological Risks in Interventional Cardiology
Authors: Stefaan Carpentier, Aya Al Masri, Fabrice Leroy, Thibault Julien, Safoin Aktaou, Malorie Martin, Fouad Maaloul
Abstract:
Introduction: During an interventional radiology (IR) procedure, the patient's skin dose may become high enough for burns, necrosis, and ulceration to appear. To prevent these deterministic effects, predicting the patient's peak skin dose is important for improving the post-operative care given to the patient. The objective of this study is to estimate, before the intervention, the patient dose for chronic total occlusion (CTO) procedures by selecting relevant clinical indicators. Materials and methods: 103 procedures were performed in the interventional cardiology (IC) department using a Siemens Artis Zee image intensifier that provides the air kerma of each IC exam. The peak skin dose (PSD) was measured for each procedure using radiochromic films. Patient parameters such as sex, age, weight, and height were recorded. The complexity index (J-CTO score), specific to each intervention, was determined by the cardiologist. A correlation method applied to these indicators made it possible to specify their influence on the dose. A predictive model of the dose was created using multiple linear regression. Results: Of the 103 patients involved in the study, 5 were excluded for clinical reasons and 2 for placement of radiochromic films outside the exposure field; 96 2D dose maps were finally used. The influencing factors with the highest correlation with the PSD are the patient's diameter and the J-CTO score, and the predictive model is based on these parameters. The comparison between estimated and measured skin doses shows an average difference of 0.85 ± 0.55 Gy for doses below 6 Gy. The mean difference between air kerma and PSD is 1.66 ± 1.16 Gy. Conclusion: Using the developed method, a first estimate of the patient's skin dose is available before the start of the procedure, which helps the cardiologist in carrying out the intervention.
This estimate is more accurate than that provided by the air kerma alone.
Keywords: chronic total occlusion procedures, clinical experimentation, interventional radiology, patient's peak skin dose
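The form of the predictive model, PSD ≈ a + b·(patient diameter) + c·(J-CTO score), can be sketched as an ordinary least-squares fit. The training data below are synthetic, generated from made-up coefficients; the study's coefficients were fitted on its 96 measured dose maps:

```python
import numpy as np

def fit_psd_model(diameter_cm, jcto_score, psd_gy):
    """Least-squares fit of PSD = a + b*diameter + c*J-CTO, the multiple
    linear regression form described in the abstract (synthetic data here)."""
    X = np.column_stack([np.ones_like(diameter_cm), diameter_cm, jcto_score])
    coef, *_ = np.linalg.lstsq(X, psd_gy, rcond=None)
    return coef

def predict_psd(coef, diameter_cm, jcto_score):
    """Pre-procedure dose estimate from the two clinical indicators."""
    a, b, c = coef
    return a + b * diameter_cm + c * jcto_score

# Synthetic patients generated from PSD = 0.5 + 0.1*d + 0.8*j (made-up values).
d = np.array([20.0, 25.0, 30.0, 22.0, 28.0, 34.0])   # patient diameter (cm)
j = np.array([0.0, 2.0, 1.0, 3.0, 0.0, 2.0])         # J-CTO score
psd = 0.5 + 0.1 * d + 0.8 * j
coef = fit_psd_model(d, j, psd)
```

In use, `predict_psd(coef, diameter, jcto)` would be evaluated before the procedure to flag patients at risk of exceeding a skin-dose threshold such as 6 Gy.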
Procedia PDF Downloads 138