Search results for: features extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5424

444 An Informative Marketing Platform: Methodology and Architecture

Authors: Martina Marinelli, Samanta Vellante, Francesco Pilotti, Daniele Di Valerio, Gaetanino Paolone

Abstract:

Any development in web marketing technology requires changes in information engineering to identify instruments and techniques suitable for producing software applications for informative marketing. Moreover, for large web solutions, designing an interface that enables human interaction is a complex process that must bridge informative marketing requirements and the developed solution. A user-friendly interface in web marketing applications is crucial for a successful business. The paper introduces mkInfo, a software platform that implements informative marketing: a new interpretation of marketing which places information at the center of every marketing action. The creative team includes software engineering researchers who have recently authored an article on automatic code generation, and they have created the mkInfo software platform to generate informative marketing web applications. For each web application, it is possible to automatically implement an opt-in page, a landing page, a sales page, and a thank-you page: one only needs to insert the content. mkInfo implements an autoresponder to send mail according to a predetermined schedule, and the platform also includes e-commerce for a product or service. A stakeholder can access any opt-in page and get basic information about a product or service; to learn more, the stakeholder must provide an e-mail address to access a landing page, which triggers an e-mail sequence providing complete information about the product or service. From this point on, the stakeholder becomes a user and is able to purchase the product or related services through the mkInfo platform. This paper suggests a possible definition of informative marketing, illustrates its basic principles, and finally details the mkInfo platform that implements it. The paper also offers some informative marketing models, which are implemented in the mkInfo platform.
Informative marketing can be applied to products or services; a separate web application must be realized for each product or service. The mkInfo platform enables the product or service producer to send information concerning a specific product or service to all stakeholders. In conclusion, the technical contributions of this paper are: a different interpretation of marketing based on information; a modular architecture for web applications, particularly those with standard features such as information storage, exchange, and delivery; multiple models to implement informative marketing; and a software platform enabling the implementation of such models in a web application. Future research aims to enable stakeholders to provide information about a product or a service, so that the information gathered includes both the producer's and the stakeholders' points of view. The purpose is to create an all-inclusive management system of the knowledge regarding a specific product or service: a system that includes everything about the product or service and is able to address even unexpected questions.

Keywords: informative marketing, opt-in page, software platform, web application

Procedia PDF Downloads 112
443 Momentum in the Stock Exchange of Thailand

Authors: Mussa Hussaini, Supasith Chonglerttham

Abstract:

Stocks are usually classified according to characteristics unique enough that the performance of each category can be differentiated from another. The reasons behind such classifications in the financial market are sometimes financial innovation, or the discovery of a premium in a group of stocks with similar features. One of the major classifications in the stock market is the momentum strategy, under which stocks are classified according to their past performance into past winners and past losers. Momentum in a stock market refers to the idea that stocks will keep moving in the same direction: in other words, stocks with rising prices (past winner stocks) will continue to rise, and stocks with falling prices (past loser stocks) will continue to fall. The performance of this classification has been well documented in numerous studies in different countries. These studies suggest that past winners tend to outperform past losers in the future. However, academic research in this direction has been limited in countries such as Thailand and, to the best of our knowledge, there has been no such study in Thailand after the financial crisis of 1997. The significance of this study stems from the fact that Thailand is an open market that has been encouraging foreign investment as a means to enhance employment, promote economic development, and foster technology transfer; the main equity market in Thailand, the Stock Exchange of Thailand, is a crucial channel for foreign investment inflow into the country. The equity market size in Thailand increased from $1.72 billion in 1984 to $133.66 billion in 1993, an increase of over 77 times within a decade. The main contribution of this paper is evidence for size categories in the context of the equity market in Thailand. Almost all previous studies have focused solely on large stocks or indices.
This paper extends the scope beyond large stocks and indices by including small and tiny stocks as well. Further, since there is a distinct absence of detailed academic research on momentum strategy in the Stock Exchange of Thailand after the crisis, this paper also contributes to the extension of the existing literature. This research is also of significance for researchers who would like to compare the performance of this strategy across countries and markets. In the Stock Exchange of Thailand, we examined the performance of the momentum strategy from 2010 to 2014. Returns on portfolios are calculated on a monthly basis. Our results confirm that there is a positive momentum profit in large-size stocks, whereas there is a negative momentum profit in small-size stocks during the period of 2010 to 2014. Furthermore, the equal-weighted average of the momentum profit of both the small and large size categories does not provide any indication of an overall momentum profit.
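The winner/loser portfolio construction described in the abstract can be sketched in a few lines. The following is a minimal illustration with made-up monthly returns for four stocks, not the authors' actual methodology or data: stocks are ranked by their cumulative formation-period return, the top half is held long and the bottom half short, and the momentum profit is the equal-weighted return spread in the holding month.

```python
# Momentum strategy sketch: rank by past (formation) returns, then measure
# the winner-minus-loser spread in the holding period. Data is synthetic.
monthly = {
    "A": [0.05, 0.04, 0.06, 0.02],
    "B": [-0.03, -0.02, -0.04, 0.01],
    "C": [0.01, 0.02, 0.00, 0.03],
    "D": [-0.05, -0.06, -0.01, -0.02],
}

def cumret(rets):
    """Cumulative (compounded) return over a list of period returns."""
    prod = 1.0
    for r in rets:
        prod *= 1 + r
    return prod - 1

# Formation period: rank stocks by their first-3-month cumulative return.
formation = {s: cumret(r[:3]) for s, r in monthly.items()}
ranked = sorted(formation, key=formation.get, reverse=True)
winners, losers = ranked[:2], ranked[-2:]

# Holding period (month 4): long winners, short losers, equal-weighted.
profit = (sum(monthly[s][3] for s in winners) / 2
          - sum(monthly[s][3] for s in losers) / 2)
print(round(profit, 4))  # momentum profit for the holding month
```

With these toy numbers the winners (A, C) outperform the losers (B, D) in month 4, giving a positive momentum profit of 0.03.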

Keywords: momentum strategy, past loser, past winner, stock exchange of Thailand

Procedia PDF Downloads 302
442 Gross and Clinical Anatomy of the Skull of Adult Chinkara, Gazella bennettii

Authors: Salahud Din, Saima Masood, Hafsa Zaneb, Habib Ur Rehman, Saima Ashraf, Imad Khan, Muqader Shah

Abstract:

The objective of this study was (1) to study gross morphological, osteometric, and clinically important landmarks in the skull of the adult Chinkara to obtain baseline data and (2) to study sexual dimorphism between male and female adult Chinkara through osteometry. For this purpose, after a postmortem examination, the carcass of an adult Chinkara of known sex and age was buried in the locality of the Manglot Wildlife Park and Ungulate Breeding Centre, Nizampur, Pakistan; after a specific period of time, the bones were unearthed. Gross morphological features and various osteometric parameters of the skull were studied at the University of Veterinary and Animal Sciences, Lahore, Pakistan. The skull of the Chinkara was elongated in shape and comprised thirty-two bones, divided into a cranial and a facial part. The facial region of the skull was formed by the maxilla, incisive, palatine, vomer, pterygoid, frontal, parietal, nasal and turbinate bones, the mandible, and the hyoid apparatus. The bony region of the cranium of the Chinkara comprised the occipital, ethmoid, sphenoid, interparietal, parietal, temporal, and frontal bones. The foramina identified in the facial region of the skull of the Chinkara were the infraorbital, supraorbital, lacrimal, sphenopalatine, maxillary, and caudal palatine foramina. The foramina of the cranium of the skull of the Chinkara were the internal acoustic meatus, external acoustic meatus, hypoglossal canal, transverse canal, sphenorbital fissure, carotid canal, foramen magnum, stylomastoid foramen, foramen rotundum, foramen ovale, jugular foramen, and the rostral and caudal foramina that formed the pterygoid canal.
The measured craniometric parameters did not show statistically significant differences (p > 0.05) between male and female adult Chinkara, except that the palatine bone, OI, DO, IOCDE, OCT, ICW, IPCW, and PCPL were significantly higher (p < 0.05) in male than in female Chinkara, and the mean values of the mandibular parameters, except b and h, were significantly higher (p < 0.05) in male than in female Chinkara. Sexual dimorphism exists in some of the orbital and foramen magnum parameters, while a high level of sexual dimorphism was identified in the mandible. In conclusion, the morphocraniometric study of the Chinkara skull made it possible to identify the species-specific skull and to use clinical measurements in practical application.

Keywords: Chinkara, skull, morphology, morphometrics, sexual dimorphism

Procedia PDF Downloads 271
441 Integrating Computer-Aided Manufacturing and Computer-Aided Design for Streamlined Carpentry Production in Ghana

Authors: Benson Tette, Thomas Mensah

Abstract:

As a developing country, Ghana has high potential to harness the economic value of every industry. Two of the industries that produce below capacity are handicrafts (for instance, carpentry) and information technology (i.e., computer science). To boost production and maintain competitiveness, the carpentry sector in Ghana needs manufacturing procedures that are both more effective and more affordable. This issue can be addressed using computer-aided manufacturing (CAM) technology, which automates the fabrication process and decreases the time and labor needed to make wood goods. Yet, the integration of CAM in carpentry-related production is rarely explored. To streamline the manufacturing process, this research investigates the equipment and technology currently used in the Ghanaian carpentry sector for automated fabrication. The research looks at the various CAM technologies, such as computer numerical control (CNC) routers, laser cutters, and plasma cutters, that are accessible to Ghanaian carpenters yet remain unexplored, and investigates their potential to enhance the production process. To achieve this objective, 150 carpenters, 15 software engineers, and 10 policymakers were interviewed using structured questionnaires. The responses of the 175 respondents were processed to eliminate outliers, and omissions were corrected using multiple imputation techniques. The processed responses were then analyzed through thematic analysis. The findings showed that the adaptation and integration of CAD software with CAM technologies would speed up the design-to-manufacturing process for carpenters. It must be noted that achieving such results entails, first, examining the capabilities of current CAD software and then determining what new functions and resources are required to improve the software's suitability for carpentry tasks.
Responses from both carpenters and computer scientists showed that it is highly practical and achievable to streamline the design-to-manufacturing process by modifying and combining CAD software with CAM technology. Making such an integration program more useful for carpentry projects would necessitate identifying the additional features and tools required in the Ghanaian ecosystem. In conclusion, the Ghanaian carpentry sector has a chance to increase productivity and competitiveness through the integration of CAM technology with CAD software. Carpentry companies may lower labor costs and boost production capacity by automating the fabrication process, giving them a competitive advantage. This study offers implementation-ready and representative recommendations for successful adoption, as well as important insights into the equipment and technologies available for automated fabrication in the Ghanaian carpentry sector.

Keywords: carpentry, computer-aided manufacturing (CAM), Ghana, information technology (IT)

Procedia PDF Downloads 72
440 Numerical Simulation of Seismic Process Accompanying the Formation of Shear-Type Fault Zone in Chuya-Kuray Depressions

Authors: Mikhail O. Eremin

Abstract:

Seismic activity around the world is clearly a threat to people's lives, as well as to infrastructure and capital construction. It is the instability of the latter under powerful earthquakes that most often causes human casualties. Therefore, the risks of large-scale natural disasters must be taken into account during construction, and assessing these risks is one of the most urgent tasks at present. The final goal of any study of earthquakes is forecasting. This is especially important for seismically active regions of the planet where earthquakes occur frequently; Gorny Altai is one such region. In this work, we developed a physical-mathematical model of the stress-strain state evolution of a loaded geomedium, with the purpose of numerically simulating the seismic process accompanying the formation of the Chuya-Kuray fault zone, Gorny Altai, Russia. We built a structural model on the basis of seismotectonic and paleoseismogeological investigations, as well as SRTM data. The basis of the mathematical model is the system of equations of solid mechanics, which includes the fundamental conservation laws and constitutive equations for elastic (Hooke's law) and inelastic deformation (a modified Drucker-Prager-Nikolaevskii model). The initial stress state of the model corresponds to the gravitational one. We then simulate the activation of a buried dextral strike-slip paleo-fault located in the basement of the model and obtain the stages of formation and the structure of the Chuya-Kuray fault zone. It is shown that the results of the numerical simulation are in good agreement with field observations in a statistical sense. The simulated seismic process is strongly bound to the faults, i.e., lineaments with a high degree of inelastic strain localization. The fault zone represents an en-echelon system of dextral strike-slips according to the Riedel model. The system of surface lineaments is represented by R- and R'-shear bands, X- and Y-shears, and T-fractures.
The simulated seismic process obeys the Gutenberg-Richter and Omori laws; thus, the model describes the self-similar character of deformation and fracture of rocks and geomedia. We also modified the algorithm for determining separate slip events in the model, owing to the features of the strain-rate dependence on time.
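One standard way to check that a simulated catalog obeys the Gutenberg-Richter law is to estimate the b-value with the Aki maximum-likelihood formula, b = log10(e) / (mean(M) - Mc). The sketch below applies it to a synthetic magnitude catalog drawn with a known b-value; it is an illustration of the check, not the paper's actual simulation output.

```python
import math
import random

random.seed(0)

# Synthetic catalog: above a completeness magnitude Mc, Gutenberg-Richter
# implies magnitudes are exponentially distributed with rate b*ln(10).
b_true, Mc = 1.0, 2.0
beta = b_true * math.log(10)
mags = [Mc + random.expovariate(beta) for _ in range(10000)]

# Aki maximum-likelihood estimate of the b-value from the catalog.
b_hat = math.log10(math.e) / (sum(mags) / len(mags) - Mc)
print(round(b_hat, 2))  # should recover a value close to b_true = 1.0
```

With 10,000 events the estimate recovers b to within about one percent, which is the kind of statistical agreement the abstract refers to.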

Keywords: Drucker-Prager model, fault zone, numerical simulation, Riedel bands, seismic process, strike-slip fault

Procedia PDF Downloads 125
439 Evolution of Web Development Progress in Modern Information Technology

Authors: Abdul Basit Kiani

Abstract:

Web development, the art of creating and maintaining websites, has witnessed remarkable advancements. The aim is to provide an overview of some of the cutting-edge developments in the field. Firstly, the rise of responsive web design has revolutionized user experiences across devices. With the increasing prevalence of smartphones and tablets, web developers have adapted to ensure seamless browsing experiences, regardless of screen size. This progress has greatly enhanced accessibility and usability, catering to the diverse needs of users worldwide. Additionally, the evolution of web frameworks and libraries has significantly streamlined the development process. Tools such as React, Angular, and Vue.js have empowered developers to build dynamic and interactive web applications with ease. These frameworks not only enhance efficiency but also bolster scalability, allowing for the creation of complex and feature-rich web solutions. Furthermore, the emergence of progressive web applications (PWAs) has bridged the gap between native mobile apps and web development. PWAs leverage modern web technologies to deliver app-like experiences, including offline functionality, push notifications, and seamless installation. This innovation has transformed the way users interact with websites, blurring the boundaries between traditional web and mobile applications. Moreover, the integration of artificial intelligence (AI) and machine learning (ML) has opened new horizons in web development. Chatbots, intelligent recommendation systems, and personalization algorithms have become integral components of modern websites. These AI-powered features enhance user engagement, provide personalized experiences, and streamline customer support processes, revolutionizing the way businesses interact with their audiences. Lastly, the emphasis on web security and privacy has been a pivotal area of progress. 
With the increasing incidents of cyber threats, web developers have implemented robust security measures to safeguard user data and ensure secure transactions. Innovations such as HTTPS protocol, two-factor authentication, and advanced encryption techniques have bolstered the overall security of web applications, fostering trust and confidence among users. Hence, recent progress in web development has propelled the industry forward, enabling developers to craft innovative and immersive digital experiences. From responsive design to AI integration and enhanced security, the landscape of web development continues to evolve, promising a future filled with endless possibilities.

Keywords: progressive web applications (PWAs), web security, machine learning (ML), web frameworks, responsive web design

Procedia PDF Downloads 39
438 Biosensor for Determination of Immunoglobulin A, E, G and M

Authors: Umut Kokbas, Mustafa Nisari

Abstract:

Immunoglobulins, also known as antibodies, are glycoprotein molecules produced by plasma cells, which develop from activated B cells. Antibodies are critical molecules of the immune response, helping the immune system specifically recognize and destroy antigens such as bacteria, viruses, and toxins. Immunoglobulin classes differ in their biological properties, structures, targets, functions, and distributions. Five major classes of antibodies have been identified in mammals: IgA, IgD, IgE, IgG, and IgM. Evaluation of the immunoglobulin isotype can provide useful insight into the complex humoral immune response. Evaluation and knowledge of immunoglobulin structure and classes are also important for the selection and preparation of antibodies for immunoassays and other detection applications. The immunoglobulin test measures the level of certain immunoglobulins in the blood. IgA, IgG, and IgM are usually measured together and can thus provide doctors with important information, especially regarding immune deficiency diseases. Hypogammaglobulinemia (HGG) is one of the main groups of primary immunodeficiency disorders. HGG is caused by various defects in the B cell lineage or function that result in low levels of immunoglobulins in the bloodstream. This affects the body's immune response, causing a wide range of clinical features, from asymptomatic disease to severe and recurrent infections, chronic inflammation, and autoimmunity. Transient infant hypogammaglobulinemia (THGI), IgM deficiency (IgMD), Bruton agammaglobulinemia, and IgA deficiency (SIgAD) are a few examples of HGG. Most patients can continue their normal lives by taking prophylactic antibiotics; however, patients with severe infections require intravenous immune serum globulin (IVIG) therapy. The IgE level may rise to fight off parasitic infections, and may also be a sign that the body is overreacting to allergens.
Also, since the immune response can vary with different antigens, measuring specific antibody levels aids in interpreting the immune response after immunization or vaccination. Immune deficiencies usually appear in childhood. In immunology and allergy clinics, a method that is fast and reliable, and that allows more convenient and uncomplicated sampling from children, will be more useful than the classical methods for the diagnosis and follow-up of diseases, especially childhood hypogammaglobulinemia. In this study, the antibodies were attached to the electrode surface via a poly(hydroxyethyl methacrylamide)-cysteine nanopolymer, and the anodic peaks obtained in the electrochemical study were evaluated. According to the data obtained, immunoglobulin determination can be performed with a biosensor. In further studies, however, it will be useful to develop a medical diagnostic kit through biomedical engineering and to increase its sensitivity.

Keywords: biosensor, immunosensor, immunoglobulin, infection

Procedia PDF Downloads 75
437 Research on the Evolution of Public Space in Tourism-Oriented Traditional Rural Settlements

Authors: Yu Zhang, Mingxue Lang, Li Dong

Abstract:

The centuries-long, slow succession of the living environment in rural areas is a crucial carrier of China's long history of culture and national wisdom. In recent years, the spatial evolution of traditional rural settlements has been promoted by the intervention of tourism development; public architecture and outdoor activity areas, which together serve as the major places for villagers' and tourists' social activities, are an important characterization of this spatial evolution. Upgrading traditional public space and studying the layout of new public space can effectively promote the tourism industry development of traditional rural settlements. This article takes Qi County, a China Traditional Culture Village, as an example and uses the technologies of Remote Sensing (RS), Geographic Information Systems (GIS), and space syntax to study the evolution of the public space of tourism-oriented traditional rural settlements in four steps. First, acquire the 2003 and 2016 image data of Qi County using the remote sensing application ERDAS 8.6. Second, vectorize the basic maps of Qi County, including its land use map, with ArcGIS 9.3, associating them with the architectural and site information gathered from field research. Third, analyze the accessibility and connectivity of the inner space of the settlements using space syntax, and run a cross-correlation with the public space data of 2003 and 2016. Finally, summarize the evolution laws of the public space of the settlements, and study the upgrade pattern of traditional public space and the location plan for new public space. The major findings of this paper include: first, the location layout of traditional public space is strongly associated with the calculation results of space syntax, which further confirms the objective value of space syntax in expressing spatial and social relations.
Second, the intervention of tourism development has a remarkable impact on the public space locations of traditional rural settlements. Third, traditional public space shows signs of both strengthening and decline, forming a diversified upgrade pattern to meet different tourism functional needs. Finally, space syntax provides an objective basis for the location planning of new public space that meets the needs of tourism services. Tourism development has a significant impact on the evolution of the public space of traditional rural settlements. Both types of public space, architecture and sites, have changed in quantity, location, dimension, and function after the intervention of tourism development. Function upgrades of traditional public space and the scientific layout of new public space are two important ways of achieving the sustainable development of tourism-oriented traditional rural settlements.
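At their simplest, the space syntax measures used in the third step reduce to graph quantities on the axial map: connectivity is the degree of a node, and integration (accessibility) is inversely related to the mean topological depth from that node to all others. The sketch below computes both for a hypothetical five-space toy layout; the node names and adjacency are invented for illustration and are not the study's Qi County data.

```python
from collections import deque

# Toy axial map: nodes are convex spaces / axial lines, edges are
# direct connections (hypothetical settlement layout).
graph = {
    "square": ["lane1", "lane2", "temple"],
    "lane1": ["square", "house_row"],
    "lane2": ["square", "house_row"],
    "temple": ["square"],
    "house_row": ["lane1", "lane2"],
}

def mean_depth(graph, start):
    """Average topological distance from `start` to every other space (BFS)."""
    dist = {start: 0}
    q = deque([start])
    while q:
        node = q.popleft()
        for nb in graph[node]:
            if nb not in dist:
                dist[nb] = dist[node] + 1
                q.append(nb)
    others = [d for n, d in dist.items() if n != start]
    return sum(others) / len(others)

# Connectivity = node degree; a lower mean depth means higher integration.
connectivity = {n: len(nbs) for n, nbs in graph.items()}
print(connectivity["square"], mean_depth(graph, "square"))
```

In this toy layout the central square has the highest connectivity (3) and the lowest mean depth (1.25), which is the space-syntax signature of a well-integrated public space.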

Keywords: public space evolution, Qi County, space syntax, tourism-oriented, traditional rural settlements

Procedia PDF Downloads 324
436 Evaluating the Service Quality and Customers’ Satisfaction for Lihpaoland in Taiwan

Authors: Wan-Yu Liu, Tiffany April Lin, Yu-Chieh Tang, Yi-Lin Wang, Chieh-Hui Li

Abstract:

As the national income in Taiwan has risen, the lifestyle of the public has also changed, and the tourism industry has gradually moved from a service industry to an experience economy. LihpaoLand is one of the most popular theme parks in Taiwan. However, related work on the service quality of the park has been lacking since its re-opening in 2012. Therefore, this study investigates the quality of the software/hardware facilities and services of LihpaoLand and aims to achieve the following three goals: 1) analyze how various sample data of tourists lead to different results for the service quality of LihpaoLand; 2) analyze how tourists respond to the service tangibility, service reliability, service responsiveness, service guarantee, and service empathy of LihpaoLand; 3) according to the theoretical and empirical results, propose how to improve the overall facilities and services of LihpaoLand, hoping to provide suggestions to LihpaoLand or other related businesses for decision making. The survey was conducted on tourists to LihpaoLand using convenience sampling, and 400 questionnaires were collected successfully. The analysis results show that tourists paid much attention to the maintenance of amusement facilities and the safety of the park and were satisfied with them, which are great advantages of the park. However, transportation around LihpaoLand was inadequate, and the price of the Fullon hotel (the hotel closest to LihpaoLand) was not accepted by tourists; more promotional events are recommended. Additionally, the shows are not diversified and should be improved with the highest priority. Tourists did not pay attention to the service personnel's clothing or the ticket price, but they were not satisfied with them. Hence, this study recommends designing more distinctive costumes and conducting ticket promotions.
Accordingly, the suggestions made in this study for LihpaoLand are as follows: 1) Diversified amusement facilities should be provided to satisfy needs at different ages. 2) Cheap but tasty catering and more distinctive souvenirs should be offered. 3) Diversified propaganda schemes should be strengthened to increase the number of tourists. 4) The quality and professionalism of the service staff should be enhanced to acquire public praise and encourage tourists to revisit. 5) Ticket promotions in peak seasons, low seasons, and special events should be conducted. 6) Proper traffic flows should be planned and combined with technologies to reduce tourists' waiting time. 7) The features of the theme landscape in LihpaoLand should be strengthened to increase the willingness of tourists with special preferences to visit the park. 8) Ticket discounts or premier points card promotions should be adopted to reward tourists with high loyalty.

Keywords: service quality, customers’ satisfaction, theme park, Taiwan

Procedia PDF Downloads 452
435 Application of Metaverse Service to Construct Nursing Education Theory and Platform in the Post-pandemic Era

Authors: Chen-Jung Chen, Yi-Chang Chen

Abstract:

While traditional virtual reality and augmented reality allow only small-movement learning and cannot provide a truly immersive teaching experience with the illusion of movement, the metaverse's new technologies for both content creation and immersive interactive simulation can come infinitely close to a natural teaching situation. However, the theory of the metaverse's mixed-reality virtual classroom has not yet been explored, and it is rarely implemented in the situational simulation teaching of nursing education. Therefore, in the first year, the study will use grounded theory and case study methods, with in-depth interviews of nursing education and information experts, and analyze the interview data to investigate the uniqueness of metaverse development. The proposed analysis will lead to alternative theories and methods for the development of nursing education. In the second year, the study plans to integrate metaverse virtual situation simulation technology into an alternative teaching strategy in a pediatric nursing technology course and to explore how nursing students use this teaching method to construct personal skills and experience, leveraging the unique features of distinct teaching platforms and development processes to deliver alternative teaching strategies in a nursing technology teaching environment. The aim is to increase learning achievements without compromising teaching quality and teacher-student relationships in the post-pandemic era. A descriptive and convergent mixed-methods design will be employed. Sixty third-year nursing students will be recruited to participate in the research and complete the pre-test.
The students in the experimental group (N=30) will participate in 4 real-time mixed virtual situation simulation courses as self-practice after class, with qualitative interviews conducted after every 2 virtual situation courses; the control group (N=30) will adopt traditional self-learning practice methods after class. Both groups of students will take a post-test after the course. Data analysis will adopt descriptive statistics, paired t-tests, one-way analysis of variance, and qualitative content analysis. This study addresses key issues in the virtual reality environment for teaching and learning within the metaverse, providing valuable lessons and insights for enhancing the quality of education. The findings of this study are expected to contribute useful information for the future development of digital teaching and learning in nursing and other practice-based disciplines.
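The paired t-test named in the analysis plan compares each student's pre- and post-test scores within one group, testing whether the mean difference differs from zero. The minimal sketch below uses the textbook statistic t = mean(d) / (sd(d) / sqrt(n)) on invented scores; it is a generic illustration, not the study's data or analysis code.

```python
import math

# Pre- and post-test scores for one group (made-up numbers for illustration).
pre  = [62, 70, 55, 68, 74, 60, 66, 71]
post = [70, 75, 60, 72, 80, 63, 70, 78]

# Paired differences and the paired t statistic with n-1 degrees of freedom.
d = [b - a for a, b in zip(pre, post)]
n = len(d)
mean_d = sum(d) / n
sd_d = math.sqrt(sum((x - mean_d) ** 2 for x in d) / (n - 1))
t = mean_d / (sd_d / math.sqrt(n))
print(round(t, 2))
```

A large t (here compared against the t distribution with n-1 = 7 degrees of freedom) indicates a significant pre/post change; in practice one would use a statistics package, which also returns the p-value.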

Keywords: metaverse, post-pandemic era, online virtual classroom, immersive teaching

Procedia PDF Downloads 45
434 The Neoliberal Social-Economic Development and Values in the Baltic States

Authors: Daiva Skuciene

Abstract:

The Baltic States turned to the free market and capitalism after independence. A new socioeconomic system, democracy, and priorities concerning the welfare of citizens were formed. Research shows that the Baltic States chose a neoliberal development path. Related to this neoliberal path, a few questions arise: how do people evaluate the results of such policy and socioeconomic development? What are their priorities? And what are the values of the Baltic societies that support neoliberal policy? The purpose of this research is to analyze the socioeconomic context and the priorities and values of the Baltic societies related to the neoliberal regime. The main objectives are: first, to analyze the neoliberal socioeconomic features and results; second, to analyze people's opinions and priorities regarding the results of neoliberal development; third, to analyze the values of the Baltic societies related to neoliberal policy. To implement this purpose and these objectives, a comparative analysis among European countries is used. The neoliberal regime was defined through two indicators: taxes on capital income and expenditures on social protection. The socioeconomic outcomes of the neoliberal welfare regime are defined through the Gini inequality index and the at-risk-of-poverty rate. For this analysis, Eurostat data for 2002-2013 were used. For the analysis of opinions about inequality, preferences about the society people want to live in, and preferences for the distribution between capital and wages in enterprises, Eurobarometer data for 2010-2014 and data from a representative survey in the Baltic States in 2016 were used. The justice variable was selected as a variable reflecting the evaluation of the socioeconomic context and was analyzed using Eurobarometer data for 2006-2015. For the analysis of values, solidarity, equality, and individual responsibility were selected. Solidarity and equality were analyzed using Eurobarometer data for 2006-2015.
The value “individual responsibility” was examined by opinions about reasons of inequality and poverty. The survey of population in the Baltic States in 2016 and data of Eurobarometer were used for this aim. The data are ranged in descending order for understanding the position of opinion of people in the Baltic States among European countries. The dynamics of indicators is also provided to examine stability of values. The main findings of the research are that people in the Baltics are dissatisfied with the results of the neoliberal socioeconomic development, they have priorities for equality and justice, but they have internalized the main neoliberal narrative- individual responsibility. The impact of socioeconomic context on values is huge, resulting in a change in quite stable opinions and values during the period of the financial crisis.

Keywords: neoliberal, inequality and poverty, solidarity, individual responsibility

Procedia PDF Downloads 238
433 Predicting Resistance of Commonly Used Antimicrobials in Urinary Tract Infections: A Decision Tree Analysis

Authors: Meera Tandan, Mohan Timilsina, Martin Cormican, Akke Vellinga

Abstract:

Background: In general practice, many infections are treated empirically without microbiological confirmation. Understanding the susceptibility of antimicrobials during empirical prescribing can help reduce inappropriate prescribing. This study aims to apply a prediction model using a decision tree approach to predict the antimicrobial resistance (AMR) of urinary tract infections (UTI) based on non-clinical features of patients over 65 years. Decision tree models offer a novel way to predict AMR at an early stage. Method: Data were extracted from the database of the microbiological laboratory of the University Hospitals Galway on all antimicrobial susceptibility testing (AST) of urine specimens from patients over the age of 65 from January 2011 to December 2014. The primary endpoint was resistance to common antimicrobials (nitrofurantoin, trimethoprim, ciprofloxacin, co-amoxiclav, and amoxicillin) used to treat UTI. A classification and regression tree (CART) model was generated with the outcome 'resistant infection'. The importance of each predictor (the number of previous samples, age, gender, location (nursing home, hospital, community), and causative agent) on antimicrobial resistance was estimated. Sensitivity, specificity, negative predictive value (NPV), and positive predictive value (PPV) were used to evaluate the performance of the model. Seventy-five percent (75%) of the data were used as a training set, and validation of the model was performed with the remaining 25% of the dataset. Results: A total of 9805 UTI patients over 65 years had a urine sample submitted for AST at least once over the four years. E. coli, Klebsiella, and Proteus species were the most commonly identified pathogens among UTI patients without a catheter, whereas Serratia, Staphylococcus aureus, and Enterobacter were common in patients with a catheter.
The validated CART model shows slight differences in sensitivity, specificity, PPV, and NPV between the models with and without the causative organism. The sensitivity, specificity, PPV, and NPV for the model with non-clinical predictors were between 74% and 88%, depending on the antimicrobial. Conclusion: The CART models developed using non-clinical predictors perform well when predicting antimicrobial resistance. These models predict which antimicrobial may be the most appropriate based on non-clinical factors. Additional CART models, prospective data collection and validation, and a larger number of non-clinical factors would further improve model performance. The presented model provides an alternative approach to decision making on antimicrobial prescribing for UTIs in older patients.
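The performance measures used to evaluate the model (sensitivity, specificity, PPV, NPV) follow directly from confusion-matrix counts. A minimal sketch, with made-up counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Illustrative counts for a hypothetical 25% hold-out validation set
sens, spec, ppv, npv = diagnostic_metrics(tp=80, fp=20, tn=180, fn=20)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} ppv={ppv:.2f} npv={npv:.2f}")
```

In the study these four values would be computed separately per antimicrobial, for the models with and without the causative organism.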

Keywords: antimicrobial resistance, urinary tract infection, prediction, decision tree

Procedia PDF Downloads 239
432 Contribution at Dimensioning of the Energy Dissipation Basin

Authors: M. Aouimeur

Abstract:

The environmental risks of a dam, and particularly the security of the valley downstream of it, constitute a very complex problem. Integrated management and risk-sharing are becoming more and more indispensable. The definition of the 'vulnerability' concept can help in assessing the efficiency of protective measures and in characterizing each valley with respect to flood risk. Security can be enhanced through integrated land management. The social sciences may be associated with the operational systems of civil protection, in particular warning networks. The passage of extreme floods at the dam site can cause the rupture of the structure and important damage downstream of the dam. The river bed can be damaged by erosion if it is not well protected, and scouring and flooding problems may be encountered in the area downstream of the dam. Therefore, the protection of the dam is crucial: it must have an energy dissipator in a specific place. The dissipation basin plays a very important role in the security of the dam and the protection of the environment against floods downstream. It dissipates the potential energy created by the dam as the extreme flood passes over the weir, regulates in a natural manner and with more security the discharge, or the elevation of the water surface at the crest of the weir, and reduces the flow velocity downstream of the dam to match that of the river bed. The problem in dimensioning a classic dissipation basin lies in determining the parameters necessary for sizing this structure. This communication presents a simple graphical method, fast and complete, and a methodology for determining the main features of the hydraulic jump, the parameters necessary for sizing the classic dissipation basin.
This graphical method takes into account the constraints imposed by the reality of the terrain or by practice, such as those related to the topography of the site, the preservation of environmental equilibrium, and technical and economic considerations. The methodology imposes the head loss DH dissipated by the hydraulic jump as a hypothesis (free design) in order to determine all the other parameters of the classic dissipation basin. The head loss DH dissipated by the hydraulic jump can be set equal to a selected value or to a certain percentage of the upstream total head created by the dam. With the parameter DH+ = DH/k (k: critical depth), the elaborated graphical representation allows the other parameters to be found; multiplying these parameters by k gives the main characteristics of the hydraulic jump, the parameters necessary for dimensioning the classic dissipation basin. This solution is often preferred for sizing the dissipation basins of small concrete dams. Verification of the results and their comparison with practical data confirm the validity and reliability of the elaborated graphical method.
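The hydraulic-jump quantities underlying such dimensioning follow from standard open-channel relations: the Belanger conjugate-depth equation and the head-loss formula, here nondimensionalized by the critical depth k as with the DH+ = DH/k parameter above. A sketch with illustrative numerical inputs (not taken from the communication):

```python
import math

def conjugate_depth(h1, fr1):
    """Downstream (sequent) depth of a hydraulic jump from the upstream
    depth h1 and upstream Froude number Fr1 (Belanger equation)."""
    return h1 / 2.0 * (math.sqrt(1.0 + 8.0 * fr1**2) - 1.0)

def head_loss(h1, h2):
    """Energy head DH dissipated across the jump (rectangular channel)."""
    return (h2 - h1) ** 3 / (4.0 * h1 * h2)

def critical_depth(q, g=9.81):
    """Critical depth k for unit discharge q (m^2/s) in a rectangular channel."""
    return (q**2 / g) ** (1.0 / 3.0)

# Illustrative example: upstream depth 0.5 m, Fr1 = 4, unit discharge 3.5 m^2/s
h1, fr1, q = 0.5, 4.0, 3.5
h2 = conjugate_depth(h1, fr1)
dh = head_loss(h1, h2)
k = critical_depth(q)
print(f"h2 = {h2:.3f} m, DH = {dh:.3f} m, DH+ = DH/k = {dh / k:.3f}")
```

Working the other way, as in the methodology above, one would fix DH+ first and read the remaining jump characteristics from the graphical representation.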

Keywords: dimensioning, energy dissipation basin, hydraulic jump, protection of the environment

Procedia PDF Downloads 566
431 Designing Presentational Writing Assessments for the Advanced Placement World Language and Culture Exams

Authors: Mette Pedersen

Abstract:

This paper outlines the criteria that assessment specialists use when they design the 'Persuasive Essay' task for the four Advanced Placement World Language and Culture Exams (AP French, German, Italian, and Spanish). The 'Persuasive Essay' is a free-response, source-based, standardized measure of presentational writing. Each 'Persuasive Essay' item consists of three sources (an article, a chart, and an audio recording) and a prompt, which is a statement of the topic phrased as an interrogative sentence. Due to its richness of source materials and the amount of time that test takers are given to prepare for and write their responses (a total of 55 minutes), the 'Persuasive Essay' is the free-response task on the AP World Language and Culture Exams that goes to the greatest lengths to unleash the test takers' proficiency potential. The author focuses on the work that goes into designing the 'Persuasive Essay' task, outlining best practices for the selection of topics and sources, the interplay that needs to be present among the sources, and the thinking behind the articulation of prompts for the 'Persuasive Essay' task. Using released 'Persuasive Essay' items from the AP World Language and Culture Exams and accompanying data on test taker performance, the author shows how different passages, and features of passages, have succeeded (and sometimes not succeeded) in eliciting writing proficiency among test takers over time. Data from approximately 215,000 test takers per year from 2014 to 2017 and approximately 35,000 test takers per year from 2012 to 2013 form the basis of this analysis. The conclusion of the study is that test taker performance improves significantly when the sources that test takers are presented with express directly opposing viewpoints. Test taker performance also improves when the interrogative prompt that the test takers respond to is phrased as a yes/no question.
Finally, an analysis of the linguistic difficulty and complexity levels of the printed sources reveals that test taker performance does not decrease when the complexity level of the article in the 'Persuasive Essay' increases. This last text complexity analysis is performed with the help of the 'ETS TextEvaluator' tool and the 'Complexity Scale for Information Texts (Scale)', two tools that, in combination, provide a rubric and a fully automated technology for evaluating nonfiction and informational texts in English translation.

Keywords: advanced placement world language and culture exams, designing presentational writing assessments, large-scale standardized assessments of written language proficiency, source-based language testing

Procedia PDF Downloads 121
430 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques

Authors: Stefan K. Behfar

Abstract:

The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. 
Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
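The probabilistic sampling idea described above can be sketched minimally: include each transaction independently with probability p and scale the sampled aggregate by 1/p (a Horvitz-Thompson style estimator). The values below are synthetic stand-ins, not Ethereum data:

```python
import random

def sampled_sum_estimate(values, p, rng):
    """Estimate sum(values) from a probabilistic sample: include each
    element independently with probability p, then scale by 1/p."""
    picked = [v for v in values if rng.random() < p]
    return sum(picked) / p

rng = random.Random(42)
# Synthetic per-transaction statistic, standing in for blockchain data
values = [rng.randint(1, 1000) for _ in range(100_000)]
true_total = sum(values)
estimate = sampled_sum_estimate(values, p=0.05, rng=rng)
print(f"true={true_total} estimate={estimate:.0f} "
      f"relative error={(estimate - true_total) / true_total:+.2%}")
```

With a 5% sampling fraction, the estimate remains statistically close to the true aggregate while only a twentieth of the data is touched, which is the trade-off the sampling technique exploits at blockchain scale.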

Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing

Procedia PDF Downloads 53
429 Connecting MRI Physics to Glioma Microenvironment: Comparing Simulated T2-Weighted MRI Models of Fixed and Expanding Extracellular Space

Authors: Pamela R. Jackson, Andrea Hawkins-Daarud, Cassandra R. Rickertsen, Kamala Clark-Swanson, Scott A. Whitmire, Kristin R. Swanson

Abstract:

Glioblastoma Multiforme (GBM), the most common primary brain tumor, often presents with hyperintensity on T2-weighted or T2-weighted fluid attenuated inversion recovery (T2/FLAIR) magnetic resonance imaging (MRI). This hyperintensity corresponds with vasogenic edema; however, there are likely many infiltrating tumor cells within the hyperintense region as well. While MRIs do not directly indicate tumor cells, they do reflect the microenvironmental water abnormalities caused by the presence of tumor cells and edema. The inherent heterogeneity and resulting MRI features of GBMs complicate assessing disease response. To understand how hyperintensity on T2/FLAIR MRI may correlate with edema in the extracellular space (ECS), we explored a multi-compartmental MRI signal equation that takes into account tissue compartments and their associated volumes, with input coming from a mathematical model of glioma growth that incorporates edema formation. The reasonableness of two possible extracellular space schemes was evaluated by varying the T2 of the edema compartment and calculating the possible resulting T2s in tumor and peripheral edema. In the mathematical model, gliomas were comprised of vasculature and three tumor cellular phenotypes: normoxic, hypoxic, and necrotic. Edema was characterized as fluid leaking from abnormal tumor vessels. Spatial maps of tumor cell density and edema for virtual tumors were simulated with different rates of proliferation and invasion and various ECS expansion schemes. These spatial maps were then passed into a multi-compartmental MRI signal model for generating simulated T2/FLAIR MR images. Individual compartments' T2 values in the signal equation were either taken from literature or estimated, and the T2 for edema specifically was varied over a wide range (200 ms - 9200 ms). T2 maps were calculated from the simulated images.
T2 values based on simulated images were evaluated for regions of interest (ROIs) in normal-appearing white matter, tumor, and peripheral edema. The ROI T2 values were compared to T2 values reported in literature. The expanding scheme of extracellular space had T2 values similar to the literature values. The static scheme of extracellular space had much lower T2 values, and no matter what T2 was associated with edema, the intensities did not come close to literature values. Expanding the extracellular space is necessary to achieve simulated edema intensities commensurate with acquired MRIs.
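The multi-compartmental signal equation described above can be illustrated with a minimal sketch: the voxel signal is modeled as a volume-fraction-weighted sum of mono-exponential T2 decays. The compartment fractions and T2 values below are made up for illustration and are not the study's parameters:

```python
import math

def t2_signal(te, compartments):
    """Multi-compartment T2-weighted signal at echo time te (ms):
    a volume-fraction-weighted sum of mono-exponential decays.
    compartments: list of (volume_fraction, t2_ms) pairs summing to 1."""
    assert abs(sum(v for v, _ in compartments) - 1.0) < 1e-9
    return sum(v * math.exp(-te / t2) for v, t2 in compartments)

# Hypothetical voxel: tumor cells, normal tissue, and long-T2 edema fluid
voxel = [(0.3, 80.0), (0.3, 100.0), (0.4, 1000.0)]  # (fraction, T2 in ms)
for te in (50, 100, 150):
    print(f"TE={te} ms -> signal={t2_signal(te, voxel):.3f}")
```

Expanding the ECS in such a model amounts to increasing the volume fraction of the long-T2 edema compartment, which raises the apparent T2 of the voxel, consistent with the expanding-scheme result above.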

Keywords: extracellular space, glioblastoma multiforme, magnetic resonance imaging, mathematical modeling

Procedia PDF Downloads 221
428 Data-Driven Surrogate Models for Damage Prediction of Steel Liquid Storage Tanks under Seismic Hazard

Authors: Laura Micheli, Majd Hijazi, Mahmoud Faytarouni

Abstract:

The damage reported by oil and gas industrial facilities has revealed the acute vulnerability of steel liquid storage tanks to seismic events. The failure of steel storage tanks may have devastating and long-lasting consequences for built and natural environments, including the release of hazardous substances, uncontrolled fires, and soil contamination with hazardous materials. It is, therefore, fundamental to reliably predict the damage that steel liquid storage tanks are likely to experience under future seismic hazard events. The seismic performance of steel liquid storage tanks is usually assessed using vulnerability curves obtained from the numerical simulation of a tank under different hazard scenarios. However, the computational demand of high-fidelity numerical simulation models, such as finite element models, makes the vulnerability assessment of liquid storage tanks time-consuming and often impractical. As a solution, this paper presents a surrogate model-based strategy for predicting seismic-induced damage in steel liquid storage tanks. In the proposed strategy, the surrogate model is leveraged to reduce the computational demand of time-consuming numerical simulations. To create the data set for training the surrogate model, field damage data from past earthquake reconnaissance surveys and reports are collected. Features representative of steel liquid storage tank characteristics (e.g., diameter, height, liquid level, yield stress) and seismic excitation parameters (e.g., peak ground acceleration, magnitude) are extracted from the field damage data. The collected data are then utilized to train a data-driven surrogate model that maps the relationship between tank characteristics, seismic hazard parameters, and seismic-induced damage. Different types of surrogate algorithms, including naïve Bayes, k-nearest neighbors, decision tree, and random forest, are investigated, and results in terms of accuracy are reported.
The model that yields the most accurate predictions is employed to predict future damage as a function of tank characteristics and seismic hazard intensity level. Results show that the proposed approach can be used to estimate the extent of damage in steel liquid storage tanks and that data-driven surrogates represent a viable alternative to computationally expensive numerical simulation models.
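As a hedged illustration of the surrogate idea, not the study's data or trained models, a toy k-nearest-neighbors classifier can map normalized tank features to a binary damage class. The feature ranges and the damage rule below are invented stand-ins, not a calibrated fragility model:

```python
import math
import random

def knn_predict(train_x, train_y, x, k=5):
    """Majority vote of the k nearest training points (Euclidean distance)."""
    nearest = sorted(range(len(train_x)),
                     key=lambda i: math.dist(train_x[i], x))[:k]
    votes = [train_y[i] for i in nearest]
    return max(set(votes), key=votes.count)

rng = random.Random(0)
# Synthetic, normalized features (diameter, fill ratio, PGA) and a
# stand-in damage rule: strong shaking of a nearly full tank -> damage
features = [(rng.random(), rng.random(), rng.random()) for _ in range(300)]
labels = [1 if fill * pga > 0.35 else 0 for _, fill, pga in features]
train_x, train_y = features[:250], labels[:250]
test_x, test_y = features[250:], labels[250:]
accuracy = sum(knn_predict(train_x, train_y, x) == y
               for x, y in zip(test_x, test_y)) / len(test_x)
print(f"hold-out accuracy: {accuracy:.2f}")
```

In the paper's setting, the same train/evaluate loop would be repeated over naïve Bayes, decision tree, and random forest surrogates, and the most accurate one retained for prediction.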

Keywords: damage prediction, data-driven model, seismic performance, steel liquid storage tanks, surrogate model

Procedia PDF Downloads 131
427 Cloud-Based Multiresolution Geodata Cube for Efficient Raster Data Visualization and Analysis

Authors: Lassi Lehto, Jaakko Kahkonen, Juha Oksanen, Tapani Sarjakoski

Abstract:

The use of raster-formatted data sets in geospatial analysis is increasing rapidly. At the same time, geographic data are being introduced into disciplines outside the traditional domain of geoinformatics, such as climate change, intelligent transport, and immigration studies. These developments call for better methods of delivering raster geodata in an efficient and easy-to-use manner. Data cube technologies have traditionally been used in the geospatial domain for managing Earth Observation data sets, which have strict requirements for effective handling of time series. The same approach and methodologies can also be applied to managing other types of geospatial data sets. A cloud service-based geodata cube, called GeoCubes Finland, has been developed to support online delivery and analysis of the most important geospatial data sets with national coverage. The main target group of the service is the academic research institutes in the country. The most significant aspects of the GeoCubes data repository include the use of multiple resolution levels, a cloud-optimized file structure, and a customized, flexible content access API. Input data sets are pre-processed while being ingested into the repository to bring them into a harmonized form in aspects like georeferencing, sampling resolution, spatial subdivision, and value encoding. All resolution levels are created using an appropriate generalization method, selected according to the nature of the source data set. Multiple pre-processed resolutions enable new kinds of online analysis approaches. Analysis processes based on interactive visual exploration can be carried out effectively, as the resolution level closest to the visual scale can always be used. In the same way, statistical analysis can be carried out on the resolution levels that best reflect the scale of the phenomenon being studied. Access times remain close to constant, independent of the scale applied in the application.
The cloud service-based approach applied in the GeoCubes Finland repository enables analysis operations to be performed on the server platform, thus making high-performance computing facilities easily accessible. The developed GeoCubes API supports this kind of online analysis. The use of cloud-optimized file structures in data storage enables fast extraction of subareas. The access API allows vector-formatted administrative areas and user-defined polygons to be used as definitions of subareas for data retrieval. Administrative areas of the country at four levels are readily available on the GeoCubes platform. In addition to direct delivery of raster data, the service also supports a so-called virtual file format, in which only a small text file is downloaded first. The text file contains links to the raster content on the service platform. The actual raster data are downloaded on demand, for the spatial area and resolution level required at each stage of the application. Through the geodata cube approach, pre-harmonized geospatial data sets are made accessible to new categories of inexperienced users in an easy-to-use manner. At the same time, the multiresolution nature of the GeoCubes repository enables expert users to introduce new kinds of interactive online analysis operations.
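The principle of serving the resolution level closest to the visual scale can be sketched as follows. The pyramid levels and the DPI figure are hypothetical and do not describe the actual GeoCubes API:

```python
def pick_resolution(target_gsd_m, levels):
    """Pick the pre-computed resolution level (ground sample distance in
    metres per pixel) closest to the target pixel size."""
    return min(levels, key=lambda gsd: abs(gsd - target_gsd_m))

def target_gsd(map_scale, screen_dpi=96):
    """Ground distance covered by one screen pixel at a given map scale."""
    metres_per_inch = 0.0254
    return map_scale * metres_per_inch / screen_dpi

# Hypothetical multiresolution pyramid (metres per pixel)
levels = [1, 2, 5, 10, 20, 50, 100, 200, 500, 1000]
for scale in (10_000, 100_000, 1_000_000):
    gsd = target_gsd(scale)
    print(f"1:{scale} -> target {gsd:.1f} m/px, "
          f"use level {pick_resolution(gsd, levels)} m")
```

Because the served level tracks the display scale, the amount of data read per view stays roughly constant, which is how the near-constant access times described above can be achieved.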

Keywords: cloud service, geodata cube, multiresolution, raster geodata

Procedia PDF Downloads 117
426 A New Measurement for Assessing Constructivist Learning Features in Higher Education: Lifelong Learning in Applied Fields (LLAF) Tempus Project

Authors: Dorit Alt, Nirit Raichel

Abstract:

Although university teaching is claimed to have the special task of supporting students in adopting ways of thinking and producing new knowledge anchored in scientific inquiry practices, it is argued that students' habits of learning are still overwhelmingly skewed toward passive acquisition of knowledge from authority sources rather than from collaborative inquiry activities. This form of instruction is criticized for encouraging students to acquire inert knowledge that can be used in instructional settings at best but cannot be transferred into complex real-life problem settings. In order to overcome this critical mismatch between current educational goals and instructional methods, the LLAF consortium (comprising 16 members from 8 countries) aims to develop updated instructional practices that put a premium on adaptability to the emerging requirements of present society. LLAF has created a practical guide for teachers containing updated pedagogical strategies and assessment tools based on the constructivist approach to learning. This presentation is limited to teacher education and to the contribution of the project in providing a scale designed to measure the extent to which constructivist activities are efficiently applied in the learning environment. A mixed-methods approach was implemented in two phases to construct the scale: The first phase included a qualitative content analysis involving both deductive and inductive category applications of students' observations. The results foregrounded eight categories: knowledge construction, authenticity, multiple perspectives, prior knowledge, in-depth learning, teacher-student interaction, social interaction, and cooperative dialogue. The students' descriptions of their classes were formulated as 36 items. The second phase employed structural equation modeling (SEM).
The scale was administered to 597 undergraduate students. The goodness of fit of the data to the structural model was sufficient. This research extends the body of literature by adding the category of in-depth learning, which emerged from the content analysis. Moreover, the theoretical category of social activity has been extended to include two distinctive factors: cooperative dialogue and social interaction. Implications of these findings for the LLAF project are discussed.

Keywords: constructivist learning, higher education, mix-methodology, structural equation modeling

Procedia PDF Downloads 300
425 On-Chip Ku-Band Bandpass Filter with Compact Size and Wide Stopband

Authors: Jyh Sheen, Yang-Hung Cheng

Abstract:

This paper presents the design of a microstrip bandpass filter with a compact size and wide stopband using a 0.15-μm GaAs pHEMT process. The wide stopband is achieved by suppressing the first and second harmonic resonance frequencies. A slow-wave coupling stepped-impedance resonator with a cross-coupled structure is adopted to design the bandpass filter. A two-resonator filter was fabricated with a 13.5 GHz center frequency, and an 11% bandwidth was achieved. The devices are simulated using the ADS design software. This device has a compact size and a very low insertion loss of 2.6 dB. Microstrip planar bandpass filters have been widely adopted in various communication applications due to the attractive features of compact size and ease of fabrication. Various planar resonator structures have been suggested. In order to achieve a wide stopband and reduce interference outside the passband, various planar resonator designs have also been proposed to suppress the higher-order harmonics of the design center frequency. Various modifications of the traditional hairpin structure have been introduced to reduce the large design area of hairpin designs. The stepped-impedance, slow-wave open-loop, and cross-coupled resonator structures have been studied to miniaturize hairpin resonators. In this study, to suppress the spurious harmonic bands and further reduce the filter size, a modified hairpin-line bandpass filter with a cross-coupled structure is proposed, introducing the stepped-impedance resonator design as well as the slow-wave open-loop resonator structure. In this way, a very compact circuit size as well as a very wide upper stopband can be achieved and realized on a Rogers 4003C substrate. On the other hand, filters constructed with integrated circuit technology become more attractive for enabling the integration of the microwave system on a single chip (SOC).
To examine the performance of this design structure in an integrated circuit, the filter is fabricated using the 0.15-μm GaAs pHEMT integrated circuit process. This pHEMT process can also provide much better circuit performance for high-frequency designs than those fabricated on a PCB. The design example was implemented in GaAs with a center frequency of 13.5 GHz to examine the performance at higher frequency in detail. The occupied area is only about 1.09 × 0.97 mm². The ADS software is used to design the modified filters to suppress the first and second harmonics.

Keywords: microstrip resonator, bandpass filter, harmonic suppression, GaAs

Procedia PDF Downloads 312
424 An Audit on the Role of Sentinel Node Biopsy in High-Risk Ductal Carcinoma in Situ and Intracystic Papillary Carcinoma

Authors: M. Sulieman, H. Arabiyat, H. Ali, K. Potiszil, I. Abbas, R. English, P. King, I. Brown, P. Drew

Abstract:

Introduction: The incidence of breast ductal carcinoma in situ (DCIS) has been increasing; it currently represents up to 20-25% of all breast carcinomas. Some aspects of DCIS management are still controversial, mainly due to the heterogeneity of its clinical presentation and of its biological and pathological characteristics. In DCIS, a histological diagnosis obtained preoperatively carries a risk of sampling error if invasive cancer is subsequently diagnosed. A mammographic extent greater than 4-5 cm and the presence of architectural distortion, focal asymmetric density, or a mass on mammography are proven risk factors for preoperative histological understaging. Intracystic papillary cancer (IPC) is a rare form of breast carcinoma. Despite being previously compared to DCIS, it has been shown to present histologically with invasion of the basement membrane and even metastasis. SLNB carries a risk of associated comorbidity that should be considered when planning surgery for DCIS and IPC. Objectives: The aim of this audit was to better define a 'high-risk' group of patients with a preoperative diagnosis of non-invasive cancer undergoing breast-conserving surgery (BCS) who would benefit from sentinel node biopsy. Method: Retrospective data collection on all patients with ductal carcinoma in situ over 5 years. 636 patients were identified; after exclusion criteria were applied, 394 patients were included. High risk was defined as extensive micro-calcification >40 mm or any mass-forming DCIS. IPC: a Winpath search for the term 'papillary carcinoma' in any breast specimen over a 5-year period; 29 patients were included in this group. Results: DCIS: 188 patients were deemed high risk due to >40 mm calcification or a mass-forming lesion (radiological or palpable); 61% of those had a mastectomy and 32% BCS. Overall, 38% of this high-risk group had invasive disease. Of these high-risk DCIS patients, 85% had a SLNB: 80% at the time of surgery and 5% at a second operation.
Among the BCS patients, 42% had SLNB at the time of surgery and 13% (8 patients) at a second operation. Fifteen (7.9%) patients in the high-risk group had a positive SLNB, 11 of whom had a mastectomy and 4 BCS. IPC: The provisional diagnosis of encysted papillary carcinoma was upgraded to invasive carcinoma on final histology in around a third of cases. This may have implications when deciding whether to offer sentinel node removal at the time of therapeutic surgery. Conclusions: We have defined a 'high-risk' group of patients with a preoperative diagnosis of non-invasive cancer undergoing BCS who would benefit from SLNB at the time of surgery. In patients with high-risk features, the risk of invasive disease is up to 40%, but the risk of nodal involvement is approximately 8%. The risk of morbidity from SLNB is up to about 5%, especially the risk of lymphedema.

Keywords: breast ductal carcinoma in situ (DCIS), intracystic papillary carcinoma (IPC), sentinel node biopsy (SLNB), high-risk, non-invasive cancer

Procedia PDF Downloads 90
423 Production of Insulin Analogue SCI-57 by Transient Expression in Nicotiana benthamiana

Authors: Adriana Muñoz-Talavera, Ana Rosa Rincón-Sánchez, Abraham Escobedo-Moratilla, María Cristina Islas-Carbajal, Miguel Ángel Gómez-Lim

Abstract:

The highest rates of diabetes incidence and prevalence worldwide will increase the number of diabetic patients requiring insulin or insulin analogues. Then, current production systems would not be sufficient to meet the future market demands. Therefore, developing efficient expression systems for insulin and insulin analogues are needed. In addition, insulin analogues with better pharmacokinetics and pharmacodynamics properties and without mitogenic potential will be required. SCI-57 (single chain insulin-57) is an insulin analogue having 10 times greater affinity to the insulin receptor, higher resistance to thermal degradation than insulin, native mitogenicity and biological effect. Plants as expression platforms have been used to produce recombinant proteins because of their advantages such as cost-effectiveness, posttranslational modifications, absence of human pathogens and high quality. Immunoglobulin production with a yield of 50% has been achieved by transient expression in Nicotiana benthamiana (Nb). The aim of this study is to produce SCI-57 by transient expression in Nb. Methodology: DNA sequence encoding SCI-57 was cloned in pICH31070. This construction was introduced into Agrobacterium tumefaciens by electroporation. The resulting strain was used to infiltrate leaves of Nb. In order to isolate SCI-57, leaves from transformed plants were incubated 3 hours with the extraction buffer therefore filtrated to remove solid material. The resultant protein solution was subjected to anion exchange chromatography on an FPLC system and ultrafiltration to purify SCI-57. Detection of SCI-57 was made by electrophoresis pattern (SDS-PAGE). Protein band was digested with trypsin and the peptides were analyzed by Liquid chromatography tandem-mass spectrometry (LC-MS/MS). A purified protein sample (20µM) was analyzed by ESI-Q-TOF-MS to obtain the ionization pattern and the exact molecular weight determination. 
Chromatographic pattern and impurity detection were performed by RP-HPLC, with recombinant insulin as a standard. The identity of SCI-57 was confirmed by anti-insulin ELISA. The total soluble protein concentration was quantified by Bradford assay. Results: The expression cassette was verified by restriction mapping (5393 bp fragment). SDS-PAGE of the crude leaf extract (CLE) of transformed plants revealed a protein of about 6.4 kDa that was not present in the CLE of untransformed plants. The LC-MS/MS results displayed one high-scoring peptide matching the SCI-57 amino acid sequence in the sample, confirming the identity of SCI-57. In the purified SCI-57 sample (PSCI-57), the most intense charge state in the displayed ionization pattern was 1069 m/z (+6), corresponding to the molecular weight of SCI-57 (6412.6554 Da). RP-HPLC of the PSCI-57 shows a peak with a retention time (rt) and UV spectroscopic profile similar to the insulin standard (SCI-57 rt = 12.96 min; insulin rt = 12.70 min). The collected SCI-57 peak gave a positive ELISA signal. The total protein amount in CLE from transformed plants was higher than in untransformed plants. Conclusions: Our results suggest the feasibility of producing the insulin analogue SCI-57 by transient expression in Nicotiana benthamiana. Further work is being undertaken to evaluate its biological activity via glucose uptake in insulin-sensitive and insulin-resistant murine and human cultured adipocytes.
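As a reader's sketch (not part of the study's pipeline), the reported charge state can be checked against the reported mass: for a protonated species, m/z = (M + z·m_H) / z, so the +6 peak follows from the 6412.6554 Da mass.

```python
# Back-calculating ESI charge-state peaks from the reported SCI-57 mass.
M = 6412.6554   # molecular weight of SCI-57 (Da), from the ESI-Q-TOF data
m_H = 1.00728   # mass of a proton (Da)

for z in range(4, 8):
    mz = (M + z * m_H) / z
    print(f"z=+{z}: m/z = {mz:.1f}")
# the +6 state comes out near the observed 1069 m/z
```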

Keywords: insulin analogue, mass spectrometry, Nicotiana benthamiana, transient expression

Procedia PDF Downloads 331
422 BiVO₄‑Decorated Graphite Felt as Highly Efficient Negative Electrode for All-Vanadium Redox Flow Batteries

Authors: Daniel Manaye Kabtamu, Anteneh Wodaje Bayeh

Abstract:

With the development and utilization of new energy technologies, demand for large-scale energy storage systems has become increasingly urgent. The vanadium redox flow battery (VRFB) is one of the most promising technologies for grid-scale energy storage applications because of numerous attractive features, such as long cycle life, high safety, and flexible design. However, the relatively low energy efficiency and high production cost of the VRFB still limit its practical implementation, so enhancing its energy efficiency and reducing its cost are of great interest. One of the main components of the VRFB that can significantly affect its efficiency and final cost is the electrode material, which provides the reaction sites for the redox couples (V²⁺/V³⁺ and VO²⁺/VO₂⁺). Graphite felt (GF) is a typical carbon-based material commonly employed as an electrode for the VRFB because of its low cost and good chemical and mechanical stability. However, pristine GF exhibits insufficient wettability, low specific surface area, and poor kinetic reversibility, leading to low energy efficiency of the battery. It is therefore crucial to modify the GF electrode to improve its electrochemical performance in the VRFB by employing active electrocatalysts, such as inexpensive metal oxides. This study successfully fabricates a low-cost, plate-like bismuth vanadate (BiVO₄) material through a simple one-step hydrothermal route and employs it as an electrocatalyst to decorate GF for use as the negative electrode in a VRFB. The experimental results show that BiVO₄-3h exhibits the best electrocatalytic activity and reversibility for the vanadium redox couples among all samples. The energy efficiency of the VRFB cell assembled with BiVO₄-decorated GF as the negative electrode is found to be 75.42% at 100 mA cm⁻², about 10.24% higher than that of the cell assembled with a heat-treated graphite felt (HT-GF) electrode.
The possible reasons for the activity enhancement can be ascribed to the existence of oxygen vacancies in the BiVO₄ lattice structure and the relatively high surface area of BiVO₄, which provide more active sites for facilitating the vanadium redox reactions. Furthermore, the BiVO₄-GF electrode obstructs the competitive irreversible hydrogen evolution reaction on the negative side of the cell, and it also has better wettability. Impressively, BiVO₄-GF as the negative electrode shows good stability over 100 cycles. Thus, BiVO₄-GF is a promising negative electrode candidate for practical VRFB applications.

Keywords: BiVO₄ electrocatalyst, electrochemical energy storage, graphite felt, vanadium redox flow battery

Procedia PDF Downloads 1550
421 Testing the Impact of the Nature of Services Offered on Travel Sites and Links on Traffic Generated: A Longitudinal Survey

Authors: Rania S. Hussein

Abstract:

Background: This study aims to trace the evolution of service provision by Egyptian travel sites and how these services change in level of sophistication over the ten-year period of the study. To the author's best knowledge, this is the first longitudinal study that covers an extended time frame of ten years. Additionally, the study attempts to determine the popularity of these websites through the number of links to them. Links may be viewed as the equivalent of a referral or word of mouth in an online context. Both popularity and the nature of the services provided by these websites are used to determine the traffic on these sites. In examining the nature of services provided, the website itself is viewed as an overall service offering composed of different travel products and services. Method: This study uses content analysis in the form of a small-scale survey of 30 Egyptian travel agents' websites to examine whether Egyptian travel websites are static or dynamic in the services they provide, and whether they provide simple or sophisticated travel services. To determine the level of sophistication of these travel sites, the nature and composition of the products and services offered were first examined, using a framework adapted from Kotler's (1997) 'Five levels of a product'. The target group for this study consists of companies that do inbound tourism. Four rounds of data collection were conducted over a period of 10 years: two rounds in 2004 and two rounds in 2014. Data from the travel agents' sites were collected over a two-week period in each of the four rounds. Besides data on website features, data were also collected on the popularity of these websites through a software program called Alexa, which reported each site's traffic rank and number of links.
Regression analysis was used to test the effect of links and services, as independent variables, on traffic as the dependent variable of this study. Findings: Results indicate that as companies moved from simple websites with basic travel information to more interactive ones, the number of visitors (traffic) and the popularity of those sites, shown by the number of links, both increased. Results also show that travel companies use the web much more for promotion than for distribution, since most travel agents use it mainly for information provision. This content analysis study taps an unexplored area and provides useful insights for marketers on how to generate more traffic to their websites: by developing distinctive content on these sites and by improving the sites' visibility, thus increasing their popularity, or links.
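A regression of this shape, with service sophistication and link counts predicting traffic, can be sketched as follows. The figures below are hypothetical illustrations, not the study's data; the sophistication scores (1 to 5, after Kotler's product levels) and link counts stand in for the survey's actual measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: each row is one travel site in one survey round.
# Columns: service sophistication score (1-5) and number of inbound links.
X = np.array([
    [1, 12], [1, 30], [2, 45], [2, 60], [3, 80],
    [3, 95], [4, 120], [4, 150], [5, 200], [5, 260],
], dtype=float)
# Hypothetical traffic figures (monthly visitors) for the same sites.
y = np.array([110, 150, 240, 300, 420, 470, 600, 700, 900, 1100], dtype=float)

model = LinearRegression().fit(X, y)
print("R^2:", round(model.score(X, y), 3))
print("coefficients (services, links):", model.coef_.round(2))
```

With real survey data, the coefficient signs and the R² would indicate how much of the variation in traffic the two predictors jointly explain.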

Keywords: levels of a product, popularity, travel, website evolution

Procedia PDF Downloads 302
420 The Research of Hand-Grip Strength for Adults with Intellectual Disability

Authors: Haiu-Lan Chin, Yu-Fen Hsiao, Hua-Ying Chuang, Wei Lee

Abstract:

Adults with intellectual disabilities generally have insufficient physical activity, an important factor leading to premature weakness. Studies in recent years on frailty syndrome have accumulated substantial data about indicators of human aging, including unintentional weight loss, self-reported exhaustion, weakness, slow walking speed, and low physical activity. Of these indicators, hand-grip strength can be seen as a predictor of mortality, disability, complications, and increased length of hospital stay; it in fact provides a comprehensive overview of one's vitality. This research investigates the hand-grip strength of adults with intellectual disabilities living in facilities, institutions, and workshops. The participants so far are 197 male adults (M = 39.09 ± 12.85 years old) and 114 female adults (M = 35.80 ± 8.2 years old). The aim of the study is to characterize their hand-grip strength performance and to initiate hand-grip training in their daily life to slow the decline of their physical condition. Besides hand-grip strength, test items include weight, bone density, basal metabolic rate (BMR), and static body balance. Hand-grip strength was measured with a hand dynamometer, and participants were classified into a normal group (≥ 30 kg for males, ≥ 20 kg for females) and a weak group (< 30 kg for males, < 20 kg for females). The analysis includes descriptive statistics and indicators of grip strength for adults with intellectual disability. Though the research is still ongoing and the number of participants is increasing, the data indicate: (1) Hand-grip strength correlates with degree of intellectual disability (p ≤ .001), basal metabolic rate (p ≤ .001), and static body balance (p ≤ .01). Nevertheless, in the latest data there is no longer a significant correlation between grip strength and basal metabolic rate, which had previously correlated significantly with hand-grip strength.
(2) The difference in hand-grip strength between male and female subjects is significant: the hand-grip strength of male subjects (25.70 ± 12.81 kg) is much higher than that of female subjects (16.30 ± 8.89 kg). Compared to their female counterparts, male participants show greater individual differences, and the proportion of weakness also differs between male and female subjects. (3) Regression indicates that the main factors related to grip strength performance are, in order, degree of intellectual disability, height, static body balance, training, and weight. (4) There are significant differences in both hand-grip strength and static body balance between participants in facilities and those in workshops. The study supports established sex and gender differences in health. Notably, the average hand-grip strength of the left hand is higher than that of the right hand in both male and female subjects: 71.3% of male subjects and 64.2% of female subjects perform better with the left hand, a distinctive feature especially among those with a low degree of intellectual disability.

Keywords: adult with intellectual disability, frailty syndrome, grip strength, physical condition

Procedia PDF Downloads 162
419 Mikrophonie I (1964) by Karlheinz Stockhausen - Between Idea and Auditory Image

Authors: Justyna Humięcka-Jakubowska

Abstract:

1. Background in music analysis. Traditionally, when we think about a composer's sketches, the chances are that we are thinking in terms of the working out of detail rather than the evolution of an overall concept. Since music is a 'time art', questions of form cannot be entirely detached from considerations of time. One could say that composers tend to regard time either as a place gradually and partially intuitively filled, or they look for a specific strategy to occupy it. In my opinion, one thing that sheds light on Stockhausen's compositional thinking is his frequent use of 'form schemas', that is, often a single-page representation of the entire structure of a piece. 2. Background in music technology. Sonic Visualiser (SV) is a program used to study a musical recording. It is an open-source application for viewing, analysing, and annotating music audio files. It contains a number of visualisation tools, designed with useful default parameters for musical analysis. Additionally, SV supports Vamp plugins, which provide analyses such as structural segmentation. 3. Aims. The aim of my paper is to show how SV may be used to obtain a better understanding of a specific musical work, and how the compositional strategy impacts musical structures and musical surfaces. I want to show that 'traditional' music-analytic methods do not allow one to indicate interrelationships between the musical surface (which is perceived) and the underlying musical/acoustical structure. 4. Main Contribution. Stockhausen dealt with the most diverse musical problems by the most varied methods. One characteristic that he never ceased to place at the center of his thought and work was the quest for a new balance founded upon an acute connection between speculation and intuition.
In the case of Mikrophonie I (1964) for tam-tam and 6 players, Stockhausen makes a distinction between the 'connection scheme', which indicates the ground rules underlying all versions, and the 'form scheme', which is associated with a particular version. The preface to the published score includes both the connection scheme and a single instance of a form scheme, which is what one can hear on the CD recording. In the current study, insight into the compositional strategy chosen by Stockhausen was compared with the auditory image, that is, with the perceived musical surface. Stockhausen's musical work is analyzed both in terms of melodic/voice evolution and timbre evolution. 5. Implications. The current study shows how musical structures determine the musical surface. My general assumption is that, while listening to music, we can extract basic kinds of musical information from musical surfaces. It is shown that interactive strategies of musical structure analysis offer a very fruitful way of looking directly into certain structural features of music.
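The kind of surface-to-structure comparison that SV's analysis plugins automate can be sketched in a few lines. The example below is a toy illustration, not the paper's method: a synthetic two-section signal stands in for the Mikrophonie I recording, and a jump in the dominant frequency stands in for a structural segment boundary.

```python
import numpy as np

# Synthetic two-section signal: a toy 'form scheme' with 220 Hz in
# section A and 880 Hz in section B (an assumption, replacing real audio).
sr = 8000
t = np.arange(sr * 2) / sr  # two seconds
y = np.where(t < 1.0, np.sin(2 * np.pi * 220 * t), np.sin(2 * np.pi * 880 * t))

frame = 512
n_frames = len(y) // frame
peaks = []
for i in range(n_frames):
    spec = np.abs(np.fft.rfft(y[i * frame:(i + 1) * frame]))
    peaks.append(np.argmax(spec) * sr / frame)  # dominant frequency per frame

# A crude segment boundary: the frame where the dominant frequency jumps.
boundary = int(np.argmax(np.abs(np.diff(peaks))))
print("section change at frame", boundary, "~", boundary * frame / sr, "s")
```

Real structural segmentation (as in SV's Vamp plugins) works on much richer features than a single dominant frequency, but the principle of locating discontinuities in a frame-wise descriptor is the same.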

Keywords: automated analysis, composer's strategy, mikrophonie I, musical surface, stockhausen

Procedia PDF Downloads 283
418 Attitude in Academic Writing (CAAW): Corpus Compilation and Annotation

Authors: Hortènsia Curell, Ana Fernández-Montraveta

Abstract:

This paper presents the creation, development, and analysis of a corpus designed to study the presence of attitude markers and the author's stance in research articles in two different areas of linguistics (theoretical linguistics and sociolinguistics). These two disciplines are expected to behave differently in this respect, given the disparity in their discursive conventions. Attitude markers in this work are understood as the linguistic elements (adjectives, nouns, and verbs) used to convey the writer's stance towards the content presented in the article, and they are crucial in understanding writer-reader interaction and the writer's position. These attitude markers are divided into three broad classes: assessment, significance, and emotion. In addition, we also consider first-person singular and plural pronouns and possessives, modal verbs, and passive constructions, which are other linguistic elements expressing the author's stance. The corpus, the Corpus of Attitude in Academic Writing (CAAW), comprises 21 articles from six journals indexed in JCR. These articles were originally written in English by a single native-speaker author from the UK or USA and were published between 2022 and 2023. The total number of words in the corpus is approximately 222,400, with 106,422 from theoretical linguistics journals (Lingua, Linguistic Inquiry, and Journal of Linguistics) and 116,022 from sociolinguistics journals (International Journal of the Sociology of Language, Language in Society, and Journal of Sociolinguistics). Together with the corpus, we present the tool developed for building and storing it, along with a tool for automatic annotation. The steps followed in the compilation of the corpus are as follows. First, the articles were selected according to the parameters explained above. Second, they were downloaded and converted to txt format.
Finally, examples, direct quotes, section titles, and references were eliminated, since they do not involve the author's stance. The resulting texts were the input for the annotation of the linguistic features related to stance. As for the annotation, two articles (one from each subdiscipline) were annotated manually by the two researchers. An existing list was used as a baseline, and further attitude markers were identified, together with the other elements mentioned above. Once a consensus was reached, the rest of the articles were annotated automatically using the tool created for this purpose. The annotated corpus will serve as a resource for scholars working in discourse analysis (both in linguistics and in communication) and related fields, since it offers new insights into the expression of attitude. The tools created for the compilation and annotation of the corpus will be useful for studying the author's attitude and stance in articles from any academic discipline: new data can be uploaded, and the list of markers can be enlarged. Finally, the tool can be extended to other languages, which will allow cross-linguistic studies of the author's stance.
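The automatic annotation step described above, matching a curated marker list against each text, can be sketched as dictionary-based tagging. The marker list below is a small hypothetical sample; the CAAW tool uses a much larger consensus-based inventory and also handles pronouns, modals, and passives.

```python
import re

# Hypothetical sample of the three marker classes used in CAAW.
MARKERS = {
    "assessment": ["important", "striking", "plausible"],
    "significance": ["crucial", "central", "key"],
    "emotion": ["surprising", "unfortunately", "remarkable"],
}

def annotate(text):
    """Return (marker, class, offset) tuples for each match, in text order."""
    hits = []
    for cls, words in MARKERS.items():
        for w in words:
            for m in re.finditer(rf"\b{re.escape(w)}\b", text, re.IGNORECASE):
                hits.append((m.group(0), cls, m.start()))
    return sorted(hits, key=lambda h: h[2])

sample = "It is crucial to note this striking and, unfortunately, overlooked pattern."
for marker, cls, pos in annotate(sample):
    print(f"{pos:3d} {cls:<12} {marker}")
```

Enlarging the marker list, as the paper envisages, only means extending the `MARKERS` dictionary; the matching logic stays the same.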

Keywords: academic writing, attitude, corpus, english

Procedia PDF Downloads 47
417 Shark Detection and Classification with Deep Learning

Authors: Jeremy Jenrette, Z. Y. C. Liu, Pranav Chimote, Edward Fox, Trevor Hastie, Francesco Ferretti

Abstract:

Effective shark conservation depends on well-informed population assessments. Direct methods such as scientific surveys and fisheries monitoring are adequate for defining population statuses, but species-specific indices of abundance and distribution from these sources are rare for most shark species. We can rapidly fill these information gaps by boosting media-based remote monitoring efforts with machine learning and automation. We created a database of shark images by sourcing 24,546 images covering 219 species of sharks from the web application sharkPulse and the social network Instagram. We used object detection to extract shark features and inflate this database to 53,345 images. We packaged object-detection and image-classification models into a Shark Detector bundle. We developed the Shark Detector to recognize and classify sharks from videos and images using transfer learning and convolutional neural networks (CNNs). We applied these models to common data-generation approaches for sharks: boosting training datasets, processing baited remote camera footage and online videos, and data-mining Instagram. We examined the accuracy of each model and tested genus and species prediction correctness as a function of training data quantity. The Shark Detector located sharks in baited remote footage and YouTube videos with an average accuracy of 89%, and classified located subjects to the species level with 69% accuracy (n = 8 species). The Shark Detector sorted heterogeneous datasets of images sourced from Instagram with 91% accuracy and classified species with 70% accuracy (n = 17 species). Data-mining Instagram can inflate training datasets and increase the Shark Detector's accuracy, as well as facilitate archiving of historical and novel shark observations. Base accuracy of genus prediction was 68% across 25 genera. The average base accuracy of species prediction within each genus class was 85%. The Shark Detector can classify 45 species.
All data-generation methods were processed without manual interaction. As media-based remote monitoring becomes the dominant approach for observing sharks in nature, our open-source Shark Detector facilitates common identification applications. Prediction accuracy of the software pipeline increases as more images are added to the training dataset. We provide public access to the software on our GitHub page.

Keywords: classification, data mining, Instagram, remote monitoring, sharks

Procedia PDF Downloads 98
416 Strategic Metals and Rare Earth Elements Exploration of Lithium Cesium Tantalum Type Pegmatites: A Case Study from Northwest Himalayas

Authors: Auzair Mehmood, Mohammad Arif

Abstract:

LCT (Li-, Cs-, and Ta-rich) pegmatites, genetically related to peraluminous S-type granites, are mined for strategic metals (SMs) and rare earth elements (REEs) around the world. This study investigates the SM and REE potential of pegmatites spatially associated with an S-type granitic suite of the Himalayan sequence, specifically the Mansehra Granitic Complex (MGC), northwest Pakistan. Geochemical signatures of the pegmatites and some of their mineral extracts were analyzed using Inductively Coupled Plasma Mass Spectrometry (ICP-MS) to explore and generate potential prospects (if any) for SMs and REEs. In general, the REE patterns of the studied whole-rock pegmatite samples show a tetrad effect and possess low total REE abundances, strong positive europium (Eu) anomalies, weak negative cerium (Ce) anomalies, and relative enrichment in heavy REEs. Similar features are observed in the REE patterns of the feldspar extracts. However, the REE patterns of the muscovite extracts reflect preferential enrichment and possess negative Eu anomalies. The trace element evaluation further suggests that the MGC pegmatites have undergone low levels of fractionation. Various trace element concentrations and their ratios, including Ta versus Cs, K/Rb (potassium/rubidium) versus Rb, and Th/U (thorium/uranium) versus K/Cs, were used to assess the economically viable mineral potential of the studied rocks. On most of the plots, concentrations fall below the dividing line, indicating either barren or low-level mineralization potential of the studied rocks for both SMs and REEs. The results demonstrate the paucity of the MGC pegmatites with respect to Ta-Nb (tantalum-niobium) mineralization, in sharp contrast to many Pan-African S-type granites around the world. The MGC pegmatites are classified as muscovite pegmatites based on their K/Rb versus Cs relationship.
This classification is consistent with the occurrence of rare accessory minerals like garnet, biotite, tourmaline, and beryl. Furthermore, it corroborates an earlier sorting of the MGC pegmatites into muscovite-bearing, biotite-bearing, and subordinate muscovite-biotite types. These types of pegmatites lack any significant SM and REE mineralization potential. Field relations, such as close spatial association with the parent granitic rocks and the absence of internal zonation structure, also reflect the barren character, and hence the lack of potential prospects, of the MGC pegmatites.

Keywords: exploration, fractionation, Himalayas, pegmatites, rare earth elements

Procedia PDF Downloads 185
415 Post Liberal Perspective on Minorities Visibility in Contemporary Visual Culture: The Case of Mizrahi Jews

Authors: Merav Alush Levron, Sivan Rajuan Shtang

Abstract:

From as early as their emergence in Europe and the US, the postmodern and post-colonial paradigms have formed the backbone of the field of visual culture studies. The self-representation project of political minorities is studied, described, and explained within the premises and perspectives drawn from these paradigms, addressing the key issue they raised: modernism's crisis of representation. The struggle for self-representation, agency, and multicultural visibility sought to challenge the liberal pretense of universality and equality, hitting at its various blind spots on issues such as class, gender, race, sex, and nationality. This struggle yielded subversive identity and hybrid performances, including reclaiming, mimicry, and masquerading. These performances sought to defy the uniform, universal self, which forms the basis of the liberal, rational, enlightened subject. This research argues that this politics of representation is itself confined within liberal thought. Alongside post-colonialism's and multiculturalism's contribution to undermining oppressive structures of power, generating diversity in cultural visibility, and exposing the failure of liberal colorblindness, this subversion is constituted in the visual field by way of confrontation, flying in the face of the universal law and relying on its ongoing comparison and attribution to this law. Relying on Deleuze and Guattari, this research sets out to draw theoretical and empirical attention to an alternative, post-liberal occurrence which has been taking place in the visual field in parallel to the contra-hegemonic phase and as a product of political reality in the aftermath of the crisis of representation. It is no longer a counter-representation; rather, it is a motion of organic minor desire, progressing in the form of flows and generating what Deleuze and Guattari termed deterritorialization of social structures.
This discussion focuses on current post-liberal performances of 'Mizrahim' (Jewish Israelis of Arab and Muslim extraction) in the visual field in Israel. In television, video art, and photography, these performances challenge the issue of representation and generate a concrete peripheral Mizrahiness, realized in the visual organization of the photographic frame. Mizrahiness then transforms from 'confrontational' representation into a 'presence' flooding the visual sphere in plain sight, in a process of 'becoming'. The Mizrahi desire is exerted on the planes of sound, spoken language, the body, and the space where they appear. It removes from these planes the coding and stratification engendered by European dominance and rational, liberal enlightenment. This stratification, adhering to the hegemonic surface, is flooded not by way of resisting false consciousness or employing hybridity, but by way of the Mizrahi identity's own productive, material, immanent yearning. The Mizrahi desire reverberates with Mizrahi peripheral 'worlds of meaning', in which post-colonial interpretation almost invariably identifies a product of internalized oppression, and a recurrence thereof, rather than a source in itself: an 'offshoot, never a wellspring', as Nissim Mizrachi clarifies in his recent pioneering work. The peripheral Mizrahi performance 'unhooks itself', in Deleuze and Guattari's words, from the point of subjectification and interpretation, and does not correspond with the partialness, absence, and split that mark post-colonial identities.

Keywords: desire, minority, Mizrahi Jews, post-colonialism, post-liberalism, visibility, Deleuze and Guattari

Procedia PDF Downloads 306