Search results for: stock market prediction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5824

1324 Cereal Bioproducts Conversion to Higher Value Feed by Using Pediococcus Strains Isolated from Spontaneous Fermented Cereal, and Its Influence on Milk Production of Dairy Cattle

Authors: Vita Krungleviciute, Rasa Zelvyte, Ingrida Monkeviciene, Jone Kantautaite, Rolandas Stankevicius, Modestas Ruzauskas, Elena Bartkiene

Abstract:

The environmental impact of agricultural bioproducts from the processing of food crops is an increasing concern worldwide. Currently, cereal bran is used as a low-value ingredient for both human consumption and animal feed. The most popular bioprocessing technologies for increasing the nutritional and technological functionality of cereal bran are enzymatic processing and fermentation, and the most popular starters in fermented feed production are lactic acid bacteria (LAB), including pediococci. However, the ruminant digestive system is unique: billions of microorganisms help the cow digest and utilize the nutrients in the feed. To achieve efficient feed utilization and high milk yield, these microorganisms must have optimal conditions, and an imbalance of this system is highly undesirable. The strains Pediococcus acidilactici BaltBio01 and Pediococcus pentosaceus BaltBio02 were isolated from spontaneously fermented rye, identified (by the rep-PCR method), and characterized by their growth (Thermo Bioscreen C automatic turbidimeter), acidification rate (2 hours at pH 2.5), gas production (Durham method), and carbohydrate metabolism (API 50 CH test). The antimicrobial activities of the isolated pediococci against a variety of pathogenic and opportunistic bacterial strains previously isolated from diseased cattle, as well as their resistance to antibiotics, were evaluated (EFSA-FEEDAP method). The isolated Pediococcus strains were cultivated in a barley/wheat bran (90/10, m/m) substrate, and the developed supplements, with a high content of valuable pediococci, were used for feeding Lithuanian Black-and-White dairy cows. In addition, the influence of the supplements on milk production and composition was determined. Milk composition was evaluated with a LactoScope FTIR FT1.0 2001 (Delta Instruments, Holland). P. acidilactici BaltBio01 and P. pentosaceus BaltBio02 demonstrated versatile carbohydrate metabolism, growth at both 30°C and 37°C, and acid tolerance. The isolated strains were shown to be non-resistant to antibiotics and to have antimicrobial activity against undesirable microorganisms. By fermenting barley/wheat bran with the selected Pediococcus strains, it is possible to produce a safer feedstock (reduced Enterobacteriaceae, total aerobic bacteria, and yeast and mould counts) with a high content of pediococci. A significantly higher milk yield could be obtained after 33 days of feeding dairy cows the mixed Pediococcus supplement, while a similar effect could be achieved with the separate strains after 66 days of feeding. It can be stated that barley/wheat bran could be used for higher-value feed production in order to increase milk production. Further research is needed to identify the main mechanism of this positive action.

Keywords: barley/wheat bran, dairy cattle, fermented feed, milk, pediococcus

Procedia PDF Downloads 284
1323 Development of Bioactive Medical Textiles by Immobilizing Nanoparticles at Cotton Fabric

Authors: Munir Ashraf, Shagufta Riaz

Abstract:

Personal protective equipment (PPE) and bioactive textiles are highly important for protecting front-line hospital workers, patients, and the general population from highly infectious diseases. This became even more critical in the wake of the COVID-19 outbreak. Most medical textiles are inactive against various viruses and bacteria; hence there is a need to wash them frequently to avoid the spread of microorganisms. According to a survey conducted by the World Health Organization, more than 500 million people acquire infections in hospitals, and more than 13 million have died from these hospital-acquired diseases. The PPE available on the market is effective against the penetration of pathogens and kills bacteria, but it is neither breathable nor active against different viruses. Therefore, there was a great need to develop textiles that are not only effective against bacteria, fungi, and viruses but also comfortable for medical personnel and patients. In the present study, waterproof, breathable, and biologically active textiles were developed using antiviral and antibacterial nanomaterials. Nanomaterials such as TiO₂, ZnO, Cu, and Ag were immobilized at the surface of cotton fabric using different silane coupling agents and electroless deposition, so that they retained their functionality even after 30 industrial laundering cycles. Afterwards, the treated fabrics were coated with a waterproof breathable film to prevent the permeation of liquid droplets and of any particle or microorganism greater than 80 nm. The developed cotton fabric was highly active against bacteria and viruses. The good durability of the nanomaterials at the cotton surface after several industrial washing cycles makes this fabric an ideal candidate for bioactive textiles used in the medical field.

Keywords: antibacterial, antiviral, cotton, durable

Procedia PDF Downloads 139
1322 Calculational-Experimental Approach of Radiation Damage Parameters on VVER Equipment Evaluation

Authors: Pavel Borodkin, Nikolay Khrennikov, Azamat Gazetdinov

Abstract:

The problem of ensuring the integrity of VVER-type reactor equipment is now especially pressing in connection with the justification of the safety of NPP units and the extension of their service life to 60 years and more. It primarily concerns older units with VVER-440 and VVER-1000 reactors. The justification of VVER equipment integrity depends on the reliability of the estimate of the degree of equipment damage. One of the mandatory requirements providing the reliability of such estimates, and also of VVER equipment lifetime evaluation, is the monitoring of equipment radiation loading parameters. In this connection, there is a problem of justifying the normative parameters used for the estimation of pressure vessel metal embrittlement, such as the fluence and fluence rate (FR) of fast neutrons above 0.5 MeV. From the point of view of regulatory practice, a comparison of displacements per atom (DPA) and fast neutron fluence (FNF) above 0.5 MeV is of practical concern. In accordance with the Russian regulatory rules, neutron fluence F(E > 0.5 MeV) is the radiation exposure parameter used in predicting steel embrittlement under neutron irradiation. However, DPA is a more physically legitimate measure of neutron damage in Fe-based materials. If the DPA distribution in reactor structures is more conservative than the neutron fluence, this should attract the attention of the regulatory authority. The purpose of this work was to show which radiation load parameters (fluence, DPA) on VVER equipment should be under control, and to give reasonable estimates of such parameters over the volume of all equipment. The second task was to give a conservative estimate of each parameter, including its uncertainty. Results of recent investigations allow testing of the conservatism of calculational predictions, and, as shown in the paper, combining ex-vessel measured data with calculated data allows the assessment of unpredicted uncertainties resulting from the specific features of individual VVER reactor equipment. Some results of these calculational-experimental investigations are presented in this paper.
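
As a rough illustration of the relationship between the two exposure parameters discussed above, fast-neutron fluence can be converted to DPA through a spectrum-averaged displacement cross-section. The cross-section and fluence values below are illustrative stand-ins, not data from the paper:

```python
# Illustrative sketch: converting fast-neutron fluence to displacements per
# atom (DPA) via an assumed spectrum-averaged displacement cross-section.

BARN_TO_CM2 = 1e-24  # 1 barn = 1e-24 cm^2

def fluence_to_dpa(fluence_n_per_cm2, sigma_dpa_barn):
    """DPA = fluence (n/cm^2) x displacement cross-section (cm^2)."""
    return fluence_n_per_cm2 * sigma_dpa_barn * BARN_TO_CM2

# Example: a fast fluence of 1e19 n/cm^2 (E > 0.5 MeV) and an assumed
# spectrum-averaged cross-section of 1500 barn (illustrative only).
dpa = fluence_to_dpa(1e19, 1500.0)
print(f"{dpa:.3f} dpa")  # 0.015 dpa
```

In real VVER surveillance work, the cross-section would come from evaluated damage files for iron rather than a single assumed number.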

Keywords: equipment integrity, fluence, displacement per atom, nuclear power plant, neutron activation measurements, neutron transport calculations

Procedia PDF Downloads 132
1321 Feasibilities for Recovering of Precious Metals from Printed Circuit Board Waste

Authors: Simona Ziukaite, Remigijus Ivanauskas, Gintaras Denafas

Abstract:

The growing market for electrical and electronic equipment and its short life cycle drive increasing waste streams. Gold (Au), copper (Cu), silver (Ag), and palladium (Pd) can be found on printed circuit boards, and these metals make up the largest share of a printed circuit board's value. Therefore, printed circuit board scrap is valuable as a potential raw material for precious metals recovery. Among the techniques compared for recovering Cu, Au, Ag, and Pd from waste printed circuit boards, leaching of the metals with chemical reagents was selected. The study was conducted using a selected multistage technique for Au, Cu, Ag, and Pd recovery from printed circuit boards. In the first and second metal leaching stages, 2M H₂SO₄ and H₂O₂ (35%) were used as the elution reagents. In the third stage, the precious metals were leached with a solution of 20 g/l thiourea and 6 g/l Fe₂(SO₄)₃. To verify the efficiency of the method, a metal leaching test with aqua regia was carried out. Based on the experimental study, leaching efficiencies of 60% for Au and 85.5% for Cu were achieved using the preferred methodology. After mechanical crushing and thermal treatment of the waste, metal leaching efficiency increased by a factor of 1.7 (40%) for copper, 1.6 (37%) for gold, and 1.8 (44%) for silver. It was noticed that the Au content of old (> 20 years) waste is 17 times higher, the Cu content 4 times higher, and the Ag content 2 times higher than in new (< 1 year) waste. Palladium was not found in the new printed circuit board waste; however, it was established that 1.064 g of Pd can be recovered from 1 t of old printed circuit board waste (leaching with aqua regia). It was also found that 1.064 g of Ag can be recovered from 1 t of old printed circuit board waste. Precious metals recovery in Lithuania was estimated in this study. Given the amounts of printed circuit board waste generated, the limits for recovery of precious metals were identified.
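
One reading of the reported treatment gains (treating the quoted percentages as pre-treatment efficiencies and the quoted factors as multipliers; the abstract does not state this explicitly) can be sketched as:

```python
# Hypothetical reading of the reported leaching-efficiency gains after
# mechanical crushing and thermal treatment of the PCB waste.

baseline = {"Cu": 0.40, "Au": 0.37, "Ag": 0.44}  # assumed pre-treatment efficiencies
factor   = {"Cu": 1.7,  "Au": 1.6,  "Ag": 1.8}   # reported improvement factors

improved = {m: round(baseline[m] * factor[m], 3) for m in baseline}
print(improved)  # {'Cu': 0.68, 'Au': 0.592, 'Ag': 0.792}
```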

Keywords: leaching efficiency, limits for recovery, precious metals recovery, printed circuit board waste

Procedia PDF Downloads 363
1320 Generation of Renewable Energy Through Photovoltaic Panels, Albania Photovoltaic Capacity

Authors: Dylber Qema

Abstract:

Driven by recent developments in technology and the growing concern about the sustainability and environmental impact of conventional fuel use, the possibility of producing clean and sustainable energy in significant quantities from renewable sources has sparked interest all over the world. Solar energy is one source for the generation of electricity, with no emissions or environmental pollution. The electricity produced by photovoltaics can supply a home or business and can even be sold to or exchanged with the grid operator. A very positive effect of using photovoltaic modules is that, unlike all other forms of energy production, they produce neither greenhouse gases nor chemical waste. Photovoltaics are becoming one of the largest investments in the field of renewable generating units. Improving the reliability of the electric power system is one of the most important impacts of the installation of photovoltaics (PV). Renewable energy sources are so large that they can meet the energy demands of the whole world, thus enabling a sustainable supply as well as reducing local and global atmospheric emissions. Albania is rated by experts as one of the most favorable countries in Europe for the production of electricity from solar panels. Yet the country currently produces only about 1% of its energy from the sun, while the rest of its needs are met by hydropower plants and imports. Albania has very good characteristics in terms of solar radiation (about 1300–1400 kWh/m² per year). Solar energy has great potential and is a permanent source of energy with greater economic efficiency. Photovoltaic energy is also seen as an alternative, as long periods of drought in Albania have produced crises and high costs for securing energy on the foreign market.
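
A back-of-the-envelope estimate of annual PV yield from the irradiation figure above can use the standard formula E = A · r · H · PR; the panel area, module efficiency, and performance ratio below are assumed values, not figures from the abstract:

```python
# Back-of-the-envelope annual PV yield: E = A * r * H * PR, where A is panel
# area (m^2), r is module efficiency, H is annual irradiation (kWh/m^2), and
# PR is the system performance ratio. All inputs are illustrative.

def annual_pv_yield_kwh(area_m2, efficiency, irradiation_kwh_m2, performance_ratio=0.75):
    return area_m2 * efficiency * irradiation_kwh_m2 * performance_ratio

# 10 m^2 of 20%-efficient modules at 1400 kWh/m^2 per year:
print(annual_pv_yield_kwh(10, 0.20, 1400))  # 2100.0 (kWh per year)
```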

Keywords: capacity, ministry of tourism and environment, obstacles, photovoltaic energy, sustainable

Procedia PDF Downloads 24
1319 The Effect Analysis of Monetary Instruments through Islamic Banking Financing Channel toward Economic Growth in Indonesia, Period January 2008-December 2015

Authors: Sobar M. Johari, Ida Putri Anjarsari

Abstract:

In the transmission of monetary instruments to the real sector of the economy, Bank Indonesia, as the monetary authority, has developed the Islamic Bank Indonesia Certificate (SBIS) as an instrument in Islamic open market operations. One of the monetary transmission channels takes place through the financing channel, from which the fund is used as a source of bank financing. This study aims to analyse the impact of Islamic monetary instruments on output, or economic growth. Data used in this research are taken from Bank Indonesia and the Central Board of Statistics for the period January 2008 until December 2015. The study employs the Granger causality test, a Vector Error Correction Model (VECM), impulse response functions (IRF), and forecast error variance decomposition (FEVD) as its analytical methods. The results show, first, that the transmission mechanism of the bank financing channel is not linked to output. Second, VECM estimation results show that SBIS, PUAS, and FIN have a significant long-term impact on output. When there is a monetary shock, output or economic growth can be recovered and stabilized in the short term. FEVD results show that Islamic bank financing contributes 1.33 percent to economic growth.
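
The Granger-causality test mentioned above can be sketched, under simplifying assumptions (one lag, bivariate, synthetic data in place of the Indonesian series), as an F-test comparing a restricted and an unrestricted regression:

```python
import numpy as np

# Minimal bivariate Granger-causality sketch. The restricted model regresses
# y on its own lag; the unrestricted model adds a lag of x. A large
# F-statistic rejects "x does not Granger-cause y". Data are synthetic.

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()  # x Granger-causes y by construction

Y = y[1:]
ones = np.ones(n - 1)
X_r = np.column_stack([ones, y[:-1]])           # restricted: own lag only
X_u = np.column_stack([ones, y[:-1], x[:-1]])   # unrestricted: adds lag of x

def rss(X, Y):
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    return resid @ resid

rss_r, rss_u = rss(X_r, Y), rss(X_u, Y)
q = 1                       # number of restrictions
df = len(Y) - X_u.shape[1]  # residual degrees of freedom
F = (rss_r - rss_u) / q / (rss_u / df)
print(F > 10)  # True: strong evidence of Granger causality here
```

The study itself works with a full VECM (e.g. statsmodels' `VECM` class offers this); the sketch above only shows the core of the causality test.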

Keywords: Islamic monetary instrument, Islamic banking financing channel, economic growth, Vector Error Correction Model (VECM)

Procedia PDF Downloads 244
1318 Dry Season Rice Production along Hadejia Valley Irrigation Scheme in Auyo Local Government Area in Jigawa State

Authors: Saifullahi Umar, Baba Mamman Yarima, Mohammed Bello Usman, Hassan Mohammed

Abstract:

This study was conducted along the Hadejia Valley irrigation project under the Hadejia-Jama’are River Basin Development Authority (HRBDA) in Jigawa State. A multi-stage sampling procedure was used to select 72 rice farmers operating along the Hadejia Valley Irrigation Project. Data for the study were collected using a structured questionnaire. The analytical tools employed were descriptive statistics and the farm budget technique. The results show that 55% of the farmers were between 31 and 40 years of age and 66.01% were male. The results also revealed that the total cost of cultivating an acre of land for rice production during the dry season was N73,900, with input costs accounting for 63.59% of the total cost of production. The gross return was N332,500, with a net return of N258,600 per acre. The estimated benefit-cost ratio of 3.449 indicates the strong performance of dry season rice production. The leading constraints to dry season rice production were low access to quality extension services, low access to finance, poor quality fertilizers, and poor prices. The study therefore concludes that dry season rice production is a profitable enterprise in the study area; hence, to improve productivity, the farmers should be linked to effective extension service delivery institutions and their access to productive sources of finance expanded, while the government should strengthen fertilizer quality control measures and establish comprehensive market linkages for the farmers.
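
The farm-budget arithmetic reported above can be reproduced directly; the benefit-cost definition used here (net return over total cost) is an assumption, since the abstract does not state which definition yields 3.449:

```python
# Farm-budget figures from the abstract (Naira per acre).
gross_return = 332_500
total_cost   = 73_900

net_return = gross_return - total_cost
bcr = net_return / total_cost  # one common benefit-cost definition (assumed)

print(net_return)     # 258600, matching the reported net return
print(round(bcr, 2))  # ~3.5, close to the reported 3.449
```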

Keywords: Auyo, dry season, Hadejia Valley, rice

Procedia PDF Downloads 130
1317 Measurement Technologies for Advanced Characterization of Magnetic Materials Used in Electric Drives and Automotive Applications

Authors: Lukasz Mierczak, Patrick Denke, Piotr Klimczyk, Stefan Siebert

Abstract:

Due to the high complexity of magnetization in electrical machines and the influence of manufacturing processes on the magnetic properties of their components, the assessment and prediction of hysteresis and eddy current losses have remained a challenge. In the design process of electric motors and generators, the power losses of stators and rotors are calculated based on the material supplier’s data from standard magnetic measurements. This type of data includes neither the additional loss from non-sinusoidal, multi-harmonic motor excitation nor the detrimental effects of residual stress remaining in the motor laminations after manufacturing processes such as punching, housing shrink fitting, and winding. Moreover, in production, considerable attention is given to measurements of the mechanical dimensions of stator and rotor cores, whereas verification of their magnetic properties is typically neglected, which can lead to inconsistent efficiency of assembled motors. Therefore, to enable a comprehensive characterization of motor materials and components, Brockhaus Measurements developed a range of in-line and offline measurement technologies for testing their magnetic properties under actual motor operating conditions. Multiple sets of experimental data were obtained to evaluate the influence of various factors, such as elevated temperature, applied and residual stress, and arbitrary magnetization, on the magnetic properties of different grades of non-oriented steel. Measured power loss for the tested samples and stator cores varied significantly, by more than 100%, compared to standard measurement conditions. The quantitative effects of each of the applied measurements were analyzed. This research and the applied Brockhaus measurement methodologies underline the need for advanced characterization of the magnetic materials used in electric drives and automotive applications.
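
The hysteresis and eddy current loss separation underlying such measurements is often sketched with the classical two-term model P = k_h·B²·f + k_e·B²·f², fitting P/f against f at fixed peak flux density; the material constants below are synthetic, not measured values:

```python
import numpy as np

# Classical two-term loss separation: specific loss per cycle
#   P/f = k_h * B^2 + k_e * B^2 * f   (hysteresis + classical eddy current).
# A linear fit of P/f versus f at fixed B recovers both coefficients.

k_h, k_e, B = 0.02, 5e-5, 1.5            # assumed material constants, peak flux density (T)
f = np.array([50.0, 100.0, 200.0, 400.0, 800.0])   # frequencies (Hz)
P = k_h * B**2 * f + k_e * B**2 * f**2   # synthetic total specific loss

# slope = k_e * B^2, intercept = k_h * B^2
slope, intercept = np.polyfit(f, P / f, 1)
print(round(intercept / B**2, 4), round(slope / B**2, 8))  # recovers k_h ~ 0.02, k_e ~ 5e-05
```

Real stator-core data would also carry an excess-loss term and stress dependence, which is exactly why measurement under operating conditions matters.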

Keywords: magnetic materials, measurement technologies, permanent magnets, stator and rotor cores

Procedia PDF Downloads 122
1316 Farmers’ Access to Agricultural Extension Services Delivery Systems: Evidence from a Field Study in India

Authors: Ankit Nagar, Dinesh Kumar Nauriyal, Sukhpal Singh

Abstract:

This paper examines the key determinants of farmers’ access to agricultural extension services and the sources of extension services preferred and accessed by farmers. An ordered logistic regression model was used to analyse data on 360 sample households from a primary survey conducted in western Uttar Pradesh, India. The study finds that farmers’ decisions to engage in agricultural extension programmes are significantly influenced by factors such as education level, gender, farming experience, social group, group membership, farm size, credit access, awareness of the extension scheme, farmers’ perception, and distance from extension sources. The most intriguing finding of this study is that progressive farmers, who have long been regarded as a major source of knowledge diffusion, are the most distrusted sources of information, as they are suspected of withholding vital information from potential beneficiaries. The positive relationship between farm size and ‘Access’ underlines that the extension services should revisit their strategies for targeting marginal and small farmers, who constitute over 85 percent of agricultural households, by incorporating their priorities in outreach programmes. The study suggests that marginal and small farmers’ productive potential could still be greatly augmented by appropriate technology, advisory services, guidance, and improved market access. Also, the perception of poor quality of the public extension services could be corrected by initiatives aimed at building up extension workers’ capacity.
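
The ordered-logit machinery behind the model described above can be sketched as follows; the feature names, coefficients, and cutpoints are invented for illustration and are not estimates from the study:

```python
import numpy as np

# Ordered logit sketch: cumulative probabilities P(Y <= j) = sigmoid(theta_j - x.beta)
# with ordered cutpoints theta_j, and category probabilities as their differences.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ordered_logit_probs(x, beta, cutpoints):
    """Category probabilities for an ordered outcome with len(cutpoints)+1 levels."""
    eta = x @ beta
    cum = sigmoid(np.array(cutpoints) - eta)    # P(Y <= j), j = 1..K-1
    cum = np.concatenate([[0.0], cum, [1.0]])
    return np.diff(cum)                         # P(Y = j)

# Hypothetical farmer: [education years, farm size (ha), credit access (0/1)]
x = np.array([8.0, 1.5, 1.0])
beta = np.array([0.15, 0.40, 0.60])   # invented effects on access
cutpoints = [1.0, 2.5, 4.0]           # three cutpoints -> four ordered access levels

p = ordered_logit_probs(x, beta, cutpoints)
print(p, p.sum())  # four category probabilities, summing to 1
```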

Keywords: agriculture, access, extension services, ordered logistic regression

Procedia PDF Downloads 178
1315 A Paradigm Model of Educational Policy Review Strategies to Develop Professional Schools

Authors: Farhad Shafiepour Motlagh, Narges Salehi

Abstract:

Purpose: The aim of the present study was a paradigm model of educational policy review strategies for the development of professional schools in Iran. Research Methodology: The research method was based on grounded theory. The statistical population included all articles published between 2010 and 2022, and purposive sampling was applied to the point of theoretical saturation, yielding 31 articles. For data analysis, open coding, axial coding, and selective coding were used. Results: The results showed that the causal conditions include social requirements (social expectations, educational justice, social justice), technology requirements (use of information and communication technology, use of new learning methods), and educational requirements (development of the educational territory, development of educational tools, and development of learning methods); the contextual conditions include two dimensions (a motivational-psychological context and a context of participation and cooperation); the strategic conditions include decentralization, delegation, and organizational restructuring; the intervening conditions include poor knowledge of human resources and centralized system governance; and the outcomes obtained were school productivity, school professionalism, and graduate entry into the labor market. Conclusion: A review of educational policy is necessary to develop Iran's professional schools, and this depends on decentralization, delegation, and, of course, the empowerment of school principals.

Keywords: school productivity, professional schools, educational policy, paradigm

Procedia PDF Downloads 171
1314 An Analysis of the Regression Hypothesis from a Shona Broca’s Aphasia Perspective

Authors: Esther Mafunda, Simbarashe Muparangi

Abstract:

The present paper tests the applicability of the Regression Hypothesis to the pathological language dissolution of a Shona-speaking male adult with Broca’s aphasia. It particularly assesses the prediction of the Regression Hypothesis, which states that the process by which language is forgotten is the reverse of the process by which it was acquired. The main aim of the paper is to find out whether there are mirror symmetries between the L1 acquisition and L1 dissolution of tense in Shona and, if so, what might cause these regression patterns. The paper also seeks to highlight the practical contributions that linguistic theory can make to solving language-related problems. Data were collected from a 46-year-old male adult with Broca’s aphasia who was receiving speech therapy at St Giles Rehabilitation Centre in Harare, Zimbabwe. The primary data elicitation method was experimental, using the probe technique. The Shona version of the TART (Test for Assessing Reference of Time), in the form of sequencing pictures, was used to assess tense in the participant with Broca’s aphasia and in a 3.5-year-old child. Using SPSS (Statistical Package for the Social Sciences) and Excel analysis, it was established that the use of the future tense was impaired in the Shona speaker with Broca’s aphasia, whilst the present and past tenses were intact. However, though the past tense was intact in the adult with Broca’s aphasia, reference was made to the remote past. The use of the future tense was also found to be difficult for the 3.5-year-old child, while no difficulties were encountered in using the present and past tenses. This means that mirror symmetries were found between the L1 acquisition and L1 dissolution of tense in Shona. On the basis of these results, it can be concluded that the use of tense by a Shona adult with Broca’s aphasia supports the Regression Hypothesis. The findings of this study are important for speech therapy in the context of Zimbabwe. The study also contributes to Bantu linguistics in general and to Shona linguistics in particular. Further studies could focus on the rest of the Bantu language varieties in terms of aphasia.

Keywords: Broca’s Aphasia, regression hypothesis, Shona, language dissolution

Procedia PDF Downloads 61
1313 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit

Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana

Abstract:

Internet of Things (IoT) devices and edge computing have become two of the most discussed innovations for their potential to improve and disrupt traditional business and industry alike. This potential has been underlined by new challenges such as the COVID-19 pandemic, which posed a danger to the workforce and to established business processes. Alongside the drastically changed business landscape left in the aftermath of the pandemic, the looming threats of a global energy crisis, global warming, and increasingly heated global politics raise the question of how emerging technologies such as edge computing, together with specially designed visual processing units, can become opportunities for business. The literature reviewed explains how the Internet of Things and this disruptive wave affect current business and how businesses need to adapt to changes in the market and the world; benchmark tests of newer consumer-market devices, such as IoT devices equipped with edge computing hardware, illustrate how efficiency can be increased and the risks posed by current and looming crises reduced. Throughout the paper, we explain the technologies that shape the present situation and why they are innovations that will change traditional practice, through brief introductions to cloud computing, edge computing, and the Internet of Things, and how they lead into the future.

Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification

Procedia PDF Downloads 125
1312 Jurisdictional Issues between Competition Law and Data Protection Law in Protection of Privacy of Online Consumers

Authors: Pankhudi Khandelwal

Abstract:

The revenue models of digital giants such as Facebook and Google rely on targeted advertising, and such a model requires huge amounts of consumer data. While data protection law deals with the protection of personal data, this data is acquired by the companies on the basis of consent, performance of a contract, or legitimate interests. This paper analyses the role that competition law can play in closing these loopholes for the protection of the data and privacy of online consumers. Digital markets have certain distinctive features, such as network effects and feedback loops, which give incumbents in these markets a first-mover advantage. This creates a situation where the winner takes all, creating entry barriers and concentration in the market. It has also been seen that this dominant position is then used by the undertakings for leveraging into other markets. This can harm consumers in the form of less privacy, less choice, and stifled innovation, as seen in the Facebook Cambridge Analytica, Google Shopping, and Google Android cases. Therefore, the article aims to provide a legal framework wherein data protection law and competition law can come together to provide balance in regulating digital markets. The issue has become more relevant in light of the Facebook decision by the German competition authority, in which it was held that Facebook had abused its dominant position by not complying with data protection rules, which constituted an exploitative practice. The paper looks into the jurisdictional boundaries that the data protection and competition authorities can work from and suggests ex ante regulation through data protection law and ex post regulation through competition law. It further suggests a change in the consumer welfare standard whereby harm to privacy should be considered an indicator of low quality.

Keywords: data protection, dominance, ex ante regulation, ex post regulation

Procedia PDF Downloads 135
1311 Machine Learning Techniques to Predict Cyberbullying and Improve Social Work Interventions

Authors: Oscar E. Cariceo, Claudia V. Casal

Abstract:

Machine learning offers a set of techniques to support social work interventions and can inform practitioners’ decisions by predicting new behaviors based on data produced by organizations, service agencies, users, clients, or individuals. Machine learning techniques comprise a set of generalizable algorithms that are data-driven, which means that rules and solutions are derived by examining data, based on the patterns present within any data set. In other words, the goal of machine learning is to teach computers through 'examples': training on data to test specific hypotheses and predict what a certain outcome would be, based on a current scenario, and to improve that prediction with experience. Machine learning can be classified into two general categories depending on the nature of the problem to be tackled. The first, supervised learning, involves a dataset whose outputs are already known. Supervised learning problems are categorized into regression problems, which involve prediction of quantitative variables using a continuous function, and classification problems, which seek to predict results for discrete qualitative variables. For social work research, machine learning generates predictions as a key element in improving social interventions on complex social issues by providing better inference from data and establishing more precise estimated effects, for example in services that seek to improve their outcomes. This paper presents the results of a classification algorithm to predict cyberbullying among adolescents. Data were retrieved from the National Polyvictimization Survey conducted by the government of Chile in 2017. A logistic regression model was created to predict whether an adolescent would experience cyberbullying based on gender, age, grade, type of school, and self-esteem. The model can predict with an accuracy of 59.8% whether an adolescent will suffer cyberbullying. These results can help to promote programs to prevent cyberbullying at schools and to improve evidence-based practice.
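
A minimal version of the kind of logistic-regression classifier the abstract describes can be sketched on synthetic data (the real study used survey features such as gender, age, grade, school type, and self-esteem; the features here are random stand-ins):

```python
import numpy as np

# Minimal logistic regression trained by plain gradient descent on synthetic
# data. The three columns of X stand in for survey features; y = 1 marks an
# adolescent who experienced cyberbullying in this toy setup.

rng = np.random.default_rng(42)
n = 500
X = rng.normal(size=(n, 3))                   # synthetic feature matrix
true_w = np.array([1.5, -2.0, 0.8])           # invented "true" effects
p = 1 / (1 + np.exp(-(X @ true_w)))
y = (rng.uniform(size=n) < p).astype(float)   # noisy binary labels

w = np.zeros(3)
for _ in range(2000):                         # gradient descent on log-loss
    preds = 1 / (1 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (preds - y) / n

accuracy = np.mean(((1 / (1 + np.exp(-(X @ w)))) > 0.5) == y)
print(f"accuracy: {accuracy:.3f}")
```

On real survey data one would add train/test splitting and regularization; here the point is only the shape of the model the abstract reports at 59.8% accuracy.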

Keywords: cyberbullying, evidence based practice, machine learning, social work research

Procedia PDF Downloads 141
1310 International Comparative Study of International Financial Reporting Standards Adoption and Earnings Quality: Effects of Differences in Accounting Standards, Industry Category, and Country Characteristics

Authors: Ichiro Mukai

Abstract:

The purpose of this study is to investigate whether firms applying International Financial Reporting Standards (IFRS) provide higher-quality and more comparable earnings information, useful for the decision making of information users, relative to firms applying local Generally Accepted Accounting Principles (GAAP). Focus is placed on the earnings quality of listed firms in several developed countries: Australia, Canada, France, Germany, Japan, the United Kingdom (UK), and the United States (US). Except in Japan and the US, the adoption of IFRS is mandatory for listed firms in these countries. In Japan, the application of IFRS is allowed for specific listed firms. In the US, foreign firms listed on the US securities market are permitted to apply IFRS, but listed domestic firms are prohibited from doing so. In this paper, the differences in earnings quality are compared between firms applying local GAAP and those applying IFRS in each country and industry category, and the reasons for the differences in earnings quality are analyzed using various factors. The results show that, although the earnings quality of firms applying IFRS is higher than that of firms applying local GAAP, this varies with country and industry category. Thus, even if a single set of global accounting standards were used for all listed firms worldwide, it would be difficult to establish comparability of financial information among global firms. These findings imply that the various circumstances surrounding firms, industries, and countries influence business operations and give rise to the differences in earnings quality.

Keywords: accruals, earnings quality, IFRS, information comparability

Procedia PDF Downloads 141
1309 An Investigation into the Crystallization Tendency/Kinetics of Amorphous Active Pharmaceutical Ingredients: A Case Study with Dipyridamole and Cinnarizine

Authors: Shrawan Baghel, Helen Cathcart, Niall J. O'Reilly

Abstract:

Amorphous drug formulations have great potential to enhance the solubility, and thus the bioavailability, of BCS class II drugs. However, the higher free energy and molecular mobility of the amorphous form lower the activation energy barrier for crystallization and thermodynamically drive the system towards the crystalline state, making such formulations unstable. Accurate determination of the crystallization tendency/kinetics is therefore key to the successful design and development of these systems. In this study, dipyridamole (DPM) and cinnarizine (CNZ) have been selected as model compounds. Thermodynamic fragility (m_T) is measured from the heat capacity change at the glass transition temperature (Tg), whereas dynamic fragility (m_D) is evaluated using methods based on the extrapolation of configurational entropy to zero (m_D,CE) and on the heating rate dependence of Tg (m_D,Tg). The mean relaxation time of the amorphous drugs was calculated from the Vogel-Tammann-Fulcher (VTF) equation. Furthermore, the correlation between fragility and glass forming ability (GFA) of the model drugs has been established, and the relevance of these parameters to the crystallization of amorphous drugs is assessed. Moreover, the crystallization kinetics of the model drugs under isothermal conditions has been studied using the Johnson-Mehl-Avrami (JMA) approach to determine the Avrami constant 'n', which provides insight into the mechanism of crystallization. To probe the crystallization mechanism further, the non-isothermal crystallization kinetics of the model systems was also analysed by statistically fitting the crystallization data to 15 different kinetic models, and the relevance of a model-free kinetic approach has been established. In addition, the crystallization mechanism for DPM and CNZ at each extent of transformation has been predicted. The calculated fragility, glass forming ability (GFA) and crystallization kinetics are found to be in good agreement with the stability prediction of amorphous solid dispersions.
Thus, this research work takes a multidisciplinary approach to establishing fragility, GFA and crystallization kinetics as stability predictors for amorphous drug formulations.
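The two kinetic descriptions named in the abstract, the VTF relaxation time and the JMA transformation model, can be sketched numerically. A minimal sketch follows; all parameter values (tau0, D, T0, k, n) are invented for illustration and are not the fitted values for DPM or CNZ.

```python
import numpy as np

def vtf_relaxation_time(T, tau0=1e-14, D=10.0, T0=250.0):
    """Mean relaxation time from the VTF equation:
    tau = tau0 * exp(D * T0 / (T - T0)), with T in kelvin."""
    return tau0 * np.exp(D * T0 / (T - T0))

def jma_fraction(t, k, n):
    """Crystallized fraction from the JMA model: x(t) = 1 - exp(-(k*t)**n)."""
    return 1.0 - np.exp(-(k * t) ** n)

def avrami_exponent(t, x):
    """Recover the Avrami constant n from the linearized JMA plot:
    ln(-ln(1 - x)) = n*ln(k) + n*ln(t), so n is the slope vs ln(t)."""
    y = np.log(-np.log(1.0 - x))
    slope, _ = np.polyfit(np.log(t), y, 1)
    return slope

t = np.linspace(1.0, 300.0, 200)        # isothermal hold time, s
x = jma_fraction(t, k=0.01, n=3.0)      # synthetic transformation data
print(round(avrami_exponent(t, x), 2))  # recovers n = 3.0
```

The linearized fit is the standard way an isothermal Avrami exponent is extracted; in practice x(t) would come from DSC data rather than a synthetic curve.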

Keywords: amorphous, fragility, glass forming ability, molecular mobility, mean relaxation time, crystallization kinetics, stability

Procedia PDF Downloads 323
1308 Study of Mechanical Properties of Glutarylated Jute Fiber Reinforced Epoxy Composites

Authors: V. Manush Nandan, K. Lokdeep, R. Vimal, K. Hari Hara Subramanyan, C. Aswin, V. Logeswaran

Abstract:

Natural fibers have attained a potential market in the composite industry because of the huge environmental impact caused by synthetic fibers. Among natural fibers, jute fibers are among the most abundant plant fibers and are produced mainly in countries such as India. Even though there is good motivation to utilize this natural supplement, the strength of natural fiber composites is still a topic of discussion. In recent years, many researchers have shown interest in the chemical modification of natural fibers to improve various mechanical and thermal properties. In the present study, jute fibers have been modified chemically using glutaric anhydride at concentrations of 5%, 10%, 20%, and 30%. The glutaric anhydride solution is prepared by dissolving different quantities of glutaric anhydride in benzene and dimethyl sulfoxide using a sodium formate catalyst. The jute fiber mats have been treated by the method of retting at time intervals of 3, 6, 12, 24, and 36 hours. The chemical modification of the treated fibers has been confirmed by infrared spectroscopy. The degree of modification increases with retention time, but longer retention times damaged the fiber structure. The unmodified fibers and the glutarylated fibers at different retention times were reinforced with an epoxy matrix at room temperature. The tensile and flexural strengths of the composites are analyzed in detail. Among these, the composites made with glutarylated fiber showed better mechanical properties than those made with unmodified fiber.

Keywords: flexural properties, glutarylation, glutaric anhydride, tensile properties

Procedia PDF Downloads 159
1307 A Case Study on Indian Translation Ecosystem of Point-Of-Care Solutions

Authors: Tripta Dixit, Smita Sahu, William Selvamurthy, Sadhana Srivastava

Abstract:

The translation of healthcare technologies is an expensive, complex affair. Current healthcare challenges in Asian countries, and their efforts to meet the Millennium Development Goals (MDGs), necessitate continuous technological advancement to save countless lives, improve quality of life, and support socio-economic development. India's consistently improving global innovation index (57) demonstrates its innovation potential, but access to health care is asymmetric and lacks priority in India. Therefore, there is an utmost need for a robust translation system for point-of-care (POC) solutions: inexpensive, low-maintenance, reliable, and easy-to-use diagnostic technologies. A few cases of POC technologies, viz. ELISA-based diagnostic kits for a regional viral disease and a device for the detection of cancerous lesions, were studied to understand the process and challenges involved in their translation. Accordingly, the entire translation ecosystem was summarized, proposing a nexus of actors such as the technology developer, technology transferor, technology receiver, funding entities, and government/regulatory bodies, and their effect on the translation of different medical technologies. This study highlights the roles and concerns pertaining to these actors for POC solutions, such as unsystematic and unvalidated research roadmaps, low profit propositions, an unfocused approach to up-scaling, low market acceptability, and a multiple-window regulatory framework. This provides an opportunity to devise solutions to overcome problem areas in the translation path.

Keywords: healthcare technologies, point-of-care solutions, public health, translation

Procedia PDF Downloads 140
1306 User-Perceived Quality Factors for Certification Model of Web-Based System

Authors: Jamaiah H. Yahaya, Aziz Deraman, Abdul Razak Hamdan, Yusmadi Yah Jusoh

Abstract:

One of the most essential issues for software products is maintaining their relevancy to the dynamics of users' requirements and expectations. Many studies have been carried out on the quality aspects of software products to overcome these problems. Previous software quality assessment models and metrics have been introduced, each with strengths and limitations. In order to enhance the assurance and buoyancy of software products, certification models have been introduced and developed. From our previous experience in certification exercises and case studies conducted in collaboration with several agencies in Malaysia, the need for a user-based software certification approach was identified. The emergence of social network applications, new development approaches such as agile methods, and the wide variety of software on the market have led to the domination of users over the software. As software becomes more accessible to the public through internet applications, users are becoming more critical of the quality of the services provided by the software. There are several categories of users in web-based systems, with different interests and perspectives. The classifications and metrics were identified through brainstorming sessions involving researchers, users, and experts in this area. This new paradigm in software quality assessment is the main focus of our research. This paper discusses the classifications of users in web-based software system assessment and their associated factors and metrics for quality measurement. The quality model is derived based on the IEEE structure and the FCM model. These developments are beneficial and valuable for overcoming the constraints on, and improving the application of, software certification models in the future.
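A factor-and-metric quality model of the kind described is typically operationalized as a weighted aggregation of per-factor scores. The sketch below is purely hypothetical: the factor names, weights, and scores are invented for illustration and are not taken from the paper's model.

```python
def quality_score(scores, weights):
    """Weighted mean of per-factor scores (0-10 scale)."""
    assert set(scores) == set(weights), "each factor needs a weight"
    total_w = sum(weights.values())
    return sum(scores[f] * weights[f] for f in scores) / total_w

# Hypothetical user-perceived factors and importance weights.
user_scores = {"usability": 8.0, "reliability": 7.0,
               "security": 9.0, "performance": 6.0}
user_weights = {"usability": 0.35, "reliability": 0.25,
                "security": 0.25, "performance": 0.15}
print(round(quality_score(user_scores, user_weights), 2))  # → 7.7
```

In a certification setting, different user categories would supply different weight vectors, and the certified level would be read off the aggregated score.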

Keywords: software certification model, user centric approach, software quality factors, metrics and measurements, web-based system

Procedia PDF Downloads 375
1305 Preliminary Flow Sheet for Recycling of Spent Lithium-Ion Batteries

Authors: Mohammad Ali Rajaeifar, Oliver Heidrich

Abstract:

Nowadays, Li-ion batteries are widely disseminated, and the battery market is expected to experience huge growth during the next decade, especially in terms of traction batteries. As the automotive industry moves towards the electrification of the powertrain, more raw/critical materials and energy are extracted, while concerns are raised regarding the scarcity of these materials as well as the environmental fate of spent batteries. In this regard, recycling could play a vital role in the supply chain, leading to the reutilization of key battery materials and reducing the environmental burden related to the use of batteries. The aim of this paper is to review previous and state-of-the-art treatments for the recycling of Li-ion batteries. All treatment methods for recycling Li-ion batteries, from mechanical, mild-thermal, pyrometallurgical and hydrometallurgical to combined methods, were considered in the study. Various treatment methods are economical but not environmentally friendly, or vice versa. This is because the benefits of Li-ion battery recycling can be affected by different factors, such as the amount of spent batteries available, the quality of the recovered material, the energy and material consumption of the process itself, and the environmental burdens caused by the required logistics. Finally, a preliminary flow sheet of a possible route for recycling spent Li-ion batteries is presented in this study. Overall, it is worth noting that recycling processes generally consume a great deal of energy and auxiliary materials. Moreover, the collection of spent products from waste streams requires additional environmental effort. Therefore, developing and optimizing efficient collection and separation technologies is essential to achieving sustainability goals.

Keywords: hydrometallurgical treatment, Li-ion batteries, mild-thermal treatment, mechanical treatment, recycling, pyrometallurgical treatment

Procedia PDF Downloads 83
1304 Aerodynamic Optimization of Oblique Biplane by Using Supercritical Airfoil

Authors: Asma Abdullah, Awais Khan, Reem Al-Ghumlasi, Pritam Kumari, Yasir Nawaz

Abstract:

Introduction: This study revisits two oblique wing configurations initiated by German aerodynamicists during WWII. Because the war ended, the project was never completed, and this research targets the revival of the German oblique biplane configuration. The research draws upon the use of two oblique wings mounted on the top and bottom of the fuselage through a single pivot. The wings are capable of sweeping at different angles, ranging from 0° at takeoff to 60° at cruising altitude. The right half of the top wing behaves like a forward-swept wing and the left half like a backward-swept wing; the reverse applies to the lower wing. These opposite deflections of the top and lower wings cancel out the rotary moments created by each wing, so the aircraft remains stable. Problem to better understand or solve: The purpose of this research is to investigate the potential for improved aerodynamic performance and flight efficiency over a wide range of sweep angles. This will help identify the sweep angle at which the aircraft possesses both stability and better aerodynamics. Explaining the methods used: The aircraft configuration is designed in SolidWorks, after which a series of aerodynamic predictions are conducted in both the subsonic and the supersonic flow regimes. Computations are carried out in Ansys Fluent. The results are then compared with theoretical and flight data for aircraft of the same category (the AD-1) and with a wind tunnel model tested at subsonic speed. Results: At zero sweep angle, the aircraft has an excellent lift coefficient, almost double that found for fighter jets. To attain supersonic speed, the sweep angle is increased to a maximum of 60 degrees, depending on the mission profile.
General findings: The oblique biplane could be a future fighter aircraft because of its high performance in terms of aerodynamics, cost, structural design, and weight.

Keywords: biplane, oblique wing, sweep angle, supercritical airfoil

Procedia PDF Downloads 249
1303 Forensic Methods Used for the Verification of the Authenticity of Prints

Authors: Olivia Rybak-Karkosz

Abstract:

This paper presents the results of scientific research on methods of forging art prints and their elements, such as signatures or provenance, and on the forensic science methods that might be used to verify their authenticity. In recent decades, the art market has seen significant interest in purchasing prints. They are considered an economical alternative to paintings and a considerable investment. However, the authenticity of an art print is difficult to establish, as similar visual effects can be achieved with drawings or photocopies. The latter are easy to make with a home printer and are then offered at flea markets or internet auctions as genuine prints. This apparent ease of forgery, together with the difficulty of distinguishing printmaking techniques, was the main reason this research was undertaken. The lack of scientific methods dedicated to disclosing such forgeries encouraged the author to examine the possibility of using forensic science methods known and used in other fields of expertise. The research methodology consisted of compiling representative forgery samples collected in selected museums based in Poland and a few in Germany and Austria. This allowed the author to present a typology of the methods used to forge art prints. Given that banknotes and securities are among the best-known examples of printed documents, it seems appropriate to propose, for print verification, the use of methods for detecting counterfeit currency. These methods include the examination of ink, paper, and watermarks. On prints, signatures and imprints of stamps, etc., are forged as well, so the examination should be complemented with handwriting examination and forensic sphragistics. The paper concludes with a proposal to conduct a complex analysis of authenticity with the participation of an art restorer, an art historian, and a forensic expert as head of the team.

Keywords: art forgery, examination of an artwork, handwriting analysis, prints

Procedia PDF Downloads 98
1302 Exploring Alignability Effects and the Role of Information Structure in Promoting Uptake of Energy Efficient Technologies

Authors: Rebecca Hafner, David Elmes, Daniel Read

Abstract:

The current research applies decision-making theory to the problem of increasing the uptake of energy-efficient technologies in the marketplace, where uptake is currently slower than rational choice models would predict. We apply the alignable/non-alignable features effect and explore the impact of varying information structure on consumers' preference for standard versus energy-efficient technologies. In two studies, we present participants with a choice between similar (boiler vs. boiler) or dissimilar (boiler vs. heat pump) technologies, described by a list of alignable and non-alignable attributes. In Study One, there is a preference for alignability when options are similar, an effect mediated by an increased tendency to infer that missing information is the same; no effects of alignability on preference are found when options differ. One explanation for this split shift in attentional focus is a change in construal level, potentially induced by the added consideration of environmental concern. Study Two was designed to explore the interplay between alignability and construal level in greater detail. We manipulated construal level via a thought prime task prior to the same heating system choice task, and found a general preference for non-alignability, regardless of option type. We draw theoretical and applied implications for the type of information structure best suited to the promotion of energy-efficient technologies.

Keywords: alignability effects, decision making, energy-efficient technologies, sustainable behaviour change

Procedia PDF Downloads 285
1301 A Systematic Review on Development of a Cost Estimation Framework: A Case Study of Nigeria

Authors: Babatunde Dosumu, Obuks Ejohwomu, Akilu Yunusa-Kaltungo

Abstract:

Cost estimation in construction is often difficult, particularly when dealing with risks and uncertainties, which are inevitable and peculiar to developing countries like Nigeria. The direct consequences are major deviations in cost, duration, and quality. The fundamental aim of this study is to develop a framework for assessing the impacts of risk on cost estimation, which in turn causes variability between the contract sum and the final account. This is very important, as the initial estimates given to clients should reflect a certain magnitude of consistency and accuracy, upon which the client builds other planning-related activities; such a framework would also enhance the capability of construction industry professionals to better predict the final account from the contract sum. To achieve this, a systematic literature review was conducted with "cost variability" and "construction projects" as the search string within three databases: Scopus, Web of Science, and EBSCO (Business Source Premier). The results were further analyzed, and gaps in knowledge were identified. From the extensive review, it was found that the number of factors causing deviation between final accounts and contract sums ranged between 1 and 45. It was also discovered that a cost estimation framework similar to the Building Cost Information Service (BCIS) is unavailable in Nigeria, which is a major reason why initial estimates are so often inconsistent, leading to project delay, abandonment, or determination at the expense of the huge sums of money invested. It was concluded that the development of a cost estimation framework, adjudged an important tool for risk shedding rather than risk sharing in project risk management, would be a panacea to the cost estimation problems that lead to cost variability in the Nigerian construction industry by the time this ongoing Ph.D. research is completed.
It is recommended that practitioners in the construction industry always take risk into account in order to facilitate the rapid development of the construction industry in Nigeria, giving stakeholders in both the private and public sectors a more in-depth understanding of the effectiveness and efficiency of the estimation methods to be adopted.

Keywords: cost variability, construction projects, future studies, Nigeria

Procedia PDF Downloads 167
1300 Exploring the Applications of Neural Networks in the Adaptive Learning Environment

Authors: Baladitya Swaika, Rahul Khatry

Abstract:

Computer Adaptive Tests (CATs) are one of the most efficient ways of testing the cognitive abilities of students. CATs are based on Item Response Theory (IRT), in which items are selected by statistical criteria such as maximum information (or selection from the posterior) and ability is estimated with maximum-likelihood (ML) or maximum a posteriori (MAP) estimators. This study aims to combine classical and Bayesian approaches to IRT to create a dataset, which is then fed to a neural network that automates the process of ability estimation, and to compare it with traditional CAT models designed using IRT. The study uses Python as the base coding language, pymc for statistical modelling of the IRT, and scikit-learn for the neural network implementation. On building the models and comparing them, it is found that the neural-network-based model performs 7-10% worse than the IRT model for score estimation. Although it performs worse than the IRT model, the neural network model can be used beneficially in back-ends to reduce time complexity: the IRT model has to re-estimate ability every time it receives a request, whereas prediction with an already trained neural network regressor is a single step. This study also proposes a new kind of framework whereby the neural network model could incorporate feature sets beyond the normal IRT feature set and use a neural network's capacity for learning unknown functions to give rise to better CAT models. Categorical features like test type could be learnt and incorporated into IRT functions with the help of techniques like logistic regression, yielding models that may not be trivial to express via equations. Such a framework, when implemented, would be highly advantageous in psychometrics and cognitive assessment.
This study gives a brief overview of how neural networks can be used in adaptive testing, not only by reducing time complexity but also by incorporating newer and better datasets, which would eventually lead to higher-quality testing.
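The MAP ability estimation the abstract refers to can be sketched for the standard two-parameter logistic (2PL) IRT model. This is a minimal, dependency-light illustration with a grid search and a standard-normal prior; the item parameters are invented, and the paper's pymc/scikit-learn pipeline is not reproduced here.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: P(correct | theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def map_ability(responses, a, b, grid=np.linspace(-4, 4, 801)):
    """MAP estimate of ability theta given 0/1 responses and
    item discriminations a and difficulties b."""
    log_post = -0.5 * grid**2  # N(0, 1) log-prior (up to a constant)
    for r, ai, bi in zip(responses, a, b):
        p = p_correct(grid, ai, bi)
        log_post += r * np.log(p) + (1 - r) * np.log(1 - p)
    return grid[np.argmax(log_post)]

a = np.array([1.2, 0.8, 1.5, 1.0, 1.3])   # hypothetical discriminations
b = np.array([-1.0, 0.0, 0.5, 1.0, 1.5])  # hypothetical difficulties
resp = np.array([1, 1, 1, 0, 0])          # examinee got easy items right
theta_hat = map_ability(resp, a, b)
print(round(float(theta_hat), 2))
```

A grid search keeps the sketch self-contained; a production CAT would typically use Newton-Raphson on the log-posterior or report the posterior mean (EAP) instead.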

Keywords: computer adaptive tests, item response theory, machine learning, neural networks

Procedia PDF Downloads 153
1299 Changing Skills with the Transformation of Procurement Function

Authors: Ömer Faruk Ada, Türker Baş, M. Yaman Öztek

Abstract:

In this study, we investigate the skills that procurement professionals must possess in order to fulfill their developing and changing role. Market conditions, competitive pressure, and high financial costs make it more important than ever for organizations to use resources efficiently. Research shows that procurement expenses account for more than 50% of operating expenses. With the increasing profit impact of procurement, reviewing the position of the procurement function within the organization has become inevitable. This study is significant because it indicates the skills procurement professionals must have to keep in step with the transformation of procurement units from transaction-oriented to value-chain-oriented. The transformation of procurement is investigated from the perspective of procurement professionals, and we aim to answer the following research questions: • How do procurement professionals perceive their role within the organization? • How has their role changed, and what challenges have they had to face? • What portfolio of skills do they believe will enable them to fulfill their role effectively? The first part of the study consists of a literature review investigating the changing role of procurement from different perspectives. In the second part, we present the results of in-depth interviews with 15 procurement professionals, using descriptive analysis as the methodology. In the light of these results, we classified procurement skills at the operational, tactical, and strategic levels and developed a Procurement Skills Framework. This study shows the differences between how professionals and how organizations perceive purchasing; these differences are considered an important barrier to procurement transformation.
Although having the necessary skills has a significant effect on procurement professionals' ability to fulfill their role and keep pace with the transformation of the procurement function, it is not the only factor: the degree of top management and organizational support also has a direct impact on this transformation.

Keywords: procurement skills, procurement transformation, strategic procurement, value chain

Procedia PDF Downloads 391
1298 Electrochemical Bioassay for Haptoglobin Quantification: Application in Bovine Mastitis Diagnosis

Authors: Soledad Carinelli, Iñigo Fernández, José Luis González-Mora, Pedro A. Salazar-Carballo

Abstract:

Mastitis is the most significant inflammatory disease in cattle, affecting animal health and causing important economic losses on dairy farms. The disease occurs in the mammary gland or udder when opportunistic microorganisms, such as Staphylococcus aureus, Streptococcus agalactiae, or Corynebacterium bovis, invade the teat canal. According to the severity of the inflammation, mastitis can be classified as sub-clinical, clinical, or chronic. Standard methods for mastitis detection include somatic cell counts, cell culture, the electrical conductivity of the milk, and the California test (evaluation of the consistency of the "gel-like" matrix after cells are lysed with detergents). However, these assays present limitations for the accurate detection of subclinical mastitis. Recently, haptoglobin, an acute phase protein, has been proposed as a novel and effective biomarker for mastitis detection. In this work, an electrochemical biosensor based on polydopamine-modified magnetic nanoparticles (MNPs@pDA) for haptoglobin detection is reported. The MNPs@pDA were synthesized by our group and functionalized with hemoglobin, owing to its high affinity for haptoglobin. The protein was labeled with specific antibodies modified with the alkaline phosphatase enzyme for electrochemical detection using an electroactive substrate (1-naphthyl phosphate) by differential pulse voltammetry. After optimization of the assay parameters, haptoglobin determination was evaluated in milk. The strategy presented in this work shows a wide detection range, achieving a limit of detection of 43 ng/mL. The accuracy of the strategy was determined by recovery assays, giving 84% and 94.5% for two haptoglobin levels around the cut-off value. Real milk samples were tested, and the predictive capacity of the electrochemical biosensor was compared with a commercial haptoglobin ELISA kit.
The performance of the assay demonstrates that this strategy is an excellent and realistic alternative screening method for sub-clinical bovine mastitis detection.
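Recovery figures of the kind quoted in the abstract come from a standard spiked-recovery calculation, which can be sketched as follows; the concentration values below are invented for illustration, not the study's data.

```python
def recovery_percent(measured_spiked, measured_blank, added):
    """Spiked recovery: fraction of the added analyte that the assay
    reads back, after subtracting the unspiked (blank) sample reading.
    All quantities in the same concentration units (e.g. ng/mL)."""
    return (measured_spiked - measured_blank) / added * 100.0

# Hypothetical example: a milk sample reading 48 ng/mL is spiked with
# 100 ng/mL of haptoglobin and then reads 142 ng/mL.
print(round(recovery_percent(measured_spiked=142.0,
                             measured_blank=48.0,
                             added=100.0), 1))  # → 94.0
```

Recoveries near 100% indicate that the matrix (here, milk) does not strongly bias the measurement at that concentration level.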

Keywords: bovine mastitis, haptoglobin, electrochemistry, magnetic nanoparticles, polydopamine

Procedia PDF Downloads 132
1297 Modelling Dengue Disease With Climate Variables Using Geospatial Data For Mekong River Delta Region of Vietnam

Authors: Thi Thanh Nga Pham, Damien Philippon, Alexis Drogoul, Thi Thu Thuy Nguyen, Tien Cong Nguyen

Abstract:

The Mekong River Delta region of Vietnam is recognized as one of the areas most vulnerable to climate change, due to flooding and sea level rise, and therefore faces an increased burden of climate-change-related diseases. Changes in temperature and precipitation are likely to alter the incidence and distribution of vector-borne diseases such as dengue fever. In this region, the dengue epidemic peaks around July to September, during the rainy season, and climate is believed to be an important factor in dengue transmission. This study aims to enhance the capacity for dengue prediction by relating dengue incidence to climate and environmental variables for the Mekong River Delta of Vietnam during 2005-2015. Mathematical models of vector-host infectious disease, comprising larva, mosquito, and human compartments, were used to calculate the impact of climate on dengue transmission, incorporating geospatial data as model input. Monthly dengue incidence data were collected at the provincial level. Precipitation data were extracted from GSMaP (Global Satellite Mapping of Precipitation) satellite observations, and land surface temperature and land cover data were taken from MODIS. The seasonal reproduction number was estimated to evaluate the potential, severity, and persistence of dengue infection, while the final infected number was derived to check for dengue outbreaks. The results show that dengue infection depends on the seasonal variation of climate variables, with a peak during the rainy season, and the predicted dengue incidence follows this dynamic well for the whole studied region. However, the largest outbreak, in 2007, was not captured by the model, reflecting the nonlinear dependence of transmission on climate. Other possible effects are discussed to address the limitations of the model. This suggests the need to consider both climate variables and other sources of variability across temporal and spatial scales.
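The model class described, a vector-host compartmental model driven by seasonal climate forcing, can be sketched in a highly simplified form: a Ross-Macdonald-style SIR host coupled to an SI vector population whose recruitment varies seasonally. All parameter values below are illustrative assumptions, not the study's calibrated values, and the larval stage is collapsed into the recruitment term.

```python
import math

def simulate(days=365, dt=0.1):
    """Forward-Euler integration of a seasonal SIR-host / SI-vector model.
    Returns the peak infected host fraction and the day it occurs."""
    Sh, Ih, Rh = 0.999, 0.001, 0.0   # host fractions (S, I, R)
    Sv, Iv = 1.0, 0.0                # vector fractions (S, I)
    bh, bv = 0.30, 0.30              # vector->host, host->vector rates /day
    gamma, mu = 1 / 7, 1 / 14        # host recovery, vector turnover /day
    peak_Ih, peak_day = Ih, 0.0
    for i in range(int(days / dt)):
        t = i * dt
        # Seasonal vector recruitment, peaking mid-year (rainy season proxy).
        season = 1.0 + 0.5 * math.sin(2 * math.pi * (t - 120) / 365)
        dSh = -bh * Sh * Iv
        dIh = bh * Sh * Iv - gamma * Ih
        dRh = gamma * Ih
        dSv = mu * season - bv * Sv * Ih - mu * Sv
        dIv = bv * Sv * Ih - mu * Iv
        Sh += dSh * dt; Ih += dIh * dt; Rh += dRh * dt
        Sv += dSv * dt; Iv += dIv * dt
        if Ih > peak_Ih:
            peak_Ih, peak_day = Ih, t
    return peak_Ih, peak_day

peak, day = simulate()
print(f"peak infected fraction {peak:.3f} around day {day:.0f}")
```

A real study would fit such a model to monthly incidence with satellite-derived temperature and precipitation modulating the vector parameters, which is precisely where the nonlinear climate dependence noted above enters.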

Keywords: infectious disease, dengue, geospatial data, climate

Procedia PDF Downloads 346
1296 FRATSAN: A New Software for Fractal Analysis of Signals

Authors: Hamidreza Namazi

Abstract:

Fractal analysis assesses the fractal characteristics of data. It consists of several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, and networks. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; other essential characteristics have to be considered as well. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: the fractal dimension, Jeffrey's measure, and the Hurst exponent. After computing these measures, the software plots a graph for each. Besides computing the three measures, the software can classify whether or not a signal is fractal. The software uses a dynamic method of analysis for all the measures: a sliding window is selected with a width equal to 10% of the total number of data entries, and this window is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, giving the user an acute analysis of the data. To test the performance of the software, a set of EEG signals was given as input, and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but also for other purposes; for instance, by analyzing the Hurst exponent plot of an EEG signal from a patient with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
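A sliding-window Hurst exponent of the kind described can be sketched with classical rescaled-range (R/S) analysis. FRATSAN's exact algorithm is not published here, so the details below (dyadic subseries lengths, a non-overlapping 10%-width window rather than a one-sample step, white-noise test input) are assumptions for illustration.

```python
import numpy as np

def rs_hurst(x):
    """Hurst exponent of a 1-D series x via rescaled-range (R/S) analysis:
    slope of log(R/S) vs log(window size) over dyadic subseries lengths."""
    n = len(x)
    sizes = [n // (2 ** k) for k in range(4) if n // (2 ** k) >= 8]
    log_s, log_rs = [], []
    for s in sizes:
        rs_vals = []
        for start in range(0, n - s + 1, s):
            seg = x[start:start + s]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation
            r = dev.max() - dev.min()           # range R
            sd = seg.std()                      # scale S
            if sd > 0:
                rs_vals.append(r / sd)
        log_s.append(np.log(s))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_s, log_rs, 1)
    return slope

rng = np.random.default_rng(0)
signal = rng.standard_normal(2000)   # white noise: H should be near 0.5
win = len(signal) // 10              # 10%-width window, as in FRATSAN
hs = [rs_hurst(signal[i:i + win]) for i in range(0, len(signal) - win, win)]
print(round(float(np.mean(hs)), 2))
```

For a seizure-onset application, one would plot `hs` over time and look for the sudden changes the abstract mentions; note that raw R/S estimates are biased upward for short windows, so corrected estimators are often preferred in practice.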

Keywords: EEG signals, fractal analysis, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 424
1295 Thailand and Sino-Japanese Relations in the Early Twentieth Century

Authors: Mizuno Norihito

Abstract:

This study examines Japanese views of Thailand, primarily in the 1920s and 1930s, through the analysis of documents published by the Office of the Governor-General of Taiwan (Taiwan Sotokufu) and its affiliated organizations. Japan regarded Taiwan, under its control since 1895, as a foothold for making inroads into the South, and the Governor-General's office was active in investigations and intelligence gathering in Southeast Asia, as well as in the southern part of the Chinese continent. Documents published by the Governor-General's office and its related organizations, especially in the two decades following the First World War, reveal that the Japanese paid close attention to the presence of the Thai-Chinese during this period. This is not surprising: desiring to penetrate the Thai market, as well as the markets of the rest of Southeast Asia, the Japanese could not ignore the Thai-Chinese because of their local economic influence. The increased Japanese concern with the Thai-Chinese toward the end of the 1920s and throughout the 1930s was, moreover, intertwined with the increased tension between China and Japan. In other words, Thailand, like the rest of Southeast Asia, became another arena of Sino-Japanese confrontation. The rise of anti-Japanese nationalism in China spread to the Thai-Chinese communities and threatened Japanese economic activities in the country. At the same time, however, the Japanese found that Thai-Chinese concert with anti-Japanese movements in China did not necessarily match their business interests, and that the Thai government's efforts to assimilate the Thai-Chinese into Thai society, together with its strategic approach to Japan in the late 1930s, hampered their anti-Japanese actions.

Keywords: Japanese-Thai Relations, Sino-Japanese relations, Thai Chinese, Overseas Japanese

Procedia PDF Downloads 307