Search results for: uncertainty quantification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1434

474 The Term of Intellectual Property and Artificial Intelligence

Authors: Yusuf Turan

Abstract:

According to the World Intellectual Property Organization, "Intellectual property (IP) refers to creations of the mind, such as inventions; literary and artistic works; designs; and symbols, names and images used in commerce." Two points in this definition are essential: intellectual property is the result of intellectual activity carried out by one or more PERSONS, and it takes the form of INNOVATION. A brief review of the history and development of the relevant definitions shows that these two points have remained constant and that intellectual property law and rights have been shaped around them. As the scope of the term intellectual property expands with the development of technology, especially in the field of artificial intelligence, questions such as "Can artificial intelligence be an inventor?" need to be resolved within that expanding scope. In recent years, it was ruled in the USA that the artificial intelligence named DABUS did not meet the definition of an "individual" and therefore could not be named as an inventor. With developing technology, it is obvious that we will encounter such situations much more frequently in the field of intellectual property. While expanding the scope, we must determine how to decide who performs the mental activity or creativity that we regard as indispensable to inventorship. As a result of all these problems and innovative situations, it is clear that not only intellectual property law and rights but also their definitions need to be updated and improved. Ignoring situations that fall outside the scope of the current term of intellectual property does not solve the problem and creates uncertainty. The fact that laws and definitions that have operated on the same theories for years exclude today's innovative technologies from their scope contradicts intellectual property, which is described as a new and innovative field. Today, as poetry, paintings, animation, music and even theatre works are created with artificial intelligence, it must be recognized that the definition of intellectual property has to be revised.

Keywords: artificial intelligence, innovation, the term of intellectual property, right

Procedia PDF Downloads 58
473 Quantification of the Erosion Effect on Small Caliber Guns: Experimental and Numerical Analysis

Authors: Dhouibi Mohamed, Stirbu Bogdan, Chabotier André, Pirlot Marc

Abstract:

Effects of erosion and wear on the performance of small caliber guns have been analyzed through numerical and experimental studies. Previously, mainly qualitative observations were performed, and correlations between the volume change of the chamber and the maximum pressure are limited. This paper focuses on the development of a numerical model to predict the evolution of the maximum pressure as the interior shape of the chamber changes over the weapon's life phases. To fulfill this goal, an experimental campaign, followed by a numerical simulation study, is carried out. Two test barrels, 5.56x45mm NATO and 7.62x51mm NATO, are considered. First, a Coordinate Measuring Machine (CMM) with a contact scanning probe is used to measure the interior profile of the barrels after each 300-shot cycle until they are worn out. Simultaneously, the EPVAT (Electronic Pressure Velocity and Action Time) method with a dedicated WEIBEL radar is used to measure: (i) the chamber pressure, (ii) the action time, and (iii) the bullet velocity in each barrel. Second, a numerical simulation study is carried out. A coupled interior ballistic model is developed using the dynamic finite element program LS-DYNA. In this work, two different models are elaborated: (i) a coupled Eulerian-Lagrangian model using fluid-structure interaction (FSI) techniques, and (ii) a coupled thermo-mechanical finite element model using a lumped parameter model (LPM) as a subroutine. These numerical models are validated against three experimental results: (i) the muzzle velocity, (ii) the chamber pressure, and (iii) the surface morphology of fired projectiles. Results show good agreement between experiments and numerical simulations. Next, a comparison between the two models is conducted; the projectile motions, the dynamic engraving resistances and the maximum pressures are compared and analyzed. Finally, using the database obtained, a statistical correlation between the muzzle velocity, the maximum pressure and the chamber volume is established.
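
A minimal sketch of the final correlation step described above (illustrative only, not the authors' code; all column names and numbers are placeholders, not measured data):

```python
# Fit a statistical correlation between chamber volume, maximum chamber
# pressure and muzzle velocity; placeholder values stand in for the
# EPVAT/CMM measurement database.
import pandas as pd
from sklearn.linear_model import LinearRegression

data = pd.DataFrame({
    "chamber_volume_mm3": [1770, 1795, 1820, 1850, 1880, 1915],
    "max_pressure_mpa":   [372, 368, 361, 355, 347, 340],
    "muzzle_velocity_ms": [921, 917, 912, 906, 899, 893],
})

X = data[["chamber_volume_mm3"]]
for target in ["max_pressure_mpa", "muzzle_velocity_ms"]:
    model = LinearRegression().fit(X, data[target])
    print(f"{target}: slope={model.coef_[0]:.3f}, "
          f"intercept={model.intercept_:.1f}, R^2={model.score(X, data[target]):.3f}")
```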

Keywords: engraving process, finite element analysis, gun barrel erosion, interior ballistics, statistical correlation

Procedia PDF Downloads 198
472 [Keynote Talk] The Practices and Issues of Career Education: Focusing on Career Development Course on Various Problems of Society

Authors: Azusa Katsumata

Abstract:

Several universities in Japan have introduced activities aimed at the mutual enlightenment of a diversity of people in career education. However, many programs emphasize delivering results and practicing prepared materials as planned; few focus on unexpected failures and setbacks. This way of learning is important in career education so that classmates can help each other, overcome difficulties, draw out each other's strengths, and learn from them. Seijo University in Tokyo offered an excursion-based second-year career education course focusing on various problems of society. Students learn about contraception, infertility, homelessness, and LGBT issues, and discuss these topics on the basis of the excursions. This paper aims to study the 'learning platform' created by a series of processes such as the excursion, the discussion, and the presentation. In this course, students looked back on their lives and imagined the future in concrete terms, performing tasks in groups. The students came across a range of values through lectures and conversations, thereby developing feelings of self-efficacy. We conducted a questionnaire to measure the development of career awareness in class. From the results of the questionnaire, we can see, in the example of this class, that students respected diversity and understood the importance of uncertainty and discontinuity. Although the students developed career awareness, they had not yet encountered such situations themselves and would do so only in the future, when it becomes necessary. In this class, students consciously considered social problems but did not develop the practical skills necessary to deal with them. This is acceptable for a single project, but we need to consider how such skills can be incorporated into future courses. University constitutes only a single period in life-long career formation. Thus, further research may be needed to determine whether the positive effects of career education at university continue to contribute to individual careers going forward.

Keywords: career education of university, excursion, learning platform, problems of society

Procedia PDF Downloads 253
471 Time Pressure and Its Effect at Tactical Level of Disaster Management

Authors: Agoston Restas

Abstract:

Introduction: When managing disasters, decision makers often face situations in which any early sign of a drastic change is missing and improvised decision making is therefore required. The complexity, ambiguity, uncertainty or volatility of the situation can often call for improvisation in decision making. Improvisation can occur at any level of management (strategic, operational and tactical), but at the tactical level its main driver is surely time pressure, which is certainly the biggest problem during disaster management. Methods: The author used different tools and methods to achieve his goals: one was the study of the relevant literature, another his own experience as a firefighting manager. Other results come from two surveys referred to in the article; one was an essay analysis, the other a word association test specially created for the research. Results and discussion: This article shows that, in certain situations, multi-criteria, evaluative decision-making processes simply cannot be used, or only in a limited manner. Managers, directors or commanders are often in situations that cannot be ignored and in which decisions must be made in a short time. The functional background of decisions made under time pressure, and their mechanism, which differs from the conventional one, have been studied recently, and this special decision procedure has been given the name recognition-primed decision. In the article, the author illustrates the limits of analytical decision-making, presents the general operating mechanism of recognition-primed decision-making, elaborates on its special model relevant to managers at the tactical level, and explores and systemizes the factors that facilitate (catalyze) these processes, with an example involving fire managers.

Keywords: decision making, disaster managers, recognition primed decision, model for making decisions in emergencies

Procedia PDF Downloads 249
470 Development of a Geomechanical Risk Assessment Model for Underground Openings

Authors: Ali Mortazavi

Abstract:

The main objective of this research project is to delve into the multitude of geomechanical risks associated with the various mining methods employed within the underground mining industry. Controlling geotechnical design parameters and operational factors affecting the selection of suitable mining techniques for a given underground mining condition will be considered from a risk assessment point of view. Important geomechanical challenges will be investigated as appropriate and relevant to the commonly used underground mining methods. Given the complicated nature of in-situ rock masses, the complex boundary conditions, and the operational complexities associated with various underground mining methods, the selection of a safe and economic mining operation is of paramount significance. Rock failure at varying scales within underground mining openings is always a threat to mining operations and causes human and capital losses worldwide. Geotechnical design is a major design component of all underground mines and essentially governs the safety of an underground mine. With regard to the uncertainties that exist in rock characterization prior to mine development, there are always risks associated with inappropriate design as a function of mining conditions and the selected mining method. Uncertainty often results from the inherent variability of rock masses, which in turn is a function of both the geological materials and the in-situ rock mass conditions. The focus of this research is on developing a methodology which enables a geomechanical risk assessment of given underground mining conditions. The outcome of this research is a geotechnical risk analysis algorithm, which can be used as an aid in selecting the appropriate mining method as a function of mine design parameters (e.g., in-situ rock properties, design method, and governing boundary conditions such as in-situ stress and groundwater).

Keywords: geomechanical risk assessment, rock mechanics, underground mining, rock engineering

Procedia PDF Downloads 134
469 Water Governance Perspectives on the Urmia Lake Restoration Process: Challenges and Achievements

Authors: Jalil Salimi, Mandana Asadi, Naser Fathi

Abstract:

Urmia Lake (UL) has undergone a significant decline in water levels, resulting in severe environmental, socioeconomic, and health-related challenges. This paper examines the restoration process of UL from a water governance perspective. By applying a water governance model, the study evaluates the process based on six selected principles: stakeholder engagement, transparency and accountability, effectiveness, equitable water use, adaptation capacity, and water usage efficiency. The dominance of structural and physicalist approaches to water governance has led to a weak understanding of social and environmental issues, contributing to social crises. Urgent efforts are required to address the water crisis and reform water governance in the country, making water-related issues a top national priority. The UL restoration process has achieved significant milestones, including stakeholder consensus, scientific and participatory planning, environmental vision, intergenerational justice considerations, improved institutional environment for NGOs, investments in water infrastructure, transparency promotion, environmental effectiveness, and local issue resolutions. However, challenges remain, such as power distribution imbalances, bureaucratic administration, weak conflict resolution mechanisms, financial constraints, accountability issues, limited attention to social concerns, overreliance on structural solutions, legislative shortcomings, program inflexibility, and uncertainty management weaknesses. Addressing these weaknesses and challenges is crucial for the successful restoration and sustainable governance of UL.

Keywords: evaluation, restoration process, Urmia Lake, water governance, water resource management

Procedia PDF Downloads 60
468 Treating On-Demand Bonds as Cash-In-Hand: Analyzing the Use of “Unconscionability” as a Ground for Challenging Claims for Payment under On-Demand Bonds

Authors: Asanga Gunawansa, Shenella Fonseka

Abstract:

On-demand bonds, also known as unconditional bonds, are commonplace in the construction industry as a means of safeguarding the employer from any potential non-performance by a contractor. On-demand bonds may be obtained from commercial banks, and they serve as an undertaking by the issuing bank to honour payment on demand without questioning and/or considering any dispute between the employer and the contractor in relation to the underlying contract. Thus, whether or not a breach has occurred under the underlying contract, triggering the demand for encashment by the employer, is not a question the bank needs to be concerned with. As a result, an unconditional bond allows the beneficiary to claim the money almost without any condition; an unconditional bond is therefore as good as cash in hand. In the past, establishing fraud on the part of the employer, of which the bank had knowledge, was the only ground on which a bank could dishonour a claim made under an on-demand bond. However, recent jurisprudence in common law countries shows that courts are beginning to consider unconscionable conduct on the part of the employer in claiming under an on-demand bond as a ground that contractors could rely on to prevent banks from honouring such claims. This has created uncertainty in connection with on-demand bonds and their liquidity. This paper analyzes recent judicial decisions in four common law jurisdictions, namely England, Singapore, Hong Kong, and Sri Lanka, to identify the scope of using the concept of "unconscionability" as a ground for preventing unreasonable claims for encashment of on-demand bonds. The objective of this paper is to argue that on-demand bonds have lost their effectiveness as "cash in hand" and that this is, in fact, an advantage and not an impediment to international commerce, as the purpose of such bonds should not be to provide for illegal and unconscionable conduct by the beneficiaries.

Keywords: fraud, performance guarantees, on-demand bonds, unconscionability

Procedia PDF Downloads 95
467 Age Estimation from Upper Anterior Teeth by Pulp/Tooth Ratio Using Peri-Apical X-Rays among Egyptians

Authors: Fatma Mohamed Magdy Badr El Dine, Amr Mohamed Abd Allah

Abstract:

Introduction: Age estimation of individuals is one of the crucial steps in forensic practice. Traditional methods rely on the length of the diaphysis of the long bones of the limbs, epiphyseal-diaphyseal union, fusion of the primary ossification centers, and dental eruption. However, there is a growing need for precise and reliable methods to estimate age, especially in cases where dismembered corpses, burnt bodies, or putrefied or fragmented parts are recovered. Teeth are the hardest and most indestructible structures in the human body. In recent years, assessment of the pulp/tooth area ratio, as an indirect quantification of secondary dentine deposition, has received considerable attention. However, little work has been done in Egypt on the applicability of the pulp/tooth ratio for age estimation. Aim of the Work: The present work was designed to assess Cameriere's method for age estimation from the pulp/tooth ratio of maxillary canines, central incisors and lateral incisors in a sample from the Egyptian population, and to formulate regression equations to be used as population-based standards for age determination. Material and Methods: The present study was conducted on 270 peri-apical X-rays of maxillary canines, central and lateral incisors (collected from 131 males and 139 females aged between 19 and 52 years). The pulp and tooth areas were measured using the Adobe Photoshop software program, and the pulp/tooth area ratio was computed. Linear regression equations were determined separately for canines, central and lateral incisors. Results: A significant correlation was recorded between the pulp/tooth area ratio and chronological age. The linear regression analysis revealed coefficients of determination of R² = 0.824 for canines, 0.588 for central incisors and 0.737 for lateral incisors. Three regression equations were derived. Conclusion: The pulp/tooth ratio is a useful technique for estimating age among Egyptians. Additionally, the regression equation derived from canines gave better results than those derived from the incisors.
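
A minimal sketch of how such a population-specific regression equation can be derived (illustrative only; the ratio/age pairs below are placeholders, not the study data):

```python
# Fit age = intercept + slope * (pulp/tooth area ratio) and report R^2.
import numpy as np

ratio = np.array([0.18, 0.16, 0.14, 0.12, 0.10, 0.08])   # hypothetical canine ratios
age   = np.array([22.0, 27.0, 33.0, 38.0, 44.0, 50.0])   # hypothetical ages [years]

slope, intercept = np.polyfit(ratio, age, 1)
r2 = np.corrcoef(ratio, age)[0, 1] ** 2

print(f"age = {intercept:.1f} + ({slope:.1f}) * ratio,  R^2 = {r2:.3f}")
```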

Keywords: age determination, canines, central incisors, Egypt, lateral incisors, pulp/tooth ratio

Procedia PDF Downloads 175
466 Determining Design Parameters for Sizing of Hydronic Heating Systems in Concrete Thermally Activated Building Systems

Authors: Rahmat Ali, Inamullah Khan, Amjad Naseer, Abid A. Shah

Abstract:

Hydronic heating and cooling systems in concrete-slab-based buildings are increasingly becoming a popular substitute for conventional heating and cooling systems. In exploring the materials and techniques employed, and their relative performance measures, a fair amount of uncertainty exists. This research identified the simplest method of determining the thermal field of a single hydronic pipe acting as part of a concrete slab, based on which the spacing and positioning of pipes for the best thermal performance and surface temperature control are determined. The pipe material chosen is the commonly used PEX pipe, which offers all-around performance and has a thermal conductivity of 0.5 W/mK. Concrete test samples were constructed, and their thermal fields were tested under varying input conditions. Temperature-sensing devices were embedded in the wet concrete at fixed distances from the pipe, and additional touch-sensing temperature devices were employed to determine the extent of the thermal field and for validation studies. In the first stage, it was found that the temperature at a given distance from the pipe was uniform and that heat dissipation occurred in well-defined layers. The temperature obtained in the concrete was then related to the different control parameters, including the water supply temperature. From the results, the water temperature required for a specific temperature rise in the concrete is determined. The thermally effective area is also determined, which is then used to calculate the pipe spacing and positioning for the desired level of thermal comfort.
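
A very simplified first-cut estimate in the spirit of the above, assuming steady radial conduction from a single embedded pipe and ignoring slab boundaries and surface convection (all values are assumptions, not the measured data):

```python
import math

k_concrete = 1.6   # thermal conductivity of concrete [W/mK] (assumed)
q_line = 15.0      # heat output per metre of PEX pipe [W/m] (assumed)
r_pipe = 0.008     # outer pipe radius [m] (assumed)
R_far = 0.30       # assumed radius at which the slab is back at ambient [m]

def delta_t(r):
    """Temperature rise above ambient at radius r from the pipe axis."""
    return q_line / (2.0 * math.pi * k_concrete) * math.log(R_far / r)

# Radius at which the rise drops below a 1 K target; spacing taken as twice that.
target_rise = 1.0
r_eff = R_far * math.exp(-target_rise * 2.0 * math.pi * k_concrete / q_line)

print(f"rise at pipe wall: {delta_t(r_pipe):.1f} K")
print(f"effective radius: {r_eff:.2f} m, first-cut spacing: {2.0 * r_eff:.2f} m")
```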

Keywords: thermally activated building systems, concrete slab temperature, thermal field, energy efficiency, thermal comfort, pipe spacing

Procedia PDF Downloads 322
465 Authentic Engagement for Institutional Leadership: Implications for Educational Policy and Planning

Authors: Simeon Adebayo Oladipo

Abstract:

Institutional administrators currently face pressure and challenges in their daily operations. Reasons for this include the increasing multiplicity, uncertainty and tension that permeate institutional leadership. Authentic engagement for institutional leadership is premised on the ethical foundation that the leaders in the schools are engaged. Institutional effectiveness depends on the relationship that exists between leaders and employees in the workplace. A leader's self-awareness, relational transparency, emotional control, strong moral code and accountability have a positive influence on authentic engagement, which in turn shapes leadership effectiveness. This study therefore examined the role of authentic engagement in effective school leadership and explored the interrelationship of authentic engagement indices in school leadership. The study adopted a descriptive survey research design, using a quantitative method to gather data through a questionnaire among school leaders in Lagos State tertiary institutions. The population for the study consisted of all Heads of Departments, Deans and Principal Officers in Lagos State tertiary institutions. A sample of 255 Heads of Departments, Deans and Principal Officers participated in the study. The data gathered were analyzed using descriptive and inferential statistical tools. The findings indicated that authentic engagement plays a crucial role in increasing leadership effectiveness among Heads of Departments, Deans and Principal Officers. The study recommended, among others, that effective measures are needed to enhance authentic engagement in institutional leadership practices through relevant educational support systems and effective quality control.

Keywords: authentic engagement, self-awareness, relational transparency, emotional control

Procedia PDF Downloads 63
464 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect change in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracies and sensitivity ranges, and they degrade over time. As a result, the big time-series data collected by the sensors will contain uncertainties, and sometimes the data are conflicting. In this study, we present a framework that takes advantage of the capabilities of Evidence Theory (a.k.a. Dempster-Shafer and Dezert-Smarandache Theories) for representing and managing uncertainty and conflict, to achieve fast change detection and effectively deal with complementary hypotheses. Specifically, the Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values among all sensors. Furthermore, we apply the method to estimate the minimum number of sensors that need to be combined, so that computational efficiency can be improved. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
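
A simplified, single-sensor sketch of the detection chain described above (not the authors' implementation): KL distances from a sliding-window Gaussian estimate to the pre- and post-change models are converted into normalized masses, and a CUSUM statistic is run on the log ratio of the resulting pignistic-like probabilities. The window size, threshold and exp(-KL) mass assignment are illustrative assumptions.

```python
import numpy as np

def kl_gauss(mu0, var0, mu1, var1):
    """KL divergence D( N(mu0, var0) || N(mu1, var1) )."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

rng = np.random.default_rng(0)
pre, post = (0.0, 1.0), (1.0, 1.0)      # (mean, variance) of the two regimes
x = np.concatenate([rng.normal(0.0, 1.0, 500),
                    rng.normal(1.0, 1.0, 500)])           # change at sample 500

window, threshold, cusum, alarm = 50, 5.0, 0.0, None
for t in range(window, len(x)):
    seg = x[t - window:t]
    mu_hat, var_hat = seg.mean(), seg.var() + 1e-9
    d_pre = kl_gauss(mu_hat, var_hat, *pre)
    d_post = kl_gauss(mu_hat, var_hat, *post)
    m_pre, m_post = np.exp(-d_pre), np.exp(-d_post)       # closeness converted to mass
    p_post = m_post / (m_pre + m_post)                    # pignistic-like probability
    cusum = max(0.0, cusum + np.log(p_post / (1.0 - p_post)))
    if cusum > threshold:
        alarm = t
        break

print("change declared at sample:", alarm)
```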

Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data

Procedia PDF Downloads 320
463 Sustainable Manufacturing of Concentrated Latex and Ribbed Smoked Sheets in Sri Lanka

Authors: Pasan Dunuwila, V. H. L. Rodrigo, Naohiro Goto

Abstract:

Sri Lanka is one of the largest natural rubber (NR) producers in the world, and the NR industry is a major foreign exchange earner. Among the locally manufactured NR products, concentrated latex (CL) and ribbed smoked sheets (RSS) hold a significant position. Furthermore, these products form the foundation for many products utilized by people all over the world (e.g. gloves, condoms, tires, etc.). Processing of CL and RSS consumes a significant amount of material, energy, and workforce. Against this background, both manufacturing lines have been immensely challenged by waste, low productivity, lack of cost efficiency, rising costs of production, and many environmental issues. To face these challenges, the adoption of sustainable manufacturing measures that use less energy, water and materials, and that produce less waste, is imperative. However, these sectors lack comprehensive studies that shed light on such measures and thoroughly discuss their improvement potential from both environmental and economic points of view. Therefore, based on a study of three CL and three RSS mills in Sri Lanka, this study deploys sustainable manufacturing techniques and tools to uncover the underlying potential to improve performance in the CL and RSS processing sectors. The study comprises three steps: 1) quantification of average material waste, economic losses, and greenhouse gas (GHG) emissions via material flow analysis (MFA), material flow cost accounting (MFCA), and life cycle assessment (LCA) in each manufacturing process; 2) identification of improvement options with the help of Pareto and what-if analyses, field interviews, and the existing literature; and 3) validation of the identified improvement options via re-execution of MFA, MFCA, and LCA. With the help of this methodology, the economic and environmental hotspots, and the degrees of improvement in both systems, could be identified. The results highlighted that each process could be improved to produce less waste and to incur lower monetary losses, manufacturing costs, and GHG emissions. In conclusion, the study's methodology and findings are believed to be beneficial for assuring sustainable growth, not only in the Sri Lankan NR processing sector but also in NR or other industries rooted in other developing countries.

Keywords: concentrated latex, natural rubber, ribbed smoked sheets, Sri Lanka

Procedia PDF Downloads 252
462 Aggregation of Electric Vehicles for Emergency Frequency Regulation of Two-Area Interconnected Grid

Authors: S. Agheb, G. Ledwich, G. Walker, Z. Tong

Abstract:

Frequency control has become more of a concern for the reliable operation of interconnected power systems due to the integration of low-inertia renewable energy sources into the grid and their volatility. Also, in the case of a sudden fault, the system has less time to recover before widespread blackouts occur. Electric vehicles (EVs) have the potential to cooperate in Emergency Frequency Regulation (EFR) through nonlinear control of the power system in the case of large disturbances. There is not enough time to communicate with each individual EV in emergency cases, and thus an aggregate model is necessary for a quick response to prevent excessive frequency deviation and the occurrence of a blackout. In this work, an aggregate of EVs is modelled as a large virtual battery in each area, considering various aspects of uncertainty, such as the number of connected EVs and their initial State of Charge (SOC), as stochastic variables. A control law was proposed and applied to the aggregate model using a Lyapunov energy function to maximize the rate of reduction of the total kinetic energy in a two-area network after the occurrence of a fault. The control methods are primarily based on the charging/discharging control of available EVs as shunt capacity in the distribution system. Three different cases were studied, considering the locational aspect of the model, with the virtual EV either in the center of the two areas or in the corners. The simulation results showed that EVs could help the generator lose its excess kinetic energy in a short time after a contingency. Earlier estimation of the possible contributions of EVs can help the supervisory control level transmit a prompt control signal to subsystems such as the aggregator agents and the grid. The percentage contribution of EVs to EFR will therefore be characterized in future work as the goal of this study.
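
A toy sketch of the aggregation idea (not the Lyapunov controller of the abstract): connected EVs are lumped into one virtual battery whose power and energy limits depend on the stochastic number of plugged-in vehicles and their initial SOC, with a simple proportional response to frequency deviation used purely for illustration; all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ev = int(rng.integers(800, 1200))        # connected EVs (stochastic)
soc = rng.uniform(0.3, 0.9, n_ev)          # initial state of charge of each EV
p_max_ev = 7.0e3                           # per-EV charger limit [W] (assumed)
e_cap_ev = 40.0e3 * 3600.0                 # per-EV capacity [J] (assumed 40 kWh)

p_max_agg = n_ev * p_max_ev                            # aggregate power limit [W]
e_available = float(np.sum((soc - 0.2) * e_cap_ev))    # energy above a 20% SOC floor [J]

def efr_power(delta_f, full_response_dev=0.5):
    """Discharge command [W], rising linearly to the aggregate limit at a
    frequency deviation of full_response_dev Hz (illustrative rule only)."""
    return p_max_agg * min(1.0, abs(delta_f) / full_response_dev)

print(f"{n_ev} EVs, {p_max_agg / 1e6:.1f} MW headroom, {e_available / 3.6e9:.1f} MWh available")
print(f"command at -0.2 Hz: {efr_power(-0.2) / 1e6:.1f} MW")
```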

Keywords: emergency frequency regulation, electric vehicle, EV, aggregation, Lyapunov energy function

Procedia PDF Downloads 91
461 Review of Downscaling Methods in Climate Change and Their Role in Hydrological Studies

Authors: Nishi Bhuvandas, P. V. Timbadiya, P. L. Patel, P. D. Porey

Abstract:

Recent perceived climate variability raises concerns about unprecedented hydrological phenomena and extremes. The distribution and circulation of the Earth's waters are becoming increasingly difficult to determine because of additional uncertainty related to anthropogenic emissions. According to the sixth Intergovernmental Panel on Climate Change (IPCC) Technical Paper, on climate change and water, changes in the large-scale hydrological cycle have been related to the increase in observed temperature over several decades. Although much previous research on the effects of climate change on hydrology provides a general picture of possible global hydrological change, new tools and frameworks for modelling hydrological series with nonstationary characteristics at finer scales are required for assessing climate change impacts. Among the downscaling techniques, dynamic downscaling is usually based on the use of Regional Climate Models (RCMs), which generate finer-resolution output based on atmospheric physics over a region using General Circulation Model (GCM) fields as boundary conditions. However, RCMs are not expected to capture the observed spatial precipitation extremes at a fine cell scale or at a basin scale. Statistical downscaling derives a statistical or empirical relationship between the variables simulated by the GCMs, called predictors, and station-scale hydrologic variables, called predictands. The main focus of the paper is on the need for using statistical downscaling techniques for the projection of local hydrometeorological variables under climate change scenarios. The projections can then serve as input to various hydrologic models to obtain streamflow, evapotranspiration, soil moisture and other hydrological variables of interest.
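
A minimal sketch of the regression-based statistical downscaling idea (illustrative only; the predictor names and data are synthetic, not output of any real GCM):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n_days = 365
# Hypothetical standardized large-scale predictors (e.g. mean sea-level pressure,
# specific humidity, 500 hPa geopotential height) for the station's grid box.
X = rng.normal(size=(n_days, 3))
# Synthetic station-scale predictand loosely tied to the predictors plus noise.
y = 2.0 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0.0, 0.5, n_days)

transfer = LinearRegression().fit(X, y)          # calibrate on the "historical" period
future_predictors = rng.normal(size=(30, 3))     # stand-in for GCM scenario output
y_future = transfer.predict(future_predictors)   # downscaled local projections

print("calibration R^2:", round(transfer.score(X, y), 3))
print("first downscaled values:", np.round(y_future[:3], 2))
```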

Keywords: climate change, downscaling, GCM, RCM

Procedia PDF Downloads 394
460 Determination of Pesticides Residues in Tissue of Two Freshwater Fish Species by Modified QuEChERS Method

Authors: Iwona Cieślik, Władysław Migdał, Kinga Topolska, Ewa Cieślik

Abstract:

The consumption of fish is recommended as a means of preventing serious diseases, especially cardiovascular problems. Fish is known to be a valuable source of protein (rich in essential amino acids), unsaturated fatty acids, fat-soluble vitamins, and macro- and microelements. However, it can also contain several contaminants (e.g. pesticides, heavy metals) that may pose considerable risks for humans. Among these, pesticides are of special concern. Their widespread use has resulted in the contamination of environmental compartments, including water. The occurrence of pesticides in the environment is a serious problem due to their potential toxicity; therefore, systematic monitoring is needed. The aim of the study was to determine organochlorine and organophosphate pesticide residues in the fish muscle tissue of the pike (Esox lucius, L.) and the rainbow trout (Oncorhynchus mykiss, Walbaum) by a modified QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) method, using Gas Chromatography Quadrupole Mass Spectrometry (GC/Q-MS) working in selected-ion monitoring (SIM) mode. The analysis of α-HCH, β-HCH, lindane, diazinon, disulfoton, δ-HCH, methyl parathion, heptachlor, malathion, aldrin, parathion, heptachlor epoxide, γ-chlordane, endosulfan, α-chlordane, o,p'-DDE, dieldrin, endrin, 4,4'-DDD, ethion, endrin aldehyde, endosulfan sulfate, 4,4'-DDT, and methoxychlor was performed on samples collected in the Carp Valley (Malopolska region, Poland). The age of the pike (n=6) was 3 years and its weight was 2-3 kg, while the age of the rainbow trout (n=6) was 0.5 year and its weight was 0.5-1.0 kg. Detectable pesticide residues (HCH isomers, endosulfan isomers, DDT and its metabolites, as well as methoxychlor) were present in the fish samples. However, all of these compounds were below the limit of quantification (LOQ), and the other examined pesticide residues were below the limit of detection (LOD). Therefore, the levels of contamination were, in all cases, below the default Maximum Residue Levels (MRLs) established by Regulation (EC) No 396/2005 of the European Parliament and of the Council. Monitoring of pesticide residue content in fish is required to minimize potential adverse effects on the environment and human exposure to these contaminants.

Keywords: contaminants, fish, pesticides residues, QuEChERS method

Procedia PDF Downloads 205
459 Privatization and Ensuring Accountability in the Provision of Essential Services: A Case of Water in South Africa

Authors: Odufu Ifakachukwu Clifford

Abstract:

Developing country governments are struggling to meet the basic needs and demands of citizens, especially the rural poor. With tightly constrained budgets, these governments have followed the lead of developed countries that have sought to restructure public service delivery through privatization, contracting out, public-private partnerships, and similar reforms. Such reforms in service delivery are generally welcomed where it is believed that private sector partners are better equipped than governments to provide certain services. With respect to basic and essential services, however, a higher degree of uncertainty and apprehension exists as the focus shifts from simply minimizing the costs of delivering services to broadening access to all citizens. The constitution stipulates that everyone has the right of access to sufficient food and water. Affordable and/or subsidized water, then, is not a privilege but a basic right of all citizens. Citizens elect political representatives to serve in office, with their sole mandate being to provide for the needs of the citizenry. As governments pass on some responsibility for service delivery to private businesses, they must be able to exercise control in order to account to the people for the work done by private partners. This paper examines the legislative and policy frameworks, as well as the environment within which PPPs take place in South Africa, and the extent to which accountability can be strengthened in this environment. Against this backdrop of PPPs and accountability, the narrower focus of the paper is to assess the extent to which the provision of clean and safe consumable water in South Africa is sustainable, cost-effective, and affordable to all.

Keywords: privatisation, accountability, essential services, government

Procedia PDF Downloads 50
458 Investigating the Feasibility of Promoting Safety in Civil Projects by BIM System Using Fuzzy Logic

Authors: Mohammad Reza Zamanian

Abstract:

The construction industry has always been recognized as one of the most dangerous industries, and its statistics of accidents and injuries show that safety needs more attention and the arrival of up-to-date technologies in this field. Building information modeling (BIM) is a relatively new and applicable technology in Iran, and the necessity of using it is increasingly evident. The main purposes of this research are to evaluate the feasibility of using this technology in the safety sector of construction projects and to evaluate the effectiveness and operationality of its various applications in this sector. These applications were collected and categorized after reviewing past studies, and then a questionnaire based on Delphi method criteria was presented to 30 experts who were thoroughly familiar with modeling software and safety guidelines. After receiving the answers and exporting them to SPSS software, the validity and reliability of the questionnaire were assessed to evaluate the measuring tools. Fuzzy logic is a good way to analyze such data because of its flexibility in dealing with ambiguity and uncertainty, and implementing the Delphi method in a fuzzy environment overcomes the uncertainties in decision making. Therefore, this method was used for data analysis, and the results indicate the usefulness and effectiveness of BIM in projects and the improvement of safety status at different stages of construction. Finally, the applications and the sections discussed were ranked in order of priority for efficiency and effectiveness. Safety planning is considered the most influential of the four BIM safety areas discussed, and planning the installation of protective fences and barriers to prevent falls, together with site layout planning with a safety approach based on a 3D model, are the most important of the 18 applications for improving the safety of construction projects.
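
A small sketch in the spirit of the fuzzy Delphi aggregation implied above (an assumed workflow, not the author's analysis): expert Likert ratings for each BIM application are mapped to triangular fuzzy numbers, aggregated, and defuzzified to a crisp score for ranking.

```python
import numpy as np

# Triangular fuzzy numbers (l, m, u) assumed for a 1-5 Likert scale.
TFN = {1: (0.0, 0.0, 0.25), 2: (0.0, 0.25, 0.5), 3: (0.25, 0.5, 0.75),
       4: (0.5, 0.75, 1.0), 5: (0.75, 1.0, 1.0)}

def fuzzy_score(ratings):
    """Aggregate ratings (min l, mean m, max u) and defuzzify by the centroid."""
    tfns = np.array([TFN[r] for r in ratings])
    l, m, u = tfns[:, 0].min(), tfns[:, 1].mean(), tfns[:, 2].max()
    return (l + m + u) / 3.0

# Hypothetical ratings from five experts for three of the BIM applications.
apps = {
    "safety planning":              [5, 4, 5, 4, 5],
    "fall-protection installation": [4, 4, 5, 3, 4],
    "3D site layout planning":      [4, 3, 4, 4, 3],
}
for name, score in sorted(((n, fuzzy_score(r)) for n, r in apps.items()),
                          key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```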

Keywords: building information modeling, safety of construction projects, Delphi method, fuzzy logic

Procedia PDF Downloads 155
457 Comparing Field Displacement History with Numerical Results to Estimate Geotechnical Parameters: Case Study of Arash-Esfandiar-Niayesh under Passing Tunnel, 2.5 Traffic Lane Tunnel, Tehran, Iran

Authors: A. Golshani, M. Gharizade Varnusefaderani, S. Majidian

Abstract:

Underground structures are among those structures whose design procedures involve considerable uncertainty, owing to the complexity of the surrounding soil conditions; underpassing tunnels are likewise affected. Despite geotechnical site investigations, many uncertainties remain in soil properties due to unknown events. As a result, the settlements obtained from numerical analysis may conflict with the values recorded in the project. This paper reports a case study of a specific underpassing tunnel constructed by the New Austrian Tunnelling Method in Iran. The tunnel has an overburden of about 11.3 m, a height of 12.2 m and a width of 14.4 m, accommodating 2.5 traffic lanes. The numerical model was developed in a 2D finite element program (PLAXIS Version 8). Comparing displacement histories at the ground surface during the entire installation of the initial lining, the estimated surface settlement was about four times the field-recorded value, which indicates that some unknown local events affected that value. Also, the displacement ratios differed considerably between the numerical and field data. Consequently, by running several numerical back-analyses using laboratory and field test data, the geotechnical parameters were revised to match the obtained monitoring data. Finally, it was found that the values of soil parameters are usually conservatively underestimated by up to 40 percent by typical engineering judgment. Additionally, the discrepancy could be attributed to inappropriate constitutive models applied for the specific soil condition.
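
A schematic back-analysis loop in the spirit of the procedure above (the actual study used 2D PLAXIS models; here a Peck-type Gaussian settlement trough is a stand-in forward model and the monitored settlements are placeholders):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical monitored surface settlements at offsets from the tunnel axis.
x_obs = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])   # [m]
s_obs = np.array([2.1, 5.8, 9.0, 5.5, 2.3])       # [mm]

def trough(params, x):
    """Peck-type Gaussian settlement trough used as a stand-in forward model."""
    s_max, i_width = params
    return s_max * np.exp(-x ** 2 / (2.0 * i_width ** 2))

def misfit(params):
    return np.sum((trough(params, x_obs) - s_obs) ** 2)

result = minimize(misfit, x0=[5.0, 8.0], method="Nelder-Mead")
print("back-analysed s_max [mm] and trough width i [m]:", np.round(result.x, 2))
```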

Keywords: NATM, surface displacement history, numerical back-analysis, geotechnical parameters

Procedia PDF Downloads 183
456 Identification and Classification of Fiber-Fortified Semolina by Near-Infrared Spectroscopy (NIR)

Authors: Amanda T. Badaró, Douglas F. Barbin, Sofia T. Garcia, Maria Teresa P. S. Clerici, Amanda R. Ferreira

Abstract:

Food fortification is the intentional addition of a nutrient to a food matrix and has been widely used to overcome the lack of nutrients in the diet or to increase the nutritional value of food. Fortified food must meet the demand of the population, taking into account their habits and the risks that these foods may pose. Wheat and its by-products, such as semolina, have been strongly indicated for use as a food vehicle since they are widely consumed and used in the production of other foods. These products have been strategically used to add nutrients such as fibers. Methods for the analysis and quantification of these kinds of components are destructive and require lengthy sample preparation and analysis. Therefore, the industry has searched for faster and less invasive methods, such as near-infrared spectroscopy (NIR). NIR is a rapid and cost-effective method; however, it is based on indirect measurements and yields a large amount of data. Therefore, NIR spectroscopy requires calibration with mathematical and statistical tools (chemometrics), such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), to extract analytical information from the corresponding spectra. PCA is well suited for NIR, since it can handle many spectra at a time and be used for unsupervised classification. An advantage of PCA, which is also a data reduction technique, is that it reduces the spectral data to a smaller number of latent variables for further interpretation. On the other hand, LDA is a supervised method that searches for the canonical variables (CVs) with the maximum separation among different categories; in LDA, the first CV is the direction of maximum ratio between inter- and intra-class variances. The present work used a portable infrared (NIR) spectrometer for the identification and classification of pure and fiber-fortified semolina samples. The fiber was added to the semolina in two different concentrations, and after spectra acquisition, the data were used for PCA and LDA to identify and discriminate the samples. The results showed that NIR spectroscopy associated with PCA was very effective in identifying pure and fiber-fortified semolina. Additionally, the classification accuracy of the samples using LDA was between 78.3% and 95% for calibration and between 75% and 95% for cross-validation. Thus, after multivariate analyses such as PCA and LDA, it was possible to verify that NIR associated with chemometric methods is able to identify and classify the different samples in a fast and non-destructive way.
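
A compact sketch of the chemometric pipeline described above, with synthetic spectra standing in for the portable-NIR measurements (the class offsets, noise level and component counts are illustrative assumptions):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_wavelengths, n_per_class = 125, 30
classes = {"pure": 0.0, "fiber_low": 0.3, "fiber_high": 0.6}   # assumed spectral offsets

X, y = [], []
for label, shift in classes.items():
    base = np.sin(np.linspace(0, 3 * np.pi, n_wavelengths)) + shift
    X.append(base + rng.normal(0, 0.15, size=(n_per_class, n_wavelengths)))
    y += [label] * n_per_class
X, y = np.vstack(X), np.array(y)

# Unsupervised inspection: variance captured by the first principal components.
pca = PCA(n_components=3).fit(StandardScaler().fit_transform(X))
print("variance explained by 3 PCs:", round(pca.explained_variance_ratio_.sum(), 3))

# Supervised classification: PCA scores fed into LDA, evaluated by cross-validation.
lda = make_pipeline(StandardScaler(), PCA(n_components=10), LinearDiscriminantAnalysis())
print("LDA cross-validated accuracy:", round(cross_val_score(lda, X, y, cv=5).mean(), 3))
```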

Keywords: chemometrics, fiber, linear discriminant analysis, near-infrared spectroscopy, principal component analysis, semolina

Procedia PDF Downloads 199
455 Liver Regeneration of Small in situ Injury

Authors: Ziwei Song, Junjun Fan, Jeremy Teo, Yang Yu, Yukun Ma, Jie Yan, Shupei Mo, Lisa Tucker-Kellogg, Peter So, Hanry Yu

Abstract:

The liver is the center of detoxification and is constantly exposed to toxic metabolites. It is highly regenerative after injury, with the ability to recover even after 70% partial hepatectomy. Most previous studies have used hepatectomy as the injury model for studying liver regeneration. There is limited understanding of small-scale liver injury, which can be caused by either low-dose drug consumption or routine hepatocyte metabolism. Although these small in situ injuries do not cause immediate symptoms, repeated injuries will lead to aberrant wound healing in the liver. Therefore, the cellular dynamics during liver regeneration are critical for our understanding of the liver regeneration mechanism. We aim to study the liver regeneration of small-scale in situ liver injury in transgenic mice with labeled actin (Lifeact-GFP). Previous studies have used tissue sections and liver biopsies, which lack real-time information. In order to trace every individual hepatocyte during the regeneration process, we developed and optimized an intravital imaging system that allows in vivo imaging of mouse liver for 5 consecutive days, allowing real-time cellular tracking and quantification of hepatocytes. We used femtosecond-laser ablation to create a controlled and repeatable liver injury model, which mimics real-life small in situ liver injury. This injury model is the first of its kind for in vivo studies of the liver. We found that small-scale in situ liver injury is repaired by the coordination of hypertrophy and migration of hepatocytes. Hypertrophy is transient and occurs only in the initial phase, while migration is the main driving force that completes the regeneration process. At the cellular level, the Akt/mTOR pathway is activated immediately after injury, which leads to transient hepatocyte hypertrophy. From the mechano-sensing perspective, the actin cable formed at the apical surface of wound-proximal hepatocytes provides mechanical tension for hepatocyte migration. This study provides important information on both the chemical and mechanical signals that promote liver regeneration of small in situ injury. We conclude that hypertrophy and migration play dominant roles at different stages of liver regeneration.

Keywords: hepatocyte, hypertrophy, intravital imaging, liver regeneration, migration

Procedia PDF Downloads 199
454 Time Driven Activity Based Costing Capability to Improve Logistics Performance: Application in Manufacturing Context

Authors: Siham Rahoui, Amr Mahfouz, Amr Arisha

Abstract:

In a highly competitive environment characterised by uncertainty and disruptions, such as the recent COVID-19 outbreak, supply chains (SCs) face the challenge of maintaining their costs at minimum levels while continuing to provide customers with high-quality products and services. More importantly, businesses in such an economic context strive to survive by keeping the cost of the activities they undertake (such as logistics) low and in-house. To do so, managers need to understand the costs associated with different products and services in order to have a clear vision of SC performance, maintain profitability levels, and make strategic decisions. In this context, the SC literature has explored different costing models that seek to determine the costs of undertaking supply chain-related activities. While some cost accounting techniques have been extensively explored in the SC context, more contributions are needed to explore the potential of time-driven activity-based costing (TDABC). More specifically, more applications are needed in the manufacturing context of the SC, where the debate is ongoing. The aim of the study is to assess the capability of the technique to evaluate the operational performance of the logistics function. Through a case study methodology applied to a manufacturing company operating in the automotive industry, TDABC is used to evaluate the efficiency of the current configuration and its logistics processes. The study shows that monitoring process efficiency and cost efficiency leads to strategic decisions that contributed to improving the overall efficiency of the logistics processes.
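
An illustrative time-driven ABC calculation (the figures are placeholders, not the case company's data): the capacity cost rate times the estimated unit times of each logistics activity yields the activity costs and the cost of unused capacity.

```python
# Capacity cost rate = cost of capacity supplied / practical capacity (minutes).
total_resource_cost = 84_000.0        # e.g. monthly cost of the logistics team
practical_capacity_min = 120_000.0    # practical capacity in minutes per month
rate = total_resource_cost / practical_capacity_min

activities = {                        # activity: (unit time [min], monthly volume)
    "receive pallet":  (8.0, 4_000),
    "pick order line": (3.5, 15_000),
    "ship order":      (12.0, 2_500),
}

used_minutes = 0.0
for name, (unit_time, volume) in activities.items():
    minutes = unit_time * volume
    used_minutes += minutes
    print(f"{name}: {minutes * rate:,.0f} cost units ({minutes:,.0f} min)")

print(f"unused capacity cost: {(practical_capacity_min - used_minutes) * rate:,.0f} cost units")
```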

Keywords: efficiency, operational performance, supply chain costing, time driven activity based costing

Procedia PDF Downloads 145
453 The Shannon Entropy and Multifractional Markets

Authors: Massimiliano Frezza, Sergio Bianchi, Augusto Pianese

Abstract:

Introduced by Shannon in 1948 in the field of information theory as the average rate at which information is produced by a stochastic source of data, the concept of entropy has gained much attention as a measure of the uncertainty and unpredictability associated with a dynamical system, eventually depicted by a stochastic process. In particular, the Shannon entropy measures the degree of order/disorder of a given signal and provides useful information about the underlying dynamical process. It has found widespread application in a variety of fields, such as cryptography, statistical physics and finance. In this regard, many contributions have employed different measures of entropy in an attempt to characterize financial time series in terms of market efficiency, market crashes and/or financial crises. The Shannon entropy has also been considered as a measure of the risk of a portfolio or as a tool in asset pricing. This work investigates the theoretical link between the Shannon entropy and the multifractional Brownian motion (mBm), a stochastic process which has recently been the focus of renewed interest in finance as a driving model of stochastic volatility. In particular, after exploring the current state of research in this area and highlighting some of the key results and open questions that remain, we show a well-defined relationship between the Shannon (log)entropy and the memory function H(t) of the mBm. In detail, we allow both the length of the time series and the time scale to change over the analysis to study how the relation modifies itself. On the one hand, applications are developed after generating surrogates of mBm trajectories based on different memory functions; on the other hand, an empirical analysis of several international stock indexes, which confirms the previous results, concludes the work.
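
A minimal sketch of the Shannon entropy estimator referred to above, applied to a histogram-discretised series of log-returns over a rolling window (the synthetic random-walk prices, bin count and window length are illustrative choices, not the data of the study):

```python
import numpy as np

def shannon_entropy(x, bins=30):
    """Shannon entropy (in nats) of a histogram-discretised sample."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
prices = 100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 2000)))   # synthetic price path
log_returns = np.diff(np.log(prices))

window = 250   # rolling window, mimicking the varying series length in the abstract
rolling_entropy = [shannon_entropy(log_returns[t - window:t])
                   for t in range(window, len(log_returns))]
print(f"mean rolling entropy: {np.mean(rolling_entropy):.3f} nats")
```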

Keywords: Shannon entropy, multifractional Brownian motion, Hurst–Holder exponent, stock indexes

Procedia PDF Downloads 98
452 Carbohydrate-Based Recommendations as a Basis for Dietary Guidelines

Authors: A. E. Buyken, D. J. Mela, P. Dussort, I. T. Johnson, I. A. Macdonald, A. Piekarz, J. D. Stowell, F. Brouns

Abstract:

Recently, a number of updated dietary guidelines have been published by various health authorities. The aim of the present work was 1) to review the processes (systematic approach/review, inclusion of public consultation) and methodological approaches used to identify and select the underpinning evidence base for the established recommendations for total carbohydrate (CHO), fiber and sugar consumption, and 2) to examine how differences in the methods and processes applied may have influenced the final recommendations. A search of WHO, US, Canadian, Australian and European sources identified 13 authoritative dietary guidelines with the desired detailed information. Each of these guidelines was evaluated for its scientific basis (types and grading of the evidence) and the processes by which the guidelines were developed. Based on the data retrieved, the following conclusions can be drawn: 1) Generally, a relatively high total CHO and fiber intake and a limited intake of sugars (added or free) is recommended. 2) Even where recommendations are quite similar, the specific justifications for quantitative/qualitative recommendations differ across authorities. 3) Differences appear to be due to inconsistencies in the underlying definitions of CHO exposure and in the concurrent appraisal of CHO-providing foods and nutrients, as well as the choice and number of health outcomes selected for the evidence appraisal. 4) Differences in the selected articles, time frames or data aggregation methods appeared to be of rather minor influence. From this assessment, the main recommendations are for: 1) more explicit quantitative justifications for numerical guidelines and communication of uncertainty; and 2) greater international harmonization, particularly with regard to underlying definitions of exposures and the range of relevant nutrition-related outcomes.

Keywords: carbohydrates, dietary fibres, dietary guidelines, recommendations, sugars

Procedia PDF Downloads 247
451 The Spread of Drugs in Higher Education

Authors: Wantana Amatariyakul, Chumnong Amatariyakul

Abstract:

The research aims to examine the spread of drugs in higher education, especially amphetamine, which is rapidly increasing in Thai society, and its causes and effects, including from a sociological perspective, in order to explain, prevent, control, and solve the problems. The students who participated in this research were regular students of Rajamangala University of Technology Isan, Khon Kaen Campus. The data were collected using questionnaires, group discussions, and in-depth interviews. The quantitative data were analyzed using frequency, percentage, mean and standard deviation, and the qualitative data were analyzed using content analysis. The results of the examination of students' level of knowledge and understanding of drug abuse showed that the majority of the sample group had attained knowledge of drug abuse. Despite their uncertainty, the majority of respondents presumed that amphetamine, marijuana and kratom (Mitragyna speciosa Korth.) would most likely be abused. The reasons for first drug use were wanting to try, persuasion by friends, and the desire to relax or solve problems in life, respectively. The adverse effects on drug addicts are that their health deteriorates or worsens, they lose money, and they face worse mental states. The reasons that respondents tried to avoid using drugs or refused drugs offered by friends were: not wanting to disappoint or upset their family members, fear of rejection by family members, fear of being arrested by the police, and fear of losing their educational opportunities and ruining their future, respectively. Students therefore defended themselves against drug addiction by refusing to try any drugs. In addition, knowledge about the dangers and harm of drugs persuaded them to stay away from drugs.

Keywords: drugs, higher education, drug addiction, spread of drugs

Procedia PDF Downloads 307
450 Toxicity of PPCPs on Adapted Sludge Community

Authors: G. Amariei, K. Boltes, R. Rosal, P. Leton

Abstract:

Wastewater treatment plants (WWTPs) are supposed to hold an important place in the reduction of emerging contaminants, but they provide an environment that has potential for the development and/or spread of adaptation, as bacteria are continuously mixed with contaminants at sub-inhibitory concentrations. Reviewing the literature, there are few data available regarding the use of adapted bacteria forming an activated sludge community for toxicity assessment, and only individual validations have been performed. Therefore, the aim of this work was to study the toxicity of triclosan (TCS) and ibuprofen (IBU), individually and in binary combination, on adapted activated sludge (AS). For this purpose, a battery of biomarkers was assessed, involving oxidative stress and cytotoxicity responses: glutathione-S-transferase (GST), catalase (CAT) and viable cells with FDA. In addition, we compared the toxic effects on adapted bacteria with those on unadapted bacteria from previous research. The adapted AS came from three continuous-flow AS laboratory systems; two systems received IBU and TCS individually, while the other received the binary combination, for 14 days. After adaptation, each bacterial culture condition was exposed to IBU, TCS and the combination for 12 h. The concentrations of IBU and TCS ranged over 0.5-4 mg/L and 0.012-0.1 mg/L, respectively. Batch toxicity experiments were performed using an Oxygraph system (Hansatech) to determine the activity of the CAT enzyme based on quantification of the oxygen production rate. A fluorimetric technique was applied as well, using a Fluoroskan Ascent FL (Thermo), to determine the activity of the GST enzyme, using monochlorobimane-GSH as substrate, and to estimate the viable cells of the sludge by fluorescence staining with fluorescein diacetate (FDA). For the IBU-adapted sludge, CAT activity increased at low concentrations of IBU, TCS and the mixture. However, with increasing concentration the behavior differed: while IBU tended to stabilize CAT activity, TCS and the mixture decreased it. GST activity was significantly increased by TCS and the mixture; for IBU, no variation was observed. For the TCS-adapted sludge, no significant variation in CAT activity was observed, and GST activity was significantly decreased for all contaminants. For the mixture-adapted sludge, the behavior of CAT activity was similar to that of the IBU-adapted sludge; GST activity decreased at all concentrations of IBU, while the presence of TCS and the mixture, respectively, increased GST activity. These findings were consistent with the cell viability evaluation, which clearly showed a variation in sludge viability. Our results suggest that, compared with unadapted bacteria, adapted bacterial conditions play a relevant role in the toxicity behaviour towards activated sludge communities.

Keywords: adapted sludge community, mixture, PPCPs, toxicity

Procedia PDF Downloads 390
449 Qualitative Case Studies in Reading Specialist Education

Authors: Carol Leroy

Abstract:

This presentation focuses on the analysis of qualitative case studies in the graduate education of reading specialists. The presentation describes the development and application of an integrated conceptual framework for reading specialist education, drawing on Robert Stake's work on case study research, Kenneth Zeichner's work on professional learning, and various tools for reading assessment (e.g. the Qualitative Reading Inventory). Social constructivist theory is used to provide intersecting links between the various influences on the processes used to assess and teach reading within the case study framework. Illustrative examples are described to show the application of the framework in reading specialist education in a teaching clinic at a large urban university. Central to the education of reading specialists in this teaching clinic is the collection, analysis and interpretation of data for the design and implementation of reading and writing programs for struggling readers and writers. The case study process involves the integrated interpretation of data, which is central to qualitative case study inquiry. An emerging theme in this approach to graduate education is the ambiguity and uncertainty that governs work with the adults and children who attend the clinic for assistance. Tensions and contradictions are explored insofar as they reveal overlapping but intersecting frameworks for case study analysis in the area of literacy education. An additional theme is the interplay of multiple layers of data, with a resulting depth that goes beyond the practical needs of the client toward the deeper pedagogical growth of the reading specialist. The presentation makes a case for the value of qualitative case studies in reading specialist education. Further, the use of social constructivism as a unifying paradigm provides robustness to the conceptual framework as a tool for understanding the pedagogy involved.

Keywords: assessment, case study, professional education, reading

Procedia PDF Downloads 443
448 Digital Platforms: Creating Value through Network Effects under Pandemic Conditions

Authors: S. Łęgowik-Świącik

Abstract:

This article contributes to research into the determinants of value creation via digital platforms under variable operating conditions. The dynamics of the market environment caused by the COVID-19 pandemic have made enterprises built on digital platforms financially successful. While many classic companies are struggling with the uncertainty of conducting business and with difficulties in the process of value creation, digital platforms create value by modifying the existing business model to meet the changing needs of customers. Therefore, the objective of this publication is to understand and explain the relationship between value creation and the conversion of the business model built on digital platforms under pandemic conditions. The considerations relating to the conceptual framework and the research objective led to the hypothesis that the processes of value creation are evolving, and that measuring these processes allows the value created to be protected and enables its growth in changing circumstances. Critical literature analysis and case study methods were applied to accomplish the objective and verify the hypothesis. The empirical research was carried out on data from enterprises listed on the Nasdaq Stock Exchange: Amazon, Alibaba, and Facebook, covering the years 2018-2021. The surveyed enterprises were chosen through targeted selection. The problem discussed is important and current, since the lack of in-depth theoretical research has resulted in few attempts to identify the determinants of value creation via digital platforms. These arguments led to a theoretical analysis and empirical research intended to fill the perceived gap by deepening the understanding of the process of value creation through network effects via digital platforms under pandemic conditions.

Keywords: business model, digital platforms, enterprise management, pandemic conditions, value creation process

Procedia PDF Downloads 117
447 Risk Issues for Controlling Floods through Unsafe, Dual Purpose, Gated Dams

Authors: Gregory Michael McMahon

Abstract:

Risk management aimed at minimizing the damage from dam operations has met with opposition from organisations, authorities, and their practitioners. The cause appears to be a misunderstanding of risk management arising from exchanges that mix deterministic thinking with risk-centric thinking and that do not separate uncertainty from reliability, or accuracy from probability. This paper sets out the misunderstandings that arose from dam operations at Wivenhoe in 2011, comparing outcomes based on the methodology and its rules with outcomes produced by misapplying those rules. The paper addresses the performance of one risk-centric Flood Manual for Wivenhoe Dam in achieving a risk management outcome. A mixture of engineering, administrative, and legal factors appears to have combined to reduce the outcomes from the risk approach; these are described. The findings are that a risk-centric Manual may need to assist administrations in conducting scenario training regimes, in responding to healthy audit reporting, and in developing decision-support systems. The principal assistance needed from the Manual, however, is to help engineering and the law reach a good understanding of how risks are managed; it should not be assumed that risk management is understood. The wider findings are that the critical profession for decision-making downstream of the meteorologist is not dam engineering, hydrology, or hydraulics; it is risk management. Risk management will provide the minimum flood damage outcome where actual rainfalls match or exceed forecast rainfalls, and will therefore provide the best approach over the likely history of flooding in the life of a dam, while provisions made for worst cases may be state of the art in risk management. The principal conclusion is the need for training both in risk management as a discipline and in the application of risk management rules to particular dam operational scenarios.

Keywords: risk management, flood control, dam operations, deterministic thinking

Procedia PDF Downloads 71
446 Effect of Islamic Finance on Jobs Generation in Punjab, Pakistan

Authors: B. Ashraf, A. M. Malik

Abstract:

The study was conducted at the Department of Economics and Agriculture Economics, Pir Mahar Ali Shah ARID Agriculture University, Punjab, Pakistan, during 2013-16 to discover the effect of Islamic finance/banking on employment in Punjab, Pakistan. The Islamic banking system is a sub-component of the conventional banking system in various countries of the world; in Pakistan, however, it has been established as a separate Islamic banking system. Islamic banking operates under the doctrine of Shariah. It is claimed that this form of banking is free of interest (Riba) and addresses the philosophy and basic values of Islam in finance, reducing uncertainty, risk and other speculative activities. Two Islamic banks, Meezan Bank Limited (Pakistan) and Al-Baraka Bank Limited (Pakistan), were randomly selected from north Punjab (Bahawalnagar), central Punjab (Lahore) and west Punjab (Gujrat), Pakistan, for the conduct of the research. A total of 206 samples were collected from the defined areas and banks through a questionnaire. The data were analyzed using the Statistical Package for Social Sciences (SPSS) version 21.0. Multiple linear regression was applied to test the hypothesis. The results revealed that asset formation had a significant positive impact, whereas technology, length of business (experience) and business size had a significant negative impact on employment generation in Islamic finance/banking in Punjab, Pakistan. It is concluded that employment opportunities may be created in the country by extending finance to businesses/firms to start new businesses and by increasing public awareness through intensive publicity by the Islamic banks. Islamic financial institutions may also be encouraged by the Government, as they enhance employment in the country.
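For readers unfamiliar with the regression step, the hedged sketch below reproduces the shape of such an analysis in Python rather than SPSS: employment generation regressed on four predictors. The data are simulated and the variable names are illustrative assumptions, not the study's actual coding.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data standing in for the 206 questionnaire responses;
# column names are illustrative, not the study's actual variables.
rng = np.random.default_rng(0)
n = 206
df = pd.DataFrame({
    "assets_formation": rng.normal(10, 2, n),
    "technology": rng.normal(5, 1, n),
    "experience_years": rng.normal(8, 3, n),
    "business_size": rng.normal(12, 4, n),
})
df["employment"] = (0.6 * df["assets_formation"] - 0.3 * df["technology"]
                    - 0.2 * df["experience_years"] - 0.1 * df["business_size"]
                    + rng.normal(0, 1, n))

# Ordinary least squares: employment regressed on the four predictors.
X = sm.add_constant(df[["assets_formation", "technology",
                        "experience_years", "business_size"]])
model = sm.OLS(df["employment"], X).fit()
print(model.summary())  # coefficient signs and p-values play the role of the SPSS output
```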

Keywords: assets formation, borrowers, employment generation, Islamic banks, Islamic finance

Procedia PDF Downloads 308
445 Food Foam Characterization: Rheology, Texture and Microstructure Studies

Authors: Rutuja Upadhyay, Anurag Mehra

Abstract:

Solid food foams/cellular foods are colloidal systems which impart structure, texture and mouthfeel to many food products such as bread, cakes, ice-cream, meringues, etc. Their heterogeneous morphology makes the quantification of structure/mechanical relationships complex. The porous structure of solid food foams is highly influenced by the processing conditions, the ingredient composition, and their interactions. Sensory perception of food foams depends on bubble size, shape, orientation, quantity and distribution, which together determine the texture of foamed foods. The state and structure of the solid matrix control the deformation behaviour of the food, such as elasticity/plasticity or fracture, which in turn affects the force-deformation curves. The obvious step in obtaining the relationship between the mechanical properties and the porous structure is to quantify them simultaneously. Here, we study food foams such as bread dough, baked bread and steamed rice cakes to determine the link between ingredients and their effect on the rheology, microstructure, bubble size and texture of the final product. Dynamic rheometry (SAOS), confocal laser scanning microscopy, flatbed scanning, image analysis and texture profile analysis (TPA) have been used to characterize the foods studied. A common observation across these systems was that when the mean bubble diameter is smaller, the product becomes harder, as evidenced by increased storage and loss moduli (G′, G″), whereas when the mean bubble diameter is larger, the product is softer, with lower moduli values (G′, G″). The bubble size distribution also affects the texture of foods. Bread doughs with hydrocolloids (xanthan gum, alginate) were found to give a more uniform bubble size distribution. Bread baking experiments were done to study the rheological changes and mechanisms involved in the structural transition of dough to crumb. Steamed rice cakes with xanthan gum (XG) added at 0.1% concentration showed lower hardness, with a narrower pore size distribution and a larger mean pore diameter. Thus, control of bubble size could be an important parameter defining final food texture.
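As a minimal sketch of the image-analysis step that yields a bubble (pore) size distribution from a flatbed scan, the snippet below thresholds a greyscale crumb image and measures equivalent pore diameters with scikit-image. The thresholding choice, size filter and file name are assumptions, not the authors' exact pipeline.

```python
import numpy as np
from skimage import io, filters, measure

def bubble_size_stats(image_path, min_area_px=20):
    """Estimate pore (bubble) equivalent diameters from a greyscale crumb scan."""
    img = io.imread(image_path, as_gray=True)
    # Pores are assumed darker than the solid matrix; Otsu picks the threshold.
    pores = img < filters.threshold_otsu(img)
    labels = measure.label(pores)
    diameters = [r.equivalent_diameter for r in measure.regionprops(labels)
                 if r.area >= min_area_px]
    return np.mean(diameters), np.std(diameters)

# Usage (hypothetical file): mean and spread of pore diameters in pixels;
# a calibration factor (mm per pixel) would convert them to physical units.
# mean_d, std_d = bubble_size_stats("crumb_scan.png")
```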

Keywords: food foams, rheology, microstructure, texture

Procedia PDF Downloads 320