Steps to reproduce:
```
firedrake-install
. firedrake/bin/activate
git clone git@github.com:thetisproject/thetis
pip install -e thetis
pip install colorama
pip install scipy
cd thetis/test/swe2d
firedrake-clean
py.test test_steady_state_channel_mms.py -x -v -s
```

```
============================= test session starts ==============================
platform darwin -- Python 2.7.11, pytest-2.9.1, py-1.4.31, pluggy-0.3.1 -- /Users/lmitche1/Documents/work/src/fd-install/firedrake/bin/python2.7
cachedir: ../../.cache
benchmark: 3.0.0 (defaults: timer=time.time disable_gc=False min_rounds=5 min_time=5.00us max_time=1.00s calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /Users/lmitche1/Documents/work/src/fd-install/thetis, inifile:
plugins: benchmark-3.0.0, xdist-1.14
collecting 0 items
--------------------------------------------------------------------------
Petsc Development GIT revision: v3.4.2-12173-g511e85b  GIT Date: 2016-04-07 00:39:56 -0500
The PETSc Team petsc-maint@mcs.anl.gov http://www.mcs.anl.gov/petsc/
See docs/changes/index.html for recent updates.
See docs/faq.html for problems.
See docs/manualpages/index.html for help.
Libraries linked from /Users/lmitche1/Documents/work/src/fd-install/firedrake/lib/python2.7/site-packages/petsc/lib
--------------------------------------------------------------------------
collected 2 items

test_steady_state_channel_mms.py::test_steady_state_channel_mms[mimetic] COFFEE finished in 0.00221896 seconds (flops: 0 -> 0)
COFFEE finished in 0.00326109 seconds (flops: 0 -> 0)
compute_form_data finished in 0.024508 seconds.
compile_integral finished in 0.0944471 seconds.
TSFC finished in 0.119004 seconds.
COFFEE finished in 0.0517299 seconds (flops: 4340 -> 1316)
COFFEE finished in 0.00372005 seconds (flops: 0 -> 0)
compute_form_data finished in 0.0209169 seconds.
compile_integral finished in 0.1691 seconds.
TSFC finished in 0.190075 seconds.
COFFEE finished in 0.0528538 seconds (flops: 1757 -> 749)
COFFEE finished in 0.00235796 seconds (flops: 0 -> 0)
COFFEE finished in 0.00304103 seconds (flops: 0 -> 0)
compute_form_data finished in 0.0181992 seconds.
compile_integral finished in 0.0292919 seconds.
TSFC finished in 0.047559 seconds.
COFFEE finished in 0.00383091 seconds (flops: 27 -> 27)
compute_form_data finished in 0.0216379 seconds.
compile_integral finished in 0.0289121 seconds.
TSFC finished in 0.0506079 seconds.
COFFEE finished in 0.0046041 seconds (flops: 27 -> 27)
compute_form_data finished in 0.019644 seconds.
compile_integral finished in 0.028259 seconds.
TSFC finished in 0.047955 seconds.
COFFEE finished in 0.0049181 seconds (flops: 27 -> 27)
compute_form_data finished in 0.022316 seconds.
compile_integral finished in 0.0296991 seconds.
TSFC finished in 0.052078 seconds.
COFFEE finished in 0.00385499 seconds (flops: 27 -> 27)
COFFEE finished in 0.00276399 seconds (flops: 0 -> 0)
COFFEE finished in 0.00211 seconds (flops: 0 -> 0)
dt = 10.0
compute_form_data finished in 0.872862 seconds.
compile_integral finished in 0.323892 seconds.
compile_integral finished in 0.125879 seconds.
compile_integral finished in 0.120333 seconds.
compile_integral finished in 0.112606 seconds.
compile_integral finished in 0.112715 seconds.
compile_integral finished in 0.303551 seconds.
TSFC finished in 1.97202 seconds.
COFFEE finished in 0.312116 seconds (flops: 21956 -> 10336)
COFFEE finished in 0.270186 seconds (flops: 18129 -> 7332)
COFFEE finished in 0.314766 seconds (flops: 18899 -> 12557)
COFFEE finished in 0.291351 seconds (flops: 6137 -> 4354)
COFFEE finished in 0.303944 seconds (flops: 6137 -> 4354)
FAILED
=================================== FAILURES ===================================
____________________ test_steady_state_channel_mms[mimetic] ____________________

options = {'no_exports': True}

    @pytest.mark.parametrize("options", [
        {"no_exports": True},  # default: mimetic
        {"no_exports": True, "mimetic": False, "continuous_pressure": True},
        ], ids=["mimetic", "pndp(n+1)"])
    def test_steady_state_channel_mms(options):
        lx = 5e3
        ly = 1e3
        order = 1
        # minimum resolution
        min_cells = 16
        n = 1  # number of timesteps
        dt = 10.
        g = physical_constants['g_grav'].dat.data[0]
        h0 = 10.  # depth at rest
        area = lx*ly
        k = 4.0*math.pi/lx
        q = h0*1.0  # flux (depth-integrated velocity)
        eta0 = 1.0  # free surface amplitude

        eta_expr = Expression("eta0*cos(k*x[0])", k=k, eta0=eta0)
        depth_expr = "H0+eta0*cos(k*x[0])"
        u_expr = Expression(("Q/({H})".format(H=depth_expr), 0.), k=k, Q=q, eta0=eta0, H0=h0)
        # source_expr = Expression("k*eta0*(pow(Q,2)/pow({H},3)-g)*sin(k*x[0])".format(H=depth_expr),
        #                          k=k, g=g, Q=q, eta0=eta0, H0=h0)
        source_expr = Expression(("k*eta0*(pow(Q,2)/pow({H},3)-g)*sin(k*x[0])".format(H=depth_expr), 0),
                                 k=k, g=g, Q=q, eta0=eta0, H0=h0)
        u_bcval = q/(h0+eta0)
        eta_bcval = eta0

        do_exports = not options['no_exports']
        if do_exports:
            diff_pvd = File('diff.pvd')
            udiff_pvd = File('udiff.pvd')
            source_pvd = File('source.pvd')

        eta_errs = []
        u_errs = []
        for i in range(5):
            mesh2d = RectangleMesh(min_cells*2**i, 1, lx, ly)
            # bathymetry
            p1_2d = FunctionSpace(mesh2d, 'CG', 1)
            bathymetry_2d = Function(p1_2d, name="bathymetry")
            bathymetry_2d.assign(h0)

            source_space = VectorFunctionSpace(mesh2d, 'DG', order+1)
            source_func = project(source_expr, source_space)

            # --- create solver ---
            solver_obj = solver2d.FlowSolver2d(mesh2d, bathymetry_2d, order=order)
            solver_obj.options.nonlin = True
            solver_obj.options.t_export = dt
            solver_obj.options.t_end = n*dt
            solver_obj.options.uv_source_2d = source_func
            solver_obj.options.timestepper_type = 'cranknicolson'
            solver_obj.options.timer_labels = []
            solver_obj.options.dt = dt
            solver_obj.options.update(options)

            # boundary conditions
            inflow_tag = 1
            outflow_tag = 2
            inflow_func = Function(p1_2d)
            inflow_func.interpolate(Expression(-u_bcval))
            inflow_bc = {'un': inflow_func}
            outflow_func = Function(p1_2d)
            outflow_func.interpolate(Expression(eta_bcval))
            outflow_bc = {'elev': outflow_func}
            solver_obj.bnd_functions['shallow_water'] = {inflow_tag: inflow_bc,
                                                         outflow_tag: outflow_bc}
            # parameters['quadrature_degree']=5

>           solver_obj.create_equations()

test_steady_state_channel_mms.py:81:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../thetis/solver2d.py:143: in create_equations
    semi_implicit=self.options.use_linearized_semi_implicit_2d)
../../thetis/timeintegrator.py:503: in __init__
    self.update_solver()
../../thetis/timeintegrator.py:510: in update_solver
    options_prefix=self.name)
../../../firedrake/src/firedrake/firedrake/variational_solver.py:125: in __init__
    ctx = solving_utils._SNESContext(problem)
../../../firedrake/src/firedrake/firedrake/solving_utils.py:106: in __init__
    for problem in problems)
../../../firedrake/src/firedrake/firedrake/solving_utils.py:106: in <genexpr>
    for problem in problems)
../../../firedrake/src/firedrake/firedrake/assemble.py:66: in assemble
    inverse=inverse, nest=nest)
<decorator-gen-295>:2: in _assemble
    ???
../../../firedrake/src/firedrake/firedrake/utils.py:62: in wrapper
    return f(*args, **kwargs)
../../../firedrake/src/firedrake/firedrake/assemble.py:100: in _assemble
    inverse=inverse)
../../../firedrake/src/firedrake/firedrake/tsfc_interface.py:243: in compile_form
    number_map).kernels
../../../firedrake/src/PyOP2/pyop2/caching.py:203: in __new__
    obj = make_obj()
../../../firedrake/src/PyOP2/pyop2/caching.py:193: in make_obj
    obj.__init__(*args, **kwargs)
../../../firedrake/src/firedrake/firedrake/tsfc_interface.py:174: in __init__
    kernels.append(KernelInfo(kernel=Kernel(ast, ast.name, opts=opts),
../../../firedrake/src/PyOP2/pyop2/backends.py:118: in __call__
    return t(*args, **kwargs)
../../../firedrake/src/PyOP2/pyop2/caching.py:203: in __new__
    obj = make_obj()
../../../firedrake/src/PyOP2/pyop2/caching.py:193: in make_obj
    obj.__init__(*args, **kwargs)
../../../firedrake/src/PyOP2/pyop2/base.py:3848: in __init__
    self._code = self._ast_to_c(self._ast, self._opts)
../../../firedrake/src/PyOP2/pyop2/host.py:60: in _ast_to_c
    ast_handler.plan_cpu(self._opts)
../../../firedrake/src/COFFEE/coffee/plan.py:184: in plan_cpu
    _generate_cpu_code(self, kernel, **params)
../../../firedrake/src/COFFEE/coffee/plan.py:106: in _generate_cpu_code
    loop_opt.rewrite(rewrite)
../../../firedrake/src/COFFEE/coffee/optimizer.py:116: in rewrite
    ew.SGrewrite()
../../../firedrake/src/COFFEE/coffee/rewriter.py:454: in SGrewrite
    sgraph, mapper = sg_visitor.visit(self.stmt.rvalue)
../../../firedrake/src/COFFEE/coffee/visitor.py:106: in visit
    return meth(o, *args, **kwargs)
../../../firedrake/src/COFFEE/coffee/visitors/utilities.py:455: in visit_Sum
    ret = self.visit(op, ret=ret, syms=loc_syms[i], parent=o)
../../../firedrake/src/COFFEE/coffee/visitor.py:106: in visit
    return meth(o, *args, **kwargs)
../../../firedrake/src/COFFEE/coffee/visitors/utilities.py:439: in visit_Prod
    G.add_edges_from(loc_syms)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <networkx.classes.graph.Graph object at 0x1133d1bd0>
ebunch = [(('t1', ('facet[0]', 'ip', 'j'), ((1, 0), (1, 0), (1, 0))), ('t1', ('facet[1]', 'ip', 'k'), ((1, 0), (1, 0), (1, 0)))...]', 'ip', 'j'), ((1, 0), (1, 0), (1, 0))), ('t37', (), ()), ('t0', ('facet[1]', 'ip', 'k'), ((1, 0), (1, 0), (1, 0))))]
attr_dict = {}, attr = {}
e = (('t1', ('facet[0]', 'ip', 'j'), ((1, 0), (1, 0), (1, 0))), ('t37', (), ()), ('t1', ('facet[1]', 'ip', 'k'), ((1, 0), (1, 0), (1, 0))))
ne = 3, u = ('t1', ('facet[0]', 'ip', 'j'), ((1, 0), (1, 0), (1, 0)))
v = ('t37', (), ())
dd = ('t1', ('facet[1]', 'ip', 'k'), ((1, 0), (1, 0), (1, 0)))
datadict = {'t': '1'}

    def add_edges_from(self, ebunch, attr_dict=None, **attr):
        """Add all the edges in ebunch.

        Parameters
        ----------
        ebunch : container of edges
            Each edge given in the container will be added to the
            graph. The edges must be given as as 2-tuples (u,v) or
            3-tuples (u,v,d) where d is a dictionary containing edge data.
        attr_dict : dictionary, optional (default= no attributes)
            Dictionary of edge attributes.  Key/value pairs will
            update existing data associated with each edge.
        attr : keyword arguments, optional
            Edge data (or labels or objects) can be assigned using
            keyword arguments.

        See Also
        --------
        add_edge : add a single edge
        add_weighted_edges_from : convenient way to add weighted edges

        Notes
        -----
        Adding the same edge twice has no effect but any edge data
        will be updated when each duplicate edge is added.

        Edge attributes specified in edges take precedence
        over attributes specified generally.

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_edges_from([(0,1),(1,2)]) # using a list of edge tuples
        >>> e = zip(range(0,3),range(1,4))
        >>> G.add_edges_from(e) # Add the path graph 0-1-2-3

        Associate data to edges

        >>> G.add_edges_from([(1,2),(2,3)], weight=3)
        >>> G.add_edges_from([(3,4),(1,4)], label='WN2898')
        """
        # set up attribute dict
        if attr_dict is None:
            attr_dict = attr
        else:
            try:
                attr_dict.update(attr)
            except AttributeError:
                raise NetworkXError(
                    "The attr_dict argument must be a dictionary.")
        # process ebunch
        for e in ebunch:
            ne = len(e)
            if ne == 3:
                u, v, dd = e
            elif ne == 2:
                u, v = e
                dd = {}  # doesnt need edge_attr_dict_factory
            else:
                raise NetworkXError(
                    "Edge tuple %s must be a 2-tuple or 3-tuple." % (e,))
            if u not in self.node:
                self.adj[u] = self.adjlist_dict_factory()
                self.node[u] = {}
            if v not in self.node:
                self.adj[v] = self.adjlist_dict_factory()
                self.node[v] = {}
            datadict = self.adj[u].get(v, self.edge_attr_dict_factory())
            datadict.update(attr_dict)
>           datadict.update(dd)
E           ValueError: dictionary update sequence element #1 has length 3; 2 is required

../../../firedrake/lib/python2.7/site-packages/networkx/classes/graph.py:874: ValueError
```
This occurs both on my Mac and on a Linux box.
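For what it's worth, the final `ValueError` can be triggered in isolation. Here is a minimal sketch (my own illustration, not part of the original report), using the `add_edges_from` code path shown in the traceback and the symbol tuples copied from the reported locals: COFFEE's `visit_Prod` hands `add_edges_from` a list containing 3-tuples of symbols, and NetworkX then reads the third symbol as `(u, v, edge-data)`, so `dict.update()` fails on the mixed-length tuples inside it.

```python
import networkx as nx

# Symbol tuples copied from the locals shown in the traceback above.
u = ('t1', ('facet[0]', 'ip', 'j'), ((1, 0), (1, 0), (1, 0)))
v = ('t37', (), ())
w = ('t1', ('facet[1]', 'ip', 'k'), ((1, 0), (1, 0), (1, 0)))

G = nx.Graph()
# NetworkX unpacks the 3-tuple as (node, node, edge-data) and calls
# dict.update() on `w`, which is not a dict, reproducing:
# ValueError: dictionary update sequence element #1 has length 3; 2 is required
G.add_edges_from([(u, v, w)])
```

So it looks as if the edge construction in `visit_Prod` is passing NetworkX groups of more than two symbols, which `add_edges_from` cannot digest.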